
First Serious Australian Autopilot Accident

Point cloud of a bridge – credit Baraja

The first serious Australian autopilot accident involving a self-driving car was recorded in March. A pedestrian was critically injured when struck by a Tesla in “autopilot” mode. US safety regulators are also reviewing the technology after a series of accidents in which autonomous cars drove straight into emergency services vehicles – apparently dazzled by their flashing lights.

From an article in Cosmos by Jamie Seidel.

It’s not uncommon for humans to make similar mistakes – accidents happen – but machines shouldn’t have human failings.

In the autonomous vehicle trade, these accidents are called “edge cases”. It’s where something is unexpected, unusual, or outside artificial intelligence (AI) training parameters. And that’s the challenge facing the autonomous vehicle industry.

Paying attention

Despite all expectations, human drivers cannot yet become passive passengers. Instead, they must have eyes on the road and hands on the wheel, even if the car is driving itself.

But that scenario is subject to human failings: the temptation to respond to an SMS, read the paper, or stare at the surroundings is heightened when the car appears to be doing the work.

When autonomous vehicles crash, it’s often when the driver isn’t paying attention.

“In most cases, autonomous vehicles can easily understand the world around them,” says Cibby Pulikkaseril, founder and chief technology officer of Australian automotive sensor manufacturer Baraja.

“But, when multiple objects are partially obscured, or there are adversarial targets in the environment, the perception of the vehicle may have difficulty understanding the scene.”

The key, he says, is ensuring the AI behind the wheel has persistent, accurate and diverse information.

In human drivers, impaired senses and poor decision-making can be deadly. The same applies to signal noise, pattern recognition and available processing power in machines. But inferring the cause of an AI’s error is harder than identifying human failings. How many cameras are enough? Do you need to add radar? Does LiDAR – similar to radar, but using lasers instead of radio beams – make both redundant?

And does an AI have sufficient input to recognise what’s going on in the real world around it?

For the complete article on the first serious Australian autopilot accident CLICK HERE.

Note – If you liked this post, click here to stay informed of all of the 3D laser scanning, geomatics, UAS, autonomous vehicle, Lidar News and more. If you have an informative 3D video that you would like us to promote, please forward it to editor@lidarnews.com, and if you would like to join the Younger Geospatial Professional movement, click here.
