It may not be too long before autonomous vehicles rule the roads. Despite a few widely reported mishaps, autonomous vehicles (AVs) are considered as safe as, or safer than, human drivers in many respects.
From an article in Tech Xplore by Peter Grad.
They’re equipped with radar-detection and 360-degree cameras that are not impacted by a poor night’s sleep, a cellphone call or texting.
AVs have a built-in database of streets and highways, traffic lights, speed limits and various other pertinent details of the rules and regulations governing practically every foot of American roadways.
Still, AVs have their drawbacks. One of them was the focus of a recent study by researchers at Purdue University and Michigan State University.
AVs “see” the road through sonar, radar and LiDAR technology. LiDAR—Light Detection and Ranging—employs laser beams to determine the distance between two objects.
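As a back-of-the-envelope illustration (not the researchers' code), the time-of-flight principle behind LiDAR ranging can be sketched in a few lines of Python — distance is simply the pulse's round-trip time multiplied by the speed of light, halved:

```python
# Minimal sketch of time-of-flight ranging, the principle behind LiDAR:
# distance = (propagation speed x round-trip time) / 2.
SPEED_OF_LIGHT_M_S = 299_792_458  # speed of light in vacuum, m/s

def lidar_range_m(round_trip_time_s: float) -> float:
    """Distance to a target inferred from a laser pulse's round-trip time."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# A pulse that returns after 1 microsecond implies a target about 150 m away.
print(round(lidar_range_m(1e-6), 1))  # 149.9
```

Radar and sonar work the same way, only with radio waves and sound (and hence a much lower propagation speed for sonar).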
This visual apparatus does a largely admirable job for road navigation, but it has a key limitation.
As Zubin Jacob, a researcher at Purdue University, says, "Objects and their environment constantly emit and scatter thermal radiation, leading to textureless images famously known as the 'ghosting effect.'"
Accurate detection of objects in real time is essential to guarantee that accident-avoidance measures are taken in an instant. Automated maneuvers based on incorrect assessments drawn from blurry, ghosted images can make the difference between life and death.
Alternative means of improving detection have not been successful. High-resolution cameras, for instance, seemed promising but can falter when lighting is insufficient. Other multi-instrument approaches faced problems arising from data transmission interference.
But the Purdue and Michigan State researchers took an innovative approach they termed "heat-assisted detection and ranging." The paper appeared in the journal Nature on July 26.
By applying machine learning algorithms and approaches called TeX decomposition and TeX vision, the researchers were able to overcome barriers posed by darkness, fog and smoke and capture clear images with infrared cameras.
Currently used modalities such as sonar, radar and LiDAR “send out signals and detect the reflection to infer the presence/absence of any object and its distance,” explained Jacob. “This gives extra information of the scene in addition to the camera vision, especially when the ambient illumination is poor.”
His team’s approach—heat-assisted detection and ranging (HADAR)—is “fundamentally different,” he said.