Despite the efforts of leading OEMs, fully autonomous vehicles (“Level 4”) have not yet arrived. While the industry is still working to get Level 4 vehicles on the road, there has been increasing consumer interest in better advanced driver assistance systems (ADAS), even for vehicles with lower levels of autonomy.
From an article in Wards Auto by Jun Pei, Cepton Co-founder.
Lidar, a technology previously associated with fully autonomous vehicles, is taking an increasingly important role in vehicle safety in ADAS. In fact, consumer vehicles can start benefiting from lidar today: its superior perception capabilities have been shown to help significantly reduce both the frequency and severity of accidents.
Current sensor technologies, including cameras and radar, have supported ADAS for several vehicle generations and have proven potent in many scenarios. However, by the very nature of how they work, they have limitations that prevent them from delivering accurate, error-proof perception, thereby leaving a significant risk of accidents.
Let’s start with cameras. One of their biggest limitations is their reliance on good lighting and environmental conditions. Their performance degrades at night, in direct sunlight and in inclement weather. Another limitation is that they provide only 2D imaging, making it difficult to determine the accurate 3D size, location and velocity of objects on the road.
This can be compensated for to a certain degree with computer vision, but current algorithms still have significant limitations.
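The 2D ambiguity described above can be illustrated with a simple pinhole-camera calculation (an illustrative sketch, not from the article; the focal length and object sizes are assumed values): two objects of very different real-world sizes can project to exactly the same image size, so a single camera frame alone cannot recover depth.

```python
# Pinhole camera model: projected size (px) = focal_px * object_size / distance.
# focal_px is an assumed focal length in pixels for illustration.
focal_px = 1000.0

def projected_size_px(object_size_m: float, distance_m: float) -> float:
    """Image-plane size of an object under the pinhole model."""
    return focal_px * object_size_m / distance_m

# A 1.7 m pedestrian at 34 m and a 0.5 m road sign at 10 m
# project to the same 50 px -- indistinguishable from size alone.
print(projected_size_px(1.7, 34.0))  # 50.0
print(projected_size_px(0.5, 10.0))  # 50.0
```

This is why monocular systems must lean on learned priors or motion cues to estimate depth, whereas lidar measures range directly.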
Radar is a 3D sensing technology that performs well under various lighting and environmental conditions, but it lacks the spatial resolution needed for accurate object detection, tracking and classification.
For instance, radar might be able to tell a driver that there is an object a few hundred feet down the road, but it cannot provide data accurate enough to determine what type of object it is, or whether it is in the driver’s lane or, in fact, on the shoulder of the road.
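A back-of-the-envelope calculation makes the lane-assignment problem concrete (an illustrative sketch, not from the article; the ~1° radar and ~0.1° lidar azimuth resolutions and the lane width are typical assumed values): cross-range resolution grows linearly with distance, so at long range a radar's angular cell can span a large fraction of a lane.

```python
import math

def lateral_resolution_m(range_m: float, angular_res_deg: float) -> float:
    """Approximate cross-range cell size: arc length at the given range."""
    return range_m * math.radians(angular_res_deg)

range_m = 100.0          # roughly "a few hundred feet" (~330 ft)
lane_width_m = 3.7       # typical highway lane width (assumed)

radar_cell = lateral_resolution_m(range_m, 1.0)   # ~1.75 m, about half a lane
lidar_cell = lateral_resolution_m(range_m, 0.1)   # ~0.17 m, well within a lane

print(f"radar cross-range cell at {range_m:.0f} m: {radar_cell:.2f} m")
print(f"lidar cross-range cell at {range_m:.0f} m: {lidar_cell:.2f} m")
```

With a cell roughly half a lane wide, a small angular error is enough to shift a detection from the driver's lane to the shoulder, while the finer lidar cell localizes the object within a lane.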
For the complete article CLICK HERE.
Note – If you liked this post, click here to stay informed of all of the 3D laser scanning, geomatics, UAS, autonomous vehicle, Lidar News and more. If you have an informative 3D video that you would like us to promote, please forward it to editor@lidarnews.com. If you would like to join the Younger Geospatial Professional movement, click here.