
Machine Perception and Lidar

Point cloud (image from the original post)

In an article for EE Times, Stefani Munoz explains: “As AI and the physical world intersect and adoption of autonomous technologies, such as machine perception, increases, one might question how machines and their currently brittle models could possibly perceive the world in ways humans do. With the help of sensor technologies such as those implemented in self-driving vehicles, including lidar, radar and cameras, machines are beginning to gather real-time data to inform decision-making and adapt to real-world scenarios.”

She continues:

“Lidar, radar sensors for AVs

More recently, automotive manufacturers have been pushing for fully autonomous driving with the help of sensors. Some companies are specifically manufacturing lidar (light detection and ranging) sensors to assist with object detection.

Hughes Aircraft Co. is widely credited with introducing lidar technology in the early 1960s, primarily designed for satellite tracking with the help of laser-focused imaging that allowed engineers to calculate distances.

Today, many companies are adopting direct time-of-flight lidar sensors, which use lasers to emit pulses of light that bounce off surrounding objects and obstacles. Lidar then measures the time it takes for those pulses to return, thereby determining the distance between sensor and object. Lidar sensors are also capable of creating a “map” of object surfaces as the light waves strike them.

In real-world scenarios, companies use lidar for a variety of applications to enable machines to perceive the world around them, including warehouse management, advanced driver assistance systems, construction projects, pollution modeling and more. Companies such as Mobileye and Daimler are implementing lidar technology in their self-driving prototypes.

For example, Mobileye’s latest EyeQ Ultra SoC uses four classes of proprietary accelerators, known as XNN, PMA, VMP and MPC, which in turn rely on two sensing subsystems: one a camera-only component and the other a combination of radar and lidar. Mobileye claims the EyeQ Ultra SoC will enable Level 4 autonomous driving, defined by the Society of Automotive Engineers as vehicles that can perform all driving functions without manual intervention under specific conditions. If these conditions aren’t met, however, the driver must take control of the vehicle.”
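To make the time-of-flight arithmetic in the quoted passage concrete, here is a minimal Python sketch. It is not taken from any vendor’s API; the names `range_from_tof` and `point_from_return` are illustrative. The one-way range is half the round-trip time multiplied by the speed of light, and combining that range with the beam’s azimuth and elevation angles yields one (x, y, z) point of the “map” the article describes.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_tof(round_trip_seconds: float) -> float:
    """Distance to the target from a direct time-of-flight pulse.

    The pulse travels out to the object and back, so the one-way
    distance is half the round-trip time multiplied by c.
    """
    return C * round_trip_seconds / 2.0

def point_from_return(round_trip_seconds: float,
                      azimuth_deg: float,
                      elevation_deg: float) -> tuple[float, float, float]:
    """Convert one lidar return (time plus beam angles) into an
    (x, y, z) point in the sensor frame. Accumulating these points
    over a full sweep builds the point-cloud "map"."""
    r = range_from_tof(round_trip_seconds)
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)
    y = r * math.cos(el) * math.sin(az)
    z = r * math.sin(el)
    return (x, y, z)

# A pulse that returns after ~667 nanoseconds hit something ~100 m away.
print(range_from_tof(667e-9))                 # ~99.98 m
print(point_from_return(667e-9, 30.0, 5.0))   # one point of the cloud
```

A real scanner repeats this measurement many thousands of times per second across a sweeping beam pattern; the accumulated points form the point cloud pictured above.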
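Mobileye has not published the internal logic of its two sensing subsystems, so the following is purely an illustrative sketch of the general idea of redundant sensing channels; every name and threshold here is hypothetical. Two independent perception paths each report a detection, the planner acts only when both agree, and control is handed back to the driver whenever conditions fall outside the Level 4 design envelope, mirroring the caveat in the quote.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    obstacle_ahead: bool
    confidence: float  # 0.0 to 1.0

def both_subsystems_confirm(camera: Detection,
                            radar_lidar: Detection,
                            threshold: float = 0.9) -> bool:
    """Hypothetical redundancy check: each channel must independently
    confirm the obstacle before the planner acts on it."""
    return (camera.obstacle_ahead and camera.confidence >= threshold and
            radar_lidar.obstacle_ahead and radar_lidar.confidence >= threshold)

def within_design_domain(speed_kph: float, visibility_m: float) -> bool:
    """Stand-in for the 'specific conditions' of SAE Level 4: outside
    this operating envelope the driver must take control."""
    return speed_kph <= 130 and visibility_m >= 50

camera = Detection(obstacle_ahead=True, confidence=0.97)
radar_lidar = Detection(obstacle_ahead=True, confidence=0.93)

if not within_design_domain(speed_kph=110, visibility_m=200):
    print("Hand control back to the driver")
elif both_subsystems_confirm(camera, radar_lidar):
    print("Brake: both sensing channels confirm the obstacle")
```

The appeal of the two-channel split is that a failure mode in one modality (say, cameras blinded by glare) is unlikely to coincide with the same failure in radar and lidar.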

For the complete article on machine perception, CLICK HERE.

