UC Berkeley researchers developed a more compact, higher-resolution light detection and ranging, or LiDAR, system, which is used by self-driving cars and other autonomous machines to detect surrounding objects.
From an article in The Daily Californian by Aileen Wu.
According to Ming Wu, campus electrical engineering and computer science professor and leader of this research, LiDAR is a sensor that maps 3D landscapes by emitting a laser and measuring the time the light takes to return to the device. However, previous LiDAR designs have been expensive and bulky. The new design, featured in Nature on Wednesday, could make LiDAR technology cheaper and smaller.
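The time-of-flight principle Wu describes can be sketched in a few lines: the laser's round trip covers twice the distance to the target, so distance is half the round-trip time multiplied by the speed of light. This is a generic illustration of the principle, not code from the study.

```python
# Time-of-flight ranging: distance = (speed of light * round-trip time) / 2
SPEED_OF_LIGHT = 299_792_458  # meters per second

def tof_distance_m(round_trip_seconds: float) -> float:
    """Convert a measured laser round-trip time to a distance in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A target about 10 m away returns the pulse in roughly 67 nanoseconds:
print(tof_distance_m(66.7e-9))  # ~10 m
```

The factor of two is the key detail: the measured time covers the trip out and back, so forgetting to halve it doubles every reported distance.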
“Normal cameras involve a 2D sensor sensing an image independent of distance and compressed on a plane,” Wu said. “For many applications, from self-driving cars to robotics to drone navigation, it’s important to have that 3D landscape to avoid obstacles and plan their path.”
Xiaosheng Zhang and Kyungmok Kwon, co-first authors of the study, emphasized the need to shrink the size of LiDAR while maintaining its performance.
One of the major challenges with this task, according to Zhang and Kwon, is making an integrated beam scanner, which "steers" the output laser light in different directions to scan across the scene. The LiDAR beam scanner is the spinning object seen on the roofs of self-driving cars.
“We made a LiDAR with an integrated beam scanner on a silicon photonics chip of 10 mm x 11 mm, which we call a focal plane switch array,” Zhang and Kwon said in an email. “Focal plane switch array … can miniaturize and integrate (the) LiDAR system into a single chip.”
Wu added that the single chip makes LiDAR resemble a smartphone camera. However, LiDAR is more complex than a smartphone camera because each pixel must not only receive light but also transmit a laser that "bounces off a target and returns to the LiDAR camera."
The group's design uses tiny microelectromechanical system, or MEMS, switches instead of the more common thermo-optic switches. MEMS switches physically move the waveguide, a structure that guides electromagnetic waves, up and down to route the light to each pixel, according to Zhang and Kwon in the email.
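The switching scheme described above can be pictured with a toy model: a shared laser is routed to exactly one pixel at a time by engaging that pixel's MEMS switch, and stepping through all the switches scans the scene the way a spinning scanner would. This is a hypothetical illustration of the idea, not the authors' implementation.

```python
# Toy model of a focal plane switch array: one MEMS switch per pixel,
# and only the engaged switch couples the shared laser to its pixel.

class FocalPlaneSwitchArray:
    def __init__(self, rows: int, cols: int):
        self.rows, self.cols = rows, cols
        self.active = None  # (row, col) of the currently engaged switch

    def select(self, row: int, col: int) -> None:
        """Engage one switch (disengaging the previous one) so the
        laser is routed to exactly one emitting pixel."""
        if not (0 <= row < self.rows and 0 <= col < self.cols):
            raise ValueError("pixel outside the array")
        self.active = (row, col)

    def scan(self):
        """Step through every pixel in turn -- the chip-scale analogue
        of the spinning roof-top beam scanner."""
        for r in range(self.rows):
            for c in range(self.cols):
                self.select(r, c)
                yield self.active

fpsa = FocalPlaneSwitchArray(4, 4)
print(len(list(fpsa.scan())))  # 16 pixels visited, one at a time
```

Because only one switch is engaged at a time, the whole array can share a single laser and detector, which is what lets the design shrink onto one chip.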
For the complete article on the UC Berkeley LiDAR sensor click HERE.
Note – If you liked this post click here to stay informed of all of the 3D laser scanning, geomatics, UAS, autonomous vehicle, Lidar News and more. If you have an informative 3D video that you would like us to promote, please forward to firstname.lastname@example.org and if you would like to join the Younger Geospatial Professional movement click here.