
Lidar on the iPhone Explained


Tech giant Apple has included LiDAR sensors in its mobile devices since the 2020 iPad Pro, and today all of its top-end smartphones carry the feature. In fact, you could argue that the majority of Apple’s iPhone and iPad products have LiDAR-like capabilities if you count Face ID, which also performs 3D imaging. But there are some key differences between the rear-mounted LiDAR on the iPhone and the front-facing Face ID assembly.

From an article in Tech HQ

The Face ID module projects a patterned point cloud of more than 30,000 infrared dots onto surfaces roughly an arm’s length (25 – 50 cm) from the device. If the target surface were perfectly flat, the pattern received by the infrared camera element within the Face ID hardware assembly would match the optically generated pattern emitted from the iPhone or iPad.

Any variations in the surface reflecting those infrared dots back to the camera will shift the points either closer together or further apart. And the deviations can be used to infer the shape and physical characteristics of the object – in this case, the user’s face – being imaged.
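To make the geometry concrete, here is a minimal sketch of how depth can be recovered from dot displacement by triangulation. It is illustrative only: the focal length, baseline, and the simple disparity-to-depth formula are assumptions for the sake of the example, not Apple’s actual Face ID pipeline or calibration values.

```swift
import Foundation

// Hypothetical structured-light sketch, not Apple's actual Face ID
// algorithm. Assumes a calibrated projector/camera pair with a known
// baseline, and that each observed infrared dot has already been matched
// to its position in the emitted reference pattern.

let focalLengthPx = 580.0   // IR camera focal length in pixels (assumed value)
let baselineMm    = 12.0    // projector-to-camera separation in mm (assumed value)

/// Depth by triangulation from the horizontal shift (disparity, in pixels)
/// of one dot between the reference pattern and the captured image.
func depthMm(forDisparityPx disparity: Double) -> Double? {
    guard disparity > 0 else { return nil }   // unmatched or undisplaced dot
    return focalLengthPx * baselineMm / disparity
}

// A dot shifted by 20 px maps to ~348 mm, i.e. about the arm's-length
// working distance quoted above.
if let d = depthMm(forDisparityPx: 20.0) {
    print(String(format: "Estimated depth: %.0f mm", d))
}
```

The key design point is that a bigger dot shift means a closer surface, so resolving small depth differences at arm’s length only requires resolving small pixel displacements, which suits the short-range, high-detail job of face mapping.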

In contrast, the LiDAR sensor found on the rear of Apple’s products emits fewer infrared marker points – a regular grid of 24 x 24 dots. However, each dot is brighter than the Face ID projected points, which gives the rear LiDAR on handsets and tablets a much greater working range of up to 5 m.

And not only can the system determine shape information from distortions in the grid when the infrared dots are projected onto the scene in front of the sensor, it can also capture depth information based on time-of-flight: the round-trip time of the laser pulses emitted by the LiDAR chip depends on how close or far the reflecting surfaces are from the device’s infrared source.
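The time-of-flight principle itself reduces to a one-line formula: range is half the round-trip time multiplied by the speed of light. A quick sketch (illustrative, not Apple’s implementation) shows why the timing requirements are so demanding:

```swift
import Foundation

// Time-of-flight in one line: range is half the round trip at the speed
// of light. Values are illustrative, not from Apple's hardware spec.

let speedOfLight = 299_792_458.0   // metres per second

/// One-way range from a measured round-trip time in seconds.
func range(forRoundTripSeconds t: Double) -> Double {
    speedOfLight * t / 2.0
}

// At the sensor's ~5 m limit the round trip is only ~33 ns, which is why
// single-photon detectors with very fine timing resolution are needed.
let roundTrip = 33.3e-9
print(String(format: "Range: %.2f m", range(forRoundTripSeconds: roundTrip)))
```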

Patents such as US-20200256669 show that Apple uses a sparse array of single-photon avalanche diodes (SPADs) to perform a kind of optical stocktake on the whereabouts of the infrared range-finding LiDAR emissions. And the iPhone maker’s TrueDepth Face ID design also features innovations that allow the hardware to perceive distance more accurately, albeit on different length scales.
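For developers, the practical upshot is that Apple’s frameworks expose the fused output of this hardware rather than raw SPAD events. As an illustration, the sketch below shows the standard ARKit route to the rear LiDAR’s per-pixel depth map on iOS 14 and later; the class and property names are ARKit’s, but the minimal session setup is our own.

```swift
import ARKit

// Sketch of reading the rear LiDAR's fused depth output via ARKit
// (iOS 14+). Apps do not receive raw SPAD events; ARKit delivers a
// per-pixel depth map in metres instead.

class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // sceneDepth is only supported on LiDAR-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        // depthMap is a CVPixelBuffer of 32-bit floats, one distance per pixel.
        let width = CVPixelBufferGetWidth(depth.depthMap)
        let height = CVPixelBufferGetHeight(depth.depthMap)
        print("LiDAR depth map: \(width) x \(height)")
    }
}
```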

For the complete article CLICK HERE.

