
What Possessed Apple to Put a Lidar Sensor in the iPhone?


For the past six months I’ve been staring at the backside of my iPhone 13 Pro wondering what possessed Apple to build a Light Detection and Ranging (LIDAR) camera into its flagship smartphone.

From an article in The Register by Mark Pesce.

It’s not as though you need a time-of-flight depth camera, sensitive enough to chart the reflection time of individual photons, to create a great portrait. That’s like swatting a fly with a flamethrower – fun, but ridiculous overkill. There are more than enough cameras on the back of my mobile to be able to map the depth of a scene – that’s how Google does it on its Pixel phones. So what is Apple’s intention here? Why go to all this trouble?
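To make the "time-of-flight" part concrete: the sensor fires a pulse of light and times how long it takes to bounce back, and half that round trip multiplied by the speed of light gives the distance to the surface. A toy sketch of the arithmetic (purely illustrative, not anything from Apple's actual sensor stack; the names below are invented for the example):

```swift
// Time-of-flight ranging reduced to its arithmetic (illustrative only).
let speedOfLight = 299_792_458.0  // metres per second

func rangeMeters(roundTripSeconds: Double) -> Double {
    // The pulse travels out and back, so halve the round trip before converting to distance.
    return speedOfLight * roundTripSeconds / 2.0
}

// A 20-nanosecond round trip corresponds to roughly 3 metres.
print(rangeMeters(roundTripSeconds: 20e-9))  // ≈ 2.998
```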

The answer lies beyond the iPhone, and points to what comes next.

In the earliest days of virtual reality, thirty years ago, the biggest barrier to entry was the compute capacity necessary to render real-time three-dimensional graphics. Back in 1992, systems capable of real-time 3D looked like supercomputers and cost hundreds of thousands of dollars.

For the computer to see the world, it must be able to capture the world. This has always been hard and expensive. It requires supercomputer-class capabilities, and sensors that cost tens of thousands of dollars … Wait a minute. This is sounding oddly familiar, isn’t it?

Until just two years ago, LIDAR systems cost hundreds to thousands of dollars. Then Apple added a LIDAR camera to the back of its iPad Pro and iPhone 12 Pro. Suddenly a technology that had been rare and expensive became cheap and almost commonplace. The component cost for LIDAR suddenly dropped by two orders of magnitude – from hundreds of dollars per unit to a few dollars apiece.

Apple needed to do this because the company’s much-rumored AR spectacles will necessarily sport several LIDAR cameras, feeding their M1-class SoC with a continuous stream of depth data so that the mixed reality environment managed by the device maps neatly and precisely onto the real world. As far as Apple is concerned, the LIDAR on my iPhone doesn’t need to do much beyond drive component costs down for its next generation of hardware devices.
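On today's iPhones that depth stream is already exposed to developers through ARKit's scene-depth support. Below is a minimal sketch of subscribing to per-frame LIDAR depth: the ARKit calls (ARWorldTrackingConfiguration, frameSemantics, ARFrame.sceneDepth) are the standard API, while the DepthReader class and any resemblance to Apple's rumored headset pipeline are assumptions made for illustration.

```swift
import ARKit
import CoreVideo

// A minimal sketch: ask ARKit for LIDAR scene depth and read it each frame.
// DepthReader is a made-up name; the ARKit API calls themselves are standard.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only available on LIDAR-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth   // request per-frame depth from the LIDAR
        session.delegate = self
        session.run(config)
    }

    // Called for every camera frame; sceneDepth carries a depth map plus a confidence map.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = depth.depthMap   // 32-bit float distances in metres
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("depth frame: \(width) x \(height)")
    }
}
```

Whether Apple's glasses will consume anything like this is speculation, but it shows how cheap the raw depth feed already is to tap on the phone.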

For the complete article on what possessed Apple… CLICK HERE.


1 Comment

  • They are using the device owners as employees without paying them; it's Apple's top policy, and the strongest part of their enterprise is built on it. Meanwhile their 'fans' pay top $$$ for their loyalty.
