Apple today revealed the iPhone 12 lineup, and the onboard LiDAR scanner in the top-of-the-line Pro models could make them the best phones yet for AR experiences.
Apple is a big supporter of augmented reality technology. The company’s ARKit platform allows developers to build robust AR experiences for Apple’s iPhone and iPad devices that blend virtual objects and assets with the real world. iOS devices have always been capable AR viewers, but the new LiDAR-equipped Pro devices take things to a new level.
Apple released the iPad Pro with an onboard LiDAR scanner earlier this year, and later this month, the iPhone 12 Pro and 12 Pro Max will ship with the same sensor technology. The LiDAR scanner lets Apple’s new devices capture a depth map of your surroundings and detect surfaces in real time, which gives them much better positional tracking and object anchoring capabilities than other mobile devices.
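For developers, that extra hardware surfaces through ARKit’s scene reconstruction and scene depth options. The sketch below shows roughly how an app might opt into the LiDAR-driven features; the helper function name and the `arView` instance are placeholders for illustration, and the exact setup will vary by project.

```swift
import ARKit
import RealityKit

// Configure an AR session to take advantage of the LiDAR scanner where available.
// `arView` is assumed to be an existing RealityKit ARView in the app.
func runLiDARSession(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // Build a live mesh of the surroundings (supported only on LiDAR devices).
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Request per-frame depth maps produced by the LiDAR scanner.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }

    arView.session.run(configuration)
}
```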
The LiDAR scanner offers the accuracy needed to replicate the exact geometry of objects around you, allowing virtual items to interact realistically with your environment. The highly detailed depth map and scene geometry that the LiDAR scanner produces also enable accurate occlusion of virtual objects. In other words, if a virtual ball rolls behind the couch in your living room, the couch will properly hide it from view instead of the ball appearing to float on top of the real furniture.
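In RealityKit, that behaviour comes down to a couple of scene-understanding flags. A minimal sketch, again assuming an existing `ARView` called `arView` running a LiDAR-backed session, might look like this:

```swift
import RealityKit

// Let the reconstructed real-world mesh hide and collide with virtual content.
// `arView` is assumed to be an ARView running a LiDAR-backed AR session.
func enableSceneUnderstanding(on arView: ARView) {
    // Real surfaces occlude virtual objects that pass behind them.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    // Virtual objects can rest on and bounce off real surfaces.
    arView.environment.sceneUnderstanding.options.insert(.physics)
}
```

With occlusion enabled, the renderer uses the scanned mesh as a depth reference, so virtual content that moves behind real furniture is clipped just as a physical object would be.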
The LiDAR scanner is available only in the Pro versions of the iPad and iPhone, so the average person won’t get these features yet. We may see a trickle-down effect, where the sensor makes its way into the standard iPhone in the next generation. And if the LiDAR technology proves itself, we would expect to see the same sensors in Apple’s rumoured AR headset.