Apple execs talk about how the iPhone continues to push AR forward


While many people believe that headsets and glasses are the future of augmented reality, Apple executives show how the iPhone is already turning the AR field on its ear.

By building the depth-sensing LiDAR scanner into its newest line of high-end iPhones, Apple has shown that it’s taking AR seriously.

CNET sat down with Apple’s AR executives Mike Rockwell and Allessandra McGinnis to discuss Apple’s plans for augmented reality, why LiDAR has made its way to the iPhone, and why the iPhone could be the future of AR.

“AR has enormous potential to be helpful to folks in their lives across devices that exist today, and devices that may exist tomorrow, but we’ve got to make sure that it is successful,” said Rockwell, Apple’s head of AR. “For us, the best way to do that is to enable our device ecosystem, so that it is a healthy and profitable place for people to invest their time and effort.”

He makes a solid point. Including AR in a multi-purpose device that people will actually use is an excellent way to encourage both development of and engagement with AR. Standalone devices, such as virtual reality headsets, are a hard sell to the average consumer. Not only are they expensive, but they also exist solely to provide a VR experience — and therefore don’t make it into the hands of many people.

“It’s been a pretty hard road for developers that are VR-only, or are trying to do AR-only experiences,” Rockwell notes. “There just aren’t that many [devices] out there.”

An example of how the iPhone 12 Pro LiDAR sensor maps depth of real-world objects | Image Credit: Apple

CNET points out that Apple has sold hundreds of millions of AR-enabled iPhones and iPads in the past three years.

Additionally, the iPhone and iPad allow developers to explore the more practical side of augmented reality. Retailers, such as Ikea and Home Depot, take advantage of ARKit’s “Quick Look” feature.
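To give a sense of what “Quick Look” involves on the developer side, here is a minimal Swift sketch that presents a USDZ product model with QLPreviewController, the standard entry point for AR Quick Look. The “Chair.usdz” asset name is a placeholder for illustration, not something taken from the article or either retailer’s app.

```swift
import UIKit
import QuickLook

// Minimal AR Quick Look sketch: Quick Look recognizes the .usdz model and
// offers the "AR" tab automatically on supported devices.
final class ProductPreviewController: UIViewController, QLPreviewControllerDataSource {

    func showARPreview() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    // MARK: QLPreviewControllerDataSource

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
        return 1
    }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // "Chair.usdz" is a hypothetical asset bundled with the app.
        // NSURL conforms to QLPreviewItem, so the file URL can be returned directly.
        let url = Bundle.main.url(forResource: "Chair", withExtension: "usdz")!
        return url as NSURL
    }
}
```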

Apple has shifted a lot of effort to developing AR, especially as the coronavirus continues to limit consumers’ ability to shop in person. Giving customers the ability to preview items in their own homes provides benefits for retailers and consumers alike.

“Home Depot’s found that people are two to three times more likely to convert when they view a product in AR than others that don’t,” McGinnis, Apple’s senior product manager for AR, points out.

Over 10,000 augmented reality apps are available on the App Store, and many developers seem content to develop exclusively for the smartphone market. Adobe, known for programs like Photoshop and Illustrator, has gone on record stating that it’s following Apple and Google’s lead rather than developing for something like Facebook’s Oculus.

“Headsets are on our roadmap. None of them has reached the critical mass that makes sense for us to deploy,” says Adobe’s head of AR, Stefano Corazza, explaining why the company hasn’t explored headset creative tools beyond acquiring Medium from Oculus. “Until we have an Apple or Google putting something out there in broad scale, it doesn’t make a lot of sense for us to push it out.”

An AR image made using Adobe’s Aero iOS App | Image Credit: Adobe

Apple is keeping creatives in mind, too. With LiDAR in the iPhone 12 Pro, creatives can quickly scan real-life objects and create 3D assets for use in games, shopping apps, and more.

“That’s part of the reason why we put this scanner on the device. We felt like it was a key technology that could open up an explosion of 3D assets that can be used for all kinds of things,” Rockwell says. “It also opens the possibility of being able to start to scan environments in a way, and be able to make it easier to create 3D objects.”
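As a rough illustration of the kind of scanning Rockwell describes, the sketch below turns on ARKit’s LiDAR-backed scene reconstruction. It assumes a RealityKit ARView and a LiDAR-equipped device, and it only produces a live mesh of the surroundings; turning that mesh into a finished, reusable 3D asset would take further processing that isn’t shown here.

```swift
import ARKit
import RealityKit

// Sketch: enable LiDAR-based scene reconstruction so ARKit builds a mesh
// of the surrounding environment.
func startMeshScanning(on arView: ARView) {
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        print("Scene reconstruction requires a LiDAR-equipped device.")
        return
    }

    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .mesh     // LiDAR-backed environment mesh
    configuration.environmentTexturing = .automatic

    arView.session.run(configuration)
    // The reconstructed geometry arrives as ARMeshAnchor instances
    // via the ARSession delegate as the user moves the device around.
}
```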

The use cases for AR don’t stop there, either. As it turns out, the iPhone could champion future leaps in accessibility accommodations, particularly for the blind and partially sighted.

“There’s a lot more we can do, especially related to our understanding of the environment that is around us,” Rockwell says. “We can recognize people, but if you think about what a human being can understand about an environment, there’s no reason that in the fullness of time a device can’t have that level of understanding, too, and provide that to developers.”

But where Apple really sees AR shining is in the everyday. Combining AR with App Clips could allow for immersive experiences in the real world.

“Something that you’re dipping in and out of three, four, five, six times a day to do various things, and they’re lightweight experiences,” Rockwell says.

Tapping a QR code, as CNET points out, could allow a user to access a restaurant’s virtual menu or bring a museum exhibit to life.
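For illustration only, here is a hedged sketch of how an App Clip could pick up the URL encoded in a scanned QR or App Clip code and hand it to an AR view. The MenuARView type, the table identifier, and the URL shape are all invented for this example; the article doesn’t describe a specific implementation.

```swift
import SwiftUI

// Hypothetical AR menu view; a real App Clip would embed an ARView here.
struct MenuARView: View {
    let tableID: String?
    var body: some View {
        Text("AR menu for table \(tableID ?? "unknown")")
    }
}

@main
struct MenuClipApp: App {
    @State private var tableID: String?

    var body: some Scene {
        WindowGroup {
            MenuARView(tableID: tableID)
                // App Clips launched from a QR or App Clip code receive the
                // scanned URL through this user activity.
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    guard let url = activity.webpageURL else { return }
                    // e.g. https://menus.example.com/table/12 -> "12"
                    tableID = url.lastPathComponent
                }
        }
    }
}
```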

“The killer app is really that it’s going to be used on a kind of regular basis all the time in these little ways that help you to do the things that you do today, that make them easier and faster.”


