Head of Verizon’s XR Lab: ‘Apple’s use of LiDAR in the iPhone 12 Pro is a huge step in augmented reality’
Apple forever changed the course of 5G last week when it announced the iPhone 12, iPhone 12 Pro, iPhone 12 Pro Max and iPhone 12 mini, the first models in the device series to support 5G. The smartphones, which support both millimeter wave and sub-6 GHz 5G, are powered by the new A14 Bionic processor, which Apple claimed is the first smartphone chip built on a 5nm process and is “up to 50 percent faster” than the leading chips in Android smartphones.
Its support of next-generation cellular technology, however, is not the only significant feature in the iPhone 12. When it comes to the iPhone 12 Pro, in particular, Apple has included a technology called LiDAR, which stands for light detection and ranging, and while this technology has been around for some time, its inclusion in the latest iPhone has the potential to do big things for consumer augmented reality (AR).
“If you look at the iPhone 12 launch, they just incorporated LiDAR into the phone, which is a huge step in augmented reality,” the head of Verizon’s XR Lab, T.J. Vitolo, told RCR Wireless News, “specifically because it allows you to create much more complex experiences.”
He went on to explain that AR today is largely understood by how it functions in apps like Pokémon Go and Snapchat, in which a flat digital graphic, called a sticker, is applied on top of the environment.
“There’s really no depth associated with that,” said Vitolo. “LiDAR allows you to map the entire environment to provide that sense of depth, as well as what is really key to this: occlusion. If I want to place a digital coffee mug on the table and then I place my hand in front of that coffee mug, the LiDAR sensor will allow me to have that coffee mug be hidden behind my hand.”
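The occlusion test Vitolo describes comes down to a per-pixel depth comparison: wherever the real scene measured by the LiDAR depth map is closer to the camera than the virtual object, the virtual pixel is hidden. A minimal sketch (the values and the one-row "depth map" are hypothetical, purely for illustration):

```python
# Hypothetical occlusion check: compare real-world depth (from a LiDAR
# depth map) against the depth of a virtual object at each pixel.
real_depth = [0.4, 0.4, 2.0, 2.0]  # metres; a hand at 0.4 m, a table at 2.0 m
mug_depth = 1.0                    # depth of the virtual coffee mug

# The mug is visible only where the real scene is at least as far away.
visible = [d >= mug_depth for d in real_depth]
print(visible)  # [False, False, True, True] — the hand hides the mug
```

Without a real depth map, an AR app has no way to make this comparison, which is why flat "sticker" overlays always draw on top of everything.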
LiDAR accomplishes this by emitting laser pulses that bounce off objects and return to the sensor. By measuring how long each pulse takes to make the round trip, the distance to each object can be determined, and from those distances a 3D space can be mapped.
Basically, the iPhone 12 Pro sends out waves of light pulses, allowing the camera to “mesh” the dimensions of a space and the objects in it, so that those objects can interact more realistically with any digital object placed within them.
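The underlying time-of-flight arithmetic is simple: a pulse travels out to the object and back at the speed of light, so the distance is half the round-trip path. A minimal sketch of that calculation (illustrative only, not Apple's implementation):

```python
# Time-of-flight distance: a light pulse travels out and back,
# so the object's distance is half the total path length.
SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to an object given the round-trip time of a light pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after ~20 nanoseconds hit something ~3 metres away.
print(round(distance_from_round_trip(20e-9), 2))  # ~3.0
```

The nanosecond scale of those round trips is why LiDAR needs dedicated sensing hardware rather than an ordinary camera shutter.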
“Outside of creating a much better sense of realism for augmented reality,” Vitolo continued, “it allows you to up the ante on the type of games you can create for augmented reality. Now, twenty of my friends can go attack a 300-foot monster sitting in the center of downtown that’s being hidden behind trees and houses.”
Vitolo’s focus is on mobile gaming, so naturally, he sees Apple’s use of LiDAR as critical for the advancement of gaming. However, there are many other applications that could leverage LiDAR in smartphones.
“LiDAR gives you the ability to understand your physical self — does that mean I can try on a shirt or pants?” Vitolo posited.
LiDAR has the potential to be huge for the retail space, allowing consumers to try on clothes virtually or more accurately see what a piece of furniture looks like in their living room. But, as Vitolo pointed out, the possibilities are endless.
“Once this tool is in the hands of the developers, it really gives them the opportunity to experiment and try new things, so I really think that the things we’ve heard about are just scratching the surface,” he said.
Because of Apple’s prominence in the marketplace, LiDAR in the context of smartphones is expected to take off and create meaningful momentum for AR.
“When Apple jumps in and does something, they are seeing a market trend. This is a big signal to the market that augmented reality is real and it is going to be something significant in the future,” Vitolo said. “You’re going to start seeing augmented and mixed reality really advance over the next couple of years.”