The iPhone 12 Pro is Apple’s first smartphone with a Light Detection and Ranging (LiDAR) scanner on the back. But what does LiDAR do, and what is Apple planning to use it for in the future?
What Is LiDAR?
A LiDAR scanner determines the distance between itself and an object by monitoring how long it takes a pulse of light (often a laser) to bounce back. It’s like radar, except instead of radio waves, it uses infrared light.
While radar is designed to work across great distances, LiDAR operates at a much shorter range, because light is scattered and absorbed by objects in its path. By sending hundreds of thousands of light pulses every second, a LiDAR scanner can measure distances and object sizes with good accuracy at close range.
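To make the principle concrete, here's a minimal sketch of the time-of-flight calculation a LiDAR scanner performs: the distance to an object is half the pulse's round-trip time multiplied by the speed of light. (The function name and sample timing below are illustrative, not taken from any real scanner's firmware.)

```python
# Time-of-flight distance estimation, the principle behind LiDAR.
# The scanner times how long a light pulse takes to bounce back;
# the one-way distance is half the round trip times the speed of light.

SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_seconds):
    """Return the one-way distance in meters for a pulse's round trip."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after ~33 nanoseconds indicates an object
# roughly 5 meters away -- about the iPhone 12 Pro's effective range.
print(distance_from_round_trip(33e-9))  # ~4.95 meters
```

The tiny timescales involved (tens of nanoseconds over a room-sized distance) are why LiDAR hardware needs very precise timing circuitry.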
This data can then be used to construct 3D models, which is one of the main uses for LiDAR in construction and engineering projects. You’ve likely heard of 3D laser scans being used to draw up building plans—that’s LiDAR.
LiDAR has uses across many industries. Archaeologists use it to survey dig sites, and autonomous vehicles rely on it to construct real-time 3D maps of their surroundings. LiDAR has even been used to create highly realistic and accurate maps of race tracks in video games, like Project CARS. Police speed guns also use LiDAR.
And now, following the iPad Pro in March 2020, a LiDAR scanner has come to Apple’s premium iPhone 12 Pro.
How the iPhone 12 Pro Uses LiDAR
Apple uses LiDAR a bit differently than a construction crew or a speed gun does. It’s the same basic principle of bouncing light to determine distance, but on a smaller scale. The LiDAR scanner in the iPhone 12 Pro (and iPad Pro) has an effective range of around 16 feet (5 meters).
The primary purpose of LiDAR in the iPhone is to improve augmented reality (AR) implementation. It will give apps more useful and accurate information about their surroundings, for smoother, more reliable AR.
If you’re unfamiliar with the technology, AR allows developers to blend virtual objects with the real world. It uses your device’s camera to let you play games, apply interactive filters (like those on Snapchat), or preview the placement of furniture and other objects.
Pokémon Go is one example of a successful AR game that allows you to capture virtual creatures in the real world. With Ikea’s wildly successful Place app, you can see how most of the company’s catalog would look in your home.
LEGO is one of many companies that has also launched stand-alone products (in this case, building sets) that can “come alive” via AR features when you have a compatible smartphone.
While LiDAR is often used to scan buildings and other objects, the scanner on the iPhone 12 Pro and iPad Pro isn’t accurate enough to scan objects precisely. Sebastiaan de With, who developed the popular iPhone camera app Halide, discovered this while building a proof of concept called Esper.
“Unfortunately, the mesh output by the system right now isn’t accurate enough to send to a 3D printer,” de With wrote on the Halide website. “But it’s a great starting point for a 3D model, since all the proportions will be very accurate.”
In reality, LiDAR is likely to improve two main things: virtual object placement (as in shopping apps) and AR gaming. Both are already possible on non-LiDAR iPhones, but the scanner adds an extra layer of accuracy to things like dimensions and the precise distance to an object in a room.
You can also expect a more seamless AR experience, particularly when placing virtual items in the real world. For example, the iPhone 12 Pro should be better able to identify real-world items in the foreground. This should make for more realistic interactions between virtual and real objects.
Apple also intends to use LiDAR to improve camera performance in low light. The company introduced “focus pixels,” its brand of phase-detect autofocus (PDAF), in the iPhone XS. This technology still depends on light, which is why even the latest autofocus advancements don’t work all that well in the dark.
By sensing the distance between your iPhone and the subject you’re taking a picture of, Apple can tell the camera at what distance it should focus to get the best results. This should make it much easier to take better photos with your iPhone in the dark, especially when combined with Night mode.
RELATED: What Are the ARCore and ARKit Augmented Reality Frameworks?
Will LiDAR Become a Big Deal?
Presently, only two Apple devices have a LiDAR sensor. Both are priced at a premium and carry the “Pro” moniker, so LiDAR is a niche feature for now. However, that doesn’t mean software will be slow to catch on. Apple’s exhaustive list of software development kits (SDKs) includes ARKit, which was updated to version 4.0 in June 2020.
This update added new LiDAR-powered features to ARKit, allowing developers to take advantage of the iPad Pro and iPhone 12 Pro’s new sensor. SDKs like this make it possible for developers to target whole families of devices, even if they aren’t rocking the latest bells and whistles.
Apple’s plan likely involves putting LiDAR sensors in more devices over time, while developers are busy building apps that take advantage of the improved performance. Considering the company’s renewed interest in the technology over the last few software releases, Apple seems to be betting big on AR.
Apple’s biggest plans for LiDAR, however, might go well beyond tablets and smartphones. At least, that’s the opinion held by many analysts, as rumors swirl about the company’s AR glasses. If such a project comes to fruition, it makes sense that accurate AR would be foundational to the experience.
By encouraging developers to embrace AR, Apple can accelerate app availability on a new wearable platform. A slow rollout in a few high-end models follows Apple’s trend with past iPhone features, including haptic feedback, facial recognition, and multiple cameras.
Integrating hardware that directly benefits AR into select devices also gives the company the opportunity to fine-tune it before launching a product that leans more heavily on the technology.
Is LiDAR Worth an Upgrade?
When it comes to deciding between the iPhone 12 and iPhone 12 Pro, LiDAR is unlikely to sway you. Unless you use a lot of apps that take advantage of AR or shoot lots of photos at night, you won’t see much benefit in the short term.
Even if you’re a keen AR gamer or flat-pack addict, AR implementation in current non-LiDAR-enabled iPhones has improved dramatically in just a few generations. LiDAR does improve this even further, but it’s probably not worth the $300 premium Apple is asking for the iPhone 12 Pro.
Most people might not be overexcited about LiDAR right now, but the technology will trickle down and improve the overall iPhone experience in years to come. It also isn’t the only big advancement in the 2020 iPhone lineup—there’s a new ecosystem of MagSafe accessories, 5G support, and Dolby Vision recording.
RELATED: What 5G Means for Apple's iPhone 12