Why a Rear LiDAR Scanner on the New iPhone 12 is a Game Changer

Three iPhone 12s lying facedown on a wooden table

The news that Apple would be including LiDAR on its latest slate of iPhones has earned the company a great deal of buzz regarding the capabilities of this technology. Already a key attribute of self-driving cars, augmented reality, and other robotics tech, LiDAR’s presence in the latest iPhone brings the technology beyond its industry-specific uses and into a general user’s palm.

Although the same light-detecting capabilities were built into the latest model iPad Pros, the notion of having a LiDAR-equipped device in your pocket has opened up a new swath of possibilities. It particularly affects AR and 3D scanning. Here’s why the technology is such a game-changer for iPhone users, and where its wider adoption will bring us in the future.

Key Takeaways

  • The new iPhone 12 includes LiDAR, enhancing AR capabilities for everyday users.
  • LiDAR allows for advanced scanning, with a range of up to five meters and six times faster camera focus.
  • Apple aims to improve user experience in augmented reality, enabling practical applications like accurate measurements.
  • The integration of LiDAR with ARKit 4.0 offers developers tools for innovative apps and experiences.
  • With the iPhone 11 already 2020's best-selling phone, the iPhone 12 is poised to drive demand for 3D scanning applications.

Advanced Scanning

While the high-end iPhones will continue to use TrueDepth cameras for user-facing applications, the world-facing LiDAR scanner on the iPhone 12 Pro models advances the device’s scanning capabilities: a few feet of range for TrueDepth versus up to five meters for LiDAR. That is a paradigm shift in range, and one users will notice immediately when they start taking photos with their new iPhone.
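That range comes from time-of-flight measurement: the scanner times how long an emitted pulse of light takes to bounce off a surface and return. A minimal sketch of the underlying arithmetic in Swift (the function name and values are illustrative, not Apple's API):

```swift
import Foundation

// Illustrative sketch of the time-of-flight principle behind LiDAR:
// distance = (speed of light × round-trip time) / 2.
func distanceMeters(roundTripSeconds: Double) -> Double {
    let speedOfLight = 299_792_458.0 // meters per second
    return speedOfLight * roundTripSeconds / 2.0
}

// A target five meters away reflects a pulse in about 33 nanoseconds.
let roundTrip = 2.0 * 5.0 / 299_792_458.0
print(distanceMeters(roundTripSeconds: roundTrip)) // ≈ 5.0 meters
```

The division by two accounts for the pulse traveling out to the object and back again, which is why such short time scales translate into usable indoor ranges.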

Apple claims this technology allows the camera to focus six times faster than previous models, a claim attested to by the positive reviews coming out for the iPhone 12. Low-light and night mode portraits, in particular, are substantially improved by utilizing LiDAR. The upgrade is visible to any amateur or professional smartphone photographer.

These improvements in photo quality only scratch the surface of what the technology is capable of, but they make an excellent introduction for users eager to put their iPhone to greater practical use. The further capabilities LiDAR adds look beyond the quotidian ways we use our phones and toward a more virtual future.

The Here & Now

Augmented Reality (AR) apps have grown steadily in popularity, utilizing modern smartphone cameras and displays to create an illusion of holographic content. With the LiDAR release, Apple has clearly signaled its dedication to improving the user experience in this field. In fact, the company has said so itself.

In their announcement of the LiDAR capabilities in the most recent iPad, the company notes that it can be used to “…automatically calculate someone’s height” and to “let users more quickly and accurately measure objects.” These practical functions for the laser-scanning devices don’t just build on what we’ve done with our phones thus far. They ensure new applications as part of an ever-growing list of functions.
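The geometry behind those measurements is simple: once LiDAR depth data assigns a 3D position to each point a user taps, an object's size is just the straight-line distance between two such points. A sketch of that calculation in Swift (the `Point3` type and the sample coordinates are illustrative, not ARKit's API):

```swift
import Foundation

// Illustrative: a 3D point in meters, as LiDAR depth data might
// position two taps on opposite edges of an object.
struct Point3 {
    var x, y, z: Double
}

// Straight-line (Euclidean) distance between two 3D points.
func distance(from a: Point3, to b: Point3) -> Double {
    let dx = a.x - b.x
    let dy = a.y - b.y
    let dz = a.z - b.z
    return (dx * dx + dy * dy + dz * dz).squareRoot()
}

// Two points on opposite edges of a tabletop, in meters:
let edgeA = Point3(x: 0.0, y: 0.75, z: -1.2)
let edgeB = Point3(x: 0.9, y: 0.75, z: -1.2)
print(distance(from: edgeA, to: edgeB)) // ≈ 0.9 meters
```

What LiDAR contributes is the accuracy of those 3D positions; the measurement itself is ordinary geometry.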

Some popular examples include measuring rooms to ascertain the right fit and style when shopping for furniture or measuring the mobility of a physical rehabilitation patient through live scanning. These instances, of course, are meant for professionals, such as interior designers and physical therapists, who are likely to have already invested in an iPad to upgrade their business capabilities. With that in mind, what does LiDAR offer for the rest of us?

Everyday AR

The iPhone 12 brings LiDAR into every user’s pocket, turning it from a novel addition into a practical tool for everyday use. With a fast, highly accurate 3D scan of any object within five meters available on demand, users can put LiDAR to their own personal uses, and broader industry applications will follow.

Apple has signaled its interest in AR to app developers for years, going back at least to 2017, when the company first released ARKit. With the 2020 release of ARKit 4.0, Apple has created its most advanced development environment yet.

These 4.0 features, such as location anchors that draw on Apple Maps data and Face Tracking on the user-facing camera, can be combined with LiDAR for a cohesive user experience that goes beyond what was previously possible. The technological innovation AR developers were hoping for is quickly catching up to their ambitions.
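For developers, opting into the LiDAR-backed parts of that experience comes down to session configuration. A sketch of the ARKit setup, assuming an app with an existing `ARSession` (here named `session`):

```swift
import ARKit

// Opt into LiDAR-backed features where the hardware supports them.
let configuration = ARWorldTrackingConfiguration()

// Per-pixel scene depth from the LiDAR scanner (iOS 14+).
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    configuration.frameSemantics.insert(.sceneDepth)
}

// Mesh reconstruction of the surrounding environment.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
}

// session.run(configuration)  // `session` is the app's ARSession
```

Because both options are gated behind capability checks, the same app degrades gracefully on devices without the scanner.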

The pace of adoption in AR depends on the companies supplying LiDAR-equipped apps to the market. Given that the iPhone 11 has already become 2020’s best-selling phone, the impact Apple’s next release will have on consumer demand will likely be enormous. This will spur greater development of 3D scanning applications meant for everyday use. Apple has provided the toolkits, and now, with LiDAR, the technology needed is ready to bring AR to its widest use yet.
