The LiDAR sensor that Apple debuted on the iPad Pro and later brought to the iPhone 12 Pro and iPhone 12 Pro Max in 2020 enables, as we have seen so far, a range of augmented reality effects thanks to its ability to place objects precisely in 3D space. And although some apps have already taken advantage of it, the technology offers much more than what we have seen and is well suited to the development of other devices.
For iPhone users, the LiDAR sensor's contribution is noticeable in the camera quality of the new iPhone 12 Pro models and in the rather limited demonstrations that some apps offer so far. But the real promise of this technology lies in the virtual reality glasses we expect to see en masse in the coming years, where Apple and others will hopefully rely on advanced 3D maps of the real world to blend in virtual objects.
Currently, augmented reality headsets like Magic Leap and HoloLens already pre-scan their surroundings before placing virtual objects in them, and Apple's LiDAR-equipped AR technology works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the glasses, and they could pave the way for Apple to eventually make its own glasses.
The LiDAR sensor can be used to build 3D meshes of objects and rooms and superimpose photographic imagery on top, a technique called photogrammetry. That could be the next wave of capture technology for practical uses like home improvement, or even social media and journalism. The ability to capture 3D data and share it with others could turn these LiDAR-equipped phones and tablets into 3D content capture tools. LiDAR could also be used without the camera element to take measurements of objects and spaces.
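On Apple platforms, measuring with LiDAR goes through ARKit, which anchors points in a shared world coordinate space. As a minimal, platform-independent sketch of the idea (the `WorldPoint` type and the sample values are hypothetical, not Apple's API), a "measure" feature boils down to the Euclidean distance between two sensor-anchored points:

```swift
import Foundation

// Hypothetical 3D point as a LiDAR depth sample might provide it,
// expressed in metres in a shared world coordinate space.
struct WorldPoint {
    let x: Double
    let y: Double
    let z: Double
}

// Euclidean distance between two sampled points: the core of a
// measuring feature once the sensor has anchored both points.
func distance(_ a: WorldPoint, _ b: WorldPoint) -> Double {
    let dx = b.x - a.x
    let dy = b.y - a.y
    let dz = b.z - a.z
    return (dx * dx + dy * dy + dz * dz).squareRoot()
}

// Example: two corners of a tabletop, 1.2 m apart along the x axis.
let corner1 = WorldPoint(x: 0.0, y: 0.75, z: 0.0)
let corner2 = WorldPoint(x: 1.2, y: 0.75, z: 0.0)
print(String(format: "%.2f m", distance(corner1, corner2))) // prints "1.20 m"
```

In a real app, the two points would come from ARKit raycasts against the LiDAR-generated scene mesh rather than hard-coded values, but the geometry of the measurement itself is this simple.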