Go read this analysis of what the iPad Pro’s LIDAR sensor is capable of

April 27, 2020

Photo by Brennan King / The Verge

 

An in-depth analysis of the iPad Pro’s camera array has given us our best idea yet of what Apple’s first LIDAR-equipped device is capable of right now — and what it might be able to do in the future. While it’s unlikely to be of much use for portrait photographs, the analysis points toward some intriguing augmented reality use cases for LIDAR. The analysis was done by Sebastiaan de With, one of the developers behind the Halide and Spectre camera apps, and the author of similar breakdowns of the camera hardware on the iPhone 11, iPhone 11 Pro, and iPhone XS. All of them are worth a read for anyone with a passing interest in smartphone photography.

The bad news is that the iPad Pro’s new LIDAR sensor probably isn’t going to be of much use for aiding traditional photography. While there’s been some speculation that its depth-sensing capabilities could be used to sense subjects and blur backgrounds as part of a portrait mode, de With notes that it’s just not high-resolution enough to be able to sense faces. LIDAR sends out fewer dots than, say, the Face ID sensor, and they’re spaced farther apart. Plus, as of now, there aren’t any APIs available for developers to get the raw depth data from the sensor.
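To make the comparison concrete: the dense, per-pixel depth that a portrait mode relies on is the kind of data the Face ID (TrueDepth) camera already hands to apps through AVFoundation. Here is a rough Swift sketch of requesting that depth delivery (the function name and pared-down session setup are illustrative, not from any shipping app); at the time of writing, nothing comparable exposed the LIDAR sensor's raw depth.

```swift
import AVFoundation

// Rough sketch: a capture session that asks the TrueDepth (Face ID) camera to
// deliver per-pixel depth alongside photos. Function name and structure are
// illustrative only.
func makeTrueDepthSession() -> AVCaptureSession? {
    // The TrueDepth camera is the front-facing depth sensor used by Face ID.
    guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                               for: .video,
                                               position: .front),
          let input = try? AVCaptureDeviceInput(device: device) else {
        return nil // No TrueDepth hardware on this device.
    }

    let session = AVCaptureSession()
    guard session.canAddInput(input) else { return nil }
    session.addInput(input)

    let photoOutput = AVCapturePhotoOutput()
    guard session.canAddOutput(photoOutput) else { return nil }
    session.addOutput(photoOutput)

    // Depth delivery must be opted into on the photo output; it is only
    // supported when a depth-capable camera is feeding the session.
    photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported

    return session
}
```

The point of the contrast is that even if Apple did open up the LIDAR data, its sparse, widely spaced grid of dots would be a poor substitute for this kind of dense depth map.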

When the latest iPad Pro was announced, Apple emphasized that its LIDAR sensor was instead meant for augmented reality, and de With notes that here its abilities are much more impressive. “AR experiences are way more seamless and accurate,” de With says. “You no longer have to ‘calibrate’ the device by waving it around in space. It can detect and classify features in your room like windows. It can even measure how tall people are.” De With outlines one concept the Halide team is working on that uses the sensor to scan a room and turn it into a 3D object you can upload and share with others.
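On the developer side, that room-scanning workflow maps onto the scene reconstruction mode ARKit gained alongside this iPad Pro, which is only supported on LIDAR hardware and delivers the room to apps as a classified mesh rather than raw depth. A minimal Swift sketch, assuming a RealityKit ARView and with the class name invented for illustration:

```swift
import UIKit
import ARKit
import RealityKit

// Minimal sketch of the kind of room scanning de With describes: turn on ARKit's
// scene reconstruction (LIDAR-only) and receive the room as classified mesh anchors.
// Class name and logging are illustrative.
class RoomScanViewController: UIViewController, ARSessionDelegate {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)
        arView.session.delegate = self

        let configuration = ARWorldTrackingConfiguration()
        // Scene reconstruction is only reported as supported on LIDAR devices,
        // such as the 2020 iPad Pro.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
            configuration.sceneReconstruction = .meshWithClassification
        }
        configuration.planeDetection = [.horizontal, .vertical]
        arView.session.run(configuration)
    }

    // ARKit delivers the reconstructed room piecewise as ARMeshAnchor objects.
    // Each mesh face carries a classification such as .wall, .floor, .window, or
    // .door, which is what makes detecting features like windows possible.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for mesh in anchors.compactMap({ $0 as? ARMeshAnchor }) {
            print("New mesh chunk with \(mesh.geometry.faces.count) faces")
        }
    }
}
```

Turning that accumulated mesh into a shareable 3D object, as in the Halide concept, would then be a matter of stitching the anchors' geometry together and exporting it in a standard 3D file format.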

So there’s bad news and good news when it comes to the iPad Pro’s cameras. The bad news is that, for traditional photography, you’re getting sensor hardware and image quality comparable to an older device like the iPhone 8. It’s not a terrible shooter by any means, but it can’t match the iPhone 11 or a Pixel. The good news is that the LIDAR sensor could enable some genuinely interesting augmented reality use cases, so long as developers have enough incentive to build support for it.
