Lidar feature of iPhone 12 Pro: see it in action with this 3D scanning app


The Canvas 3D app scans homes using the iPhone 12 Pro's lidar. Expect a lot more of that.


The iPhone 12 Pro’s lidar depth-scanning sensor looks set to open up many possibilities for 3D scanning apps on phones. A new one designed for home scanning, called Canvas, uses lidar to add precision and detail. But the app will also work on non-Pro iPhones going back to the iPhone 8.

Canvas’s approach indicates how lidar could work in iPhone 12 Pro apps: it can add extra precision and detail to processes that are already possible by other means on phones and tablets not equipped with lidar.

Read more: The lidar technology of the iPhone 12 does more than improve photos. Check out this cool party trick

Canvas, created by Boulder-based Occipital, originally launched for the iPad Pro earlier this year to take advantage of its lidar scanning. When I saw a demo of its capabilities at the time, I took it as a sign of how Apple’s depth-sensing technology could be applied to home improvement and measurement apps. The updated app makes scans that are crisper and sharper.

Since the lidar-equipped iPhones launched, a handful of optimized apps have emerged offering 3D scanning of objects, large-scale space scanning via photos (a process called photogrammetry) and augmented reality that can blend mapped spaces with virtual objects. But the sample scan from Occipital’s Canvas app on the iPhone 12 Pro, embedded below, looks sharper than those of the 3D scanning apps I’ve used so far.

Apple’s iOS 14 gives developers more raw access to the iPhone’s lidar data, according to Occipital product vice presidents Alex Schiff and Anton Yakubenko. That has allowed Occipital to build its own algorithms to make the best use of Apple’s lidar depth map. It could also let Occipital apply depth-mapping data to future improvements of its app for phones without lidar.

Digitizing 3D space without dedicated depth-mapping or time-of-flight sensors is possible, and companies such as 6d.ai (acquired by Niantic) already do it. But Schiff and Yakubenko say lidar still offers a faster and more accurate upgrade to that technology. The Canvas version for the iPhone 12 Pro scans in more detail than the first version on the iPad Pro earlier this year, mainly because of iOS 14’s deeper access to lidar information, according to Occipital. The newest lidar-capable version is accurate to within 1%, while non-lidar scanning is accurate to within 5% (literally making the iPhone 12 Pro a pro-level upgrade for those who need the boost).
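To make those percentages concrete, here is a small illustrative calculation, assuming (as is typical for scan accuracy claims) that the error scales with the measured length; the 5-meter wall is a hypothetical example, not a figure from Occipital.

```python
# Illustrative arithmetic only: what 1% vs. 5% scan accuracy means
# in practice, assuming error scales with the measured length.

def worst_case_error_cm(length_m: float, accuracy_pct: float) -> float:
    """Worst-case measurement error in centimeters for a given
    length (meters) and accuracy expressed as a percentage."""
    return length_m * 100 * accuracy_pct / 100.0

wall_m = 5.0  # a hypothetical 5-meter wall

print(worst_case_error_cm(wall_m, 1.0))  # lidar scan: off by up to 5 cm
print(worst_case_error_cm(wall_m, 5.0))  # non-lidar scan: off by up to 25 cm
```

For a room-scale scan, the difference between being off by a few centimeters and being off by a quarter of a meter is what separates a rough sketch from a measurement a contractor can actually use.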

Yakubenko says that by Occipital’s earlier measurements, Apple’s iPad Pro lidar delivers 574 depth points per frame in a scan, but depth maps can jump to 256×192 points for developers on iOS 14. The extra detail is inferred through AI and camera data.

Room scans can be converted into workable CAD models, in a process that takes about 48 hours, but Occipital is also working on converting scans more instantly and on adding semantic data (such as recognizing doors, windows and other room details) with AI.

As more 3D scans and 3D data start to live on iPhones and iPads, common formats for sharing and editing files will also make sense. While iOS 14 uses the USDZ file format for 3D files, Occipital has its own format for its depth scans and can export to .rvt, .ifc, .dwg, .skp and .plan formats when converting to CAD models. At some point, 3D scans could become as standardized as PDFs. We’re not there yet, but we may need to get there soon.