Alongside iOS 13.4 and iPadOS 13.4, Apple has released ARKit 3.5, which lets developers take advantage of the LiDAR Scanner in the new iPad Pro. The latest ARKit release features Scene Geometry, Instant AR, and improved Motion Capture and People Occlusion.
Apple announced the update in a post on its developer site today:
ARKit 3.5 takes advantage of the new LiDAR Scanner and depth-sensing system on iPad Pro to support a new generation of AR apps that use Scene Geometry for enhanced scene understanding and object occlusion. And now, AR experiences on iPad Pro are even better with instant AR placement, and improved Motion Capture and People Occlusion — all without the need to write any new code.
Here’s how Apple describes the three major changes that come with ARKit 3.5:
Scene Geometry

Scene Geometry lets you create a topological map of your space with labels identifying floors, walls, ceilings, windows, doors, and seats. This deep understanding of the real world unlocks object occlusion and real-world physics for virtual objects, and also gives you more information to power your AR workflows.
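For developers curious what opting in looks like, here is a minimal Swift sketch of enabling Scene Geometry in an ARKit session. It assumes a RealityKit `ARView` named `arView` already exists in the app; the configuration and classification APIs are ARKit's own.

```swift
import ARKit
import RealityKit

// Assumed to exist elsewhere in the app: a RealityKit view.
// let arView = ARView(frame: .zero)

let config = ARWorldTrackingConfiguration()

// Mesh reconstruction with per-face classification (wall, floor, seat, etc.)
// requires the LiDAR Scanner, so check device support before opting in.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
    config.sceneReconstruction = .meshWithClassification
}

arView.session.run(config)

// The reconstructed mesh then arrives as ARMeshAnchor instances in the
// session, with each mesh face carrying an ARMeshClassification value
// such as .wall, .floor, .ceiling, .window, .door, or .seat.
```

The classification labels on the mesh are what make object occlusion and physics against real surfaces possible without any manual scene markup.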
Instant AR

The LiDAR Scanner on iPad Pro enables incredibly quick plane detection, allowing for the instant placement of AR objects in the real world without scanning. Instant AR placement is automatically enabled on iPad Pro for all apps built with ARKit, without any code changes.
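As Apple notes, no code changes are required to benefit, but for context, placement in ARKit apps typically goes through a raycast like the sketch below. On LiDAR hardware the raycast can resolve against real surfaces immediately rather than after a scanning pass. `arView` and `makeSphere()` are assumed placeholders for the app's own view and content.

```swift
import ARKit
import RealityKit

// Assumed to exist elsewhere in the app:
// let arView = ARView(frame: .zero)
// func makeSphere() -> ModelEntity { ... }  // whatever entity the app places

func placeObject(at screenPoint: CGPoint) {
    // Allowing estimated planes means placement works even before full
    // plane detection completes; with LiDAR this is effectively instant.
    let results = arView.raycast(from: screenPoint,
                                 allowing: .estimatedPlane,
                                 alignment: .any)
    if let hit = results.first {
        // Anchor the virtual object at the real-world hit location.
        let anchor = AnchorEntity(world: hit.worldTransform)
        anchor.addChild(makeSphere())
        arView.scene.addAnchor(anchor)
    }
}
```

Because the improvement lives below this API, existing apps that already raycast for placement get the faster behavior on iPad Pro for free.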
Improved Motion Capture and People Occlusion
With ARKit 3.5 on iPad Pro, depth estimation in People Occlusion and height estimation in Motion Capture are more accurate. These two features improve on iPad Pro in all apps built with ARKit, without any code changes.
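Again, existing apps inherit the accuracy gains automatically. For reference, this is roughly what opting into the two features looks like in Swift; the configuration and semantics APIs are ARKit's, while `session` is an assumed existing `ARSession`.

```swift
import ARKit

// People Occlusion: ask ARKit to segment people and estimate their depth
// so virtual content renders correctly behind and in front of them.
let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    config.frameSemantics.insert(.personSegmentationWithDepth)
}

// Motion Capture uses a dedicated body-tracking configuration; tracked
// skeleton data arrives as ARBodyAnchor updates in the session delegate.
if ARBodyTrackingConfiguration.isSupported {
    let bodyConfig = ARBodyTrackingConfiguration()
    // session.run(bodyConfig)  // `session` is the app's existing ARSession
}
```

The ARKit 3.5 improvements (better depth estimation for occlusion, better height estimation for body tracking) apply to this same opt-in code without modification.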
Apple has shared some examples of how the LiDAR Scanner on the iPad Pro will improve existing AR experiences, but we'll have to wait until this summer or fall to hear more about the all-new AR tools and apps that could be enabled by the new hardware.