Apple Introduces New AR Toolkit: RealityKit 2

Create Realistic Shared AR Spaces with RealityKit 2


Published: June 10, 2021


Rory Greener

US tech giant Apple is moving from strength to strength each week, and XR Today has been reporting on the company’s patents for improved facial biometrics and AR displays for low-light conditions.

XR Today also reported on the latest updates to iOS 15 this week, which included augmented reality (AR) integration into Apple Maps.

But the Cupertino-based firm has now launched RealityKit 2: a software development kit (SDK) that makes AR content creation faster and easier.

ARKit and Swift API Integration

Built from the ground up, RealityKit 2 ties together several of Apple's toolkits, including ARKit and a native Swift API, giving developers a strong platform for creating mobile AR experiences.

ARKit offers developers advanced motion tracking, scene capture, and processing, while the Swift API lets them build AR content in Apple's modern programming language, streamlining the creation process.

Together, ARKit and the Swift API give RealityKit 2 a strong foundation to build on.
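To give a flavour of how the pieces fit together, the short Swift sketch below (illustrative only, not drawn from Apple's documentation) runs an ARKit world-tracking session inside a RealityKit ARView and anchors a simple box entity to a detected horizontal surface; the view controller, mesh size, and colour are placeholder choices.

import ARKit
import RealityKit
import UIKit

// Minimal sketch: a RealityKit ARView driven by an ARKit world-tracking session.
final class SimpleARViewController: UIViewController {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // ARKit supplies motion tracking and plane detection for the session.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        arView.session.run(configuration)

        // RealityKit's Swift API builds and anchors the virtual content.
        let box = ModelEntity(mesh: .generateBox(size: 0.1),
                              materials: [SimpleMaterial(color: .blue, isMetallic: false)])
        let anchor = AnchorEntity(plane: .horizontal)
        anchor.addChild(box)
        arView.scene.addAnchor(anchor)
    }
}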

New Features

New and improved features are also available to developers through RealityKit 2, improving AR production pipelines and adding a standout new Object Capture feature.

Built on Apple's existing AR technology, Object Capture lets developers turn photos taken on an iPhone or iPad into high-quality, optimised 3D models for AR.
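On the Mac, Object Capture is exposed through RealityKit's PhotogrammetrySession API. The sketch below is a rough outline of that flow rather than production code; the input folder, output file name, and detail level are placeholder assumptions.

import Foundation
import RealityKit

// Rough sketch of Object Capture on macOS: turn a folder of photos into a USDZ model.
// The paths and detail level below are placeholders.
func reconstructChair() async throws {
    let photos = URL(fileURLWithPath: "Captures/Chair", isDirectory: true)
    let output = URL(fileURLWithPath: "Captures/Chair.usdz")

    var configuration = PhotogrammetrySession.Configuration()
    configuration.featureSensitivity = .normal

    let session = try PhotogrammetrySession(input: photos, configuration: configuration)

    // Ask for a reduced-detail model, which suits mobile AR delivery.
    try session.process(requests: [.modelFile(url: output, detail: .reduced)])

    // The session streams progress and results back asynchronously.
    for try await message in session.outputs {
        switch message {
        case .requestProgress(_, let fraction):
            print("Reconstruction \(Int(fraction * 100))% complete")
        case .requestComplete(_, .modelFile(let url)):
            print("Model written to \(url.path)")
        case .requestError(_, let error):
            print("Reconstruction failed: \(error)")
        default:
            break
        }
    }
}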

This gives developers with limited 3D modelling experience a way to create interactive objects, and fully animated 3D assets can be imported and later modified by the end user. Developers can also add spatial audio sources and synchronise assets across devices for shared AR experiences.
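For the spatial audio side, the following sketch shows roughly how a positional sound source might be attached to an entity with RealityKit's audio API; the file name and looping behaviour are illustrative assumptions.

import RealityKit

// Sketch: attach a looping spatial audio source to an entity so the sound appears
// to come from the object's position in the room. "chime.mp3" is a placeholder asset.
func attachSpatialAudio(to entity: Entity) throws -> AudioPlaybackController {
    let resource = try AudioFileResource.load(named: "chime.mp3",
                                              inputMode: .spatial,
                                              loadingStrategy: .preload,
                                              shouldLoop: true)
    // Playback is positioned at the entity, panning and attenuating as the user moves.
    return entity.playAudio(resource)
}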

Using the LiDAR scanner on devices such as the iPhone 12 Pro, developers can add reflections, shadows, camera noise, and motion blur, and place AR objects that realistically interact with and reflect their real-world environments.

New custom shaders and improved object occlusion work with the LiDAR scanner to create immersive 3D objects.
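The sketch below gives a rough idea of how these LiDAR-driven effects are switched on from code, assuming an existing ARView on a LiDAR-equipped device; the particular scene-understanding options chosen are illustrative.

import ARKit
import RealityKit

// Sketch: enable LiDAR-driven scene reconstruction so virtual objects are occluded
// by real surfaces and blend with the environment. Assumes `arView` already exists.
func enableRealisticRendering(for arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh   // requires a LiDAR scanner
    }
    configuration.environmentTexturing = .automatic // drives reflections on virtual objects
    arView.session.run(configuration)

    // Let real-world geometry occlude virtual content and receive its lighting and shadows.
    arView.environment.sceneUnderstanding.options.formUnion([.occlusion, .receivesLighting])

    // Camera noise, motion blur, and grounding shadows are applied by default;
    // individual effects can be opted out of via arView.renderOptions if needed.
}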

The SDK also offers further support for shared AR spaces, with optimised networking and scalable performance models that make better use of the graphics processing unit (GPU).
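RealityKit's scene synchronisation runs over Multipeer Connectivity. The following hedged sketch shows how a shared session might be wired up; peer discovery, invitations, and error handling are left out for brevity.

import MultipeerConnectivity
import RealityKit
import UIKit

// Sketch: synchronise a RealityKit scene across nearby devices.
func enableSharedExperience(for arView: ARView) throws {
    let peerID = MCPeerID(displayName: UIDevice.current.name)
    let mcSession = MCSession(peer: peerID,
                              securityIdentity: nil,
                              encryptionPreference: .required)

    // Once a synchronization service is attached, RealityKit replicates entity
    // state and ownership across all connected peers automatically.
    arView.scene.synchronizationService =
        try MultipeerConnectivityService(session: mcSession)
}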

With Apple driving full speed into the AR space, its continued support should see the SDK picked up by a broad mix of developers and brands.

Both RealityKit 2 and ARKit can be accessed at Apple’s developer website, along with further documentation for review.

