Apple Opens Vision Pro Developer Program Applications

Developers and teams worldwide can now code for the celebrated mixed reality headset


Published: July 25, 2023

Demond Cureton

Apple has opened applications for its Vision Pro Developer Kit, the company revealed on Monday.

In a press statement, the Cupertino-based firm said it had opened its developer programme for the long-awaited headset, allowing coders to test their apps on the novel spatial computing platform.

The developments follow the product’s massive debut at the Worldwide Developers Conference (WWDC) in early June. At the event, numerous executives, including Chief Executive Tim Cook, showcased the headset and its many features.

Following up on the event, Apple will allow developers to tap its visionOS software development kit (SDK) to develop and test apps for Apple devices.

Getting with the Programme

Select developers can also receive the Vision Pro before 2024 to test and trial their apps via the programme. The Developer Kit aims to help developers create “amazing spatial experiences by letting you quickly build, iterate, and test on the Vision Pro,” according to the press release.

Developer Kit access also comes with coding and development support from Apple engineers. To apply, potential coders must register with the Apple Developer Program, indicating the skills and assets they can contribute to the device’s operating system.


Apple will later review the requests and pick a select few developers to work with the programme. Those selected will receive the Vision Pro as a loaned device, with Apple conducting regular check-ins on the headset.

Developers must agree to keep the product in a “private, secure workspace accessible only by you and your authorized developers.”

Top Features in the Apple Developer Toolbox

According to the website, Apple developers will receive access to the following tools:

  • SwiftUI to create windows, volumes, and spatial experiences for visionOS, iPadOS, or iOS. It supports 3D capabilities, depth, effects, gestures, immersive scenes, and RealityKit integration (see the first sketch after this list).
  • RealityKit, Apple’s engine for rendering 3D content, visual effects (VFX), and animations. Users can tweak physical lighting conditions and shadows, create portals across immersive environments, build VFX scenes, and more. It also leverages the open standard MaterialX for surface and geometry shading (a RealityView sketch follows the list).
  • ARKit, which lets users interact with apps in physical and immersive spaces. ARKit also provides Shared Space support for virtual environments with application programming interfaces (APIs) such as Plane Estimation, Image Anchoring, World Tracking, Scene Reconstruction, and Skeletal Hand Tracking (an ARKitSession sketch follows the list).
  • Accessibility, for assisting people with disabilities in using the device. People can control the device using only their eyes, their voice, or a combination of the two. It also offers Pointer Control, which lets users steer the device with their index finger, wrist, or head.
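
To illustrate the SwiftUI entry point, here is a minimal sketch of a visionOS app declaring a standard window alongside a volumetric one. This is not Apple sample code; the app name, view contents, and the “Globe” asset are hypothetical placeholders.

```swift
import SwiftUI
import RealityKit

@main
struct SpatialDemoApp: App {
    var body: some Scene {
        // A conventional 2D window rendered in the shared space.
        WindowGroup(id: "main") {
            Text("Hello, visionOS")
                .padding()
        }

        // A volume: a bounded container for 3D content.
        WindowGroup(id: "volume") {
            Model3D(named: "Globe") // hypothetical asset name
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)
    }
}
```

The same SwiftUI scene types carry over from iPadOS and iOS; the volumetric window style is what marks the scene out as spatial content.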
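
On the RealityKit side, visionOS bridges the engine into SwiftUI through the RealityView container. The sketch below, a minimal example assuming the standard RealityView API rather than anything from Apple’s kit materials, builds a sphere with a physically based material:

```swift
import SwiftUI
import RealityKit

struct SphereView: View {
    var body: some View {
        RealityView { content in
            // Generate a simple sphere mesh.
            let mesh = MeshResource.generateSphere(radius: 0.1)

            // Physically based material: tint and roughness affect
            // how the engine lights and shades the surface.
            var material = PhysicallyBasedMaterial()
            material.baseColor = .init(tint: .blue)
            material.roughness = 0.3

            // Wrap mesh and material in an entity and add it to the scene.
            let sphere = ModelEntity(mesh: mesh, materials: [material])
            content.add(sphere)
        }
    }
}
```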
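
ARKit on visionOS is organised around data providers that an ARKitSession runs, streaming anchor updates asynchronously. The following sketch, assuming the visionOS ARKitSession and PlaneDetectionProvider APIs, listens for horizontal plane updates; on a real device this requires an immersive space and user authorisation:

```swift
import ARKit

func trackPlanes() async throws {
    let session = ARKitSession()
    let planes = PlaneDetectionProvider(alignments: [.horizontal])

    // Start plane detection; throws if the provider is unsupported
    // or authorisation is denied.
    try await session.run([planes])

    // Anchor updates arrive as an async sequence of add/update/remove events.
    for await update in planes.anchorUpdates {
        print("Plane \(update.anchor.id): \(update.event)")
    }
}
```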

The SDK also integrates with Xcode for building and previewing new apps. The programme additionally includes Reality Composer Pro for preparing 3D content for visionOS apps, complete with sound, modelling, and materials.

Finally, the developer programme will offer Unity support, letting developers create apps with passthrough and dynamically foveated rendering. Unity-built apps will also integrate with Apple’s RealityKit rendering capabilities to maximise compatibility between the 3D platforms.
