In a recent Apple developer blog post, the firm chose to highlight the experiences of four XR insiders who spent time working at its Labs developer suite.
Apple’s Developer Labs allow individuals to leverage the firm’s hardware, software, and resources to create applications across the Apple OS ecosystem – including visionOS, iPadOS, and iOS.
The Cupertino, London, Munich, Shanghai, Singapore, and Tokyo lab locations give developers self-directed coding and design opportunities. The spaces also let developers test and optimize immersive applications for visionOS.
The Apple-built spaces encourage XR developers to create content under the firm’s guidance. In the bigger picture, immersive technology learning opportunities – like Apple’s labs and others – help grow the developer talent pool and drive innovation.
A Proving Ground for Immersive Development
Michael Simmons, the CEO of Flexibits, said that the experience of working in a Vision Pro Lab for the first time was “fantastical. It felt like I was part of the app.”
Simmons explained that the Cupertino-based lab works as a “proving ground” for immersive application innovation and growth. The Vision Pro Labs also give XR developers a space to push the entire Apple OS ecosystem beyond its limitations.
In combination with the Vision Pro device, the Apple-branded developer labs allow XR creators to experience spatial computing, expanding their AR/MR/VR knowledge.
Simmons explained:
A bordered screen can be limiting. Sure, you can scroll, or have multiple monitors, but generally speaking, you’re limited to the edges. Experiencing spatial computing not only validated the designs we’d been thinking about — it helped us start thinking not just about left to right or up and down, but beyond borders at all.
Building for a Boundless Canvas
XR developer David Smith explained that the Apple Lab he visited “checked everything off my list.”
Smith explained:
With Apple Vision Pro and spatial computing, I’ve truly seen how to start building for the boundless canvas — how to stop thinking about what fits on a screen and that will help us make better apps.
Smith also gave details on visionOS’s on-site testing feature, which allows developers to test spatial applications in a Vision Pro spatial environment.
Smith added:
I’d been staring at this thing in the simulator for weeks and getting a general sense of how it works, but that was in a box. The first time you see your own app running for real, that’s when you get the audible gasp.
The experienced developer remarked that an Apple lab visit helps answer questions that only surface once an application is running on-device.
Smith also noted:
It’s not necessarily that I solved all the problems — but I solved enough to have a sense of the kinds of solutions I’d likely need. Now there’s a step change in my ability to develop in the simulator, write quality code, and design good user experiences.
Porting Pre-Existing Applications
Spool Chief Experience Officer Ben Guerrette and his team leveraged the Apple XR lab to explore new spatial interactions for the company’s smartphone application.
Guerrette explained:
What’s different about our editor is that you’re tapping videos to the beat. Spool is great on touchscreens because you have the instrument in front of you, but with Apple Vision Pro you’re looking at the UI you’re selecting — and in our case, that means watching the video while tapping the UI.
The Spool team didn’t know whether the application would work on visionOS, but the port succeeded.
The chief experience officer also added:
Now we understand where to go. That kind of learning experience is incredibly valuable: It gives us the chance to say, ‘OK, now we understand what we’re working with, what the interaction is, and how we can make a stronger connection.’
Getting Hands-On with the Apple Vision Pro
Chris Delbuck, a Principal Design Technologist at Slack, explained how the labs let developers get hands-on with the Apple Vision Pro device and, in turn, understand the potential of XR and spatial computing.
Delbuck said:
It instantly got me thinking about how 3D offerings and visuals could come forward in our experiences.
The Slack representative explained that they would not have fully understood the potential of XR “without having the [Vision Pro] device in hand.”
More on Apple Vision Pro
Apple is readying its spatial computing device as 2024 shapes up to be a busy year for XR – with MR roadmaps arriving from Meta, Microsoft, and Google.
Notably, while some audiences are sceptical of Apple’s pricey entry into the market, competitors recognize Apple’s potential. Earlier this month, reports highlighted how Google and Samsung face a “great fear” when Apple releases new products.
The Vision Pro device is due in 2024 for $3,499. Apple’s Vision Pro headset sets itself apart with a three-layered approach designed to give users an accessible, easy-to-use, and flexible spatial computing interface:
- Windows that represent the device’s 2D user interface.
- Volumes that provide RT3D immersive experiences.
- Spaces that create the spatial computing environment in which Volumes and MR applications exist.
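For developers, these three layers correspond to scene types in visionOS’s SwiftUI framework. The sketch below is a minimal, hypothetical illustration of how an app might declare all three – the identifiers and the sphere content are illustrative assumptions, not from Apple or the article:

```swift
import SwiftUI
import RealityKit

// Hypothetical visionOS app sketch declaring the three scene types.
// Scene identifiers ("MainWindow", etc.) are illustrative assumptions.
@main
struct SpatialSketchApp: App {
    var body: some Scene {
        // Window: a familiar, bordered 2D interface pane.
        WindowGroup(id: "MainWindow") {
            Text("A bordered 2D window")
        }

        // Volume: a bounded region for RT3D content.
        WindowGroup(id: "ContentVolume") {
            RealityView { content in
                // Place a simple 3D sphere entity inside the volume.
                let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1))
                content.add(sphere)
            }
        }
        .windowStyle(.volumetric)

        // Space: a full spatial environment hosting immersive MR content.
        ImmersiveSpace(id: "ImmersiveArea") {
            RealityView { content in
                // Immersive RealityKit entities would be composed here.
            }
        }
    }
}
```

The key design point the article’s list captures is that a single app can move between these layers: it might launch as a Window, promote content into a Volume, and only open an ImmersiveSpace when the user opts into full immersion.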
Apple backs this three-tier spatial computing structure with competitive hardware, including a custom M2 Silicon chip, Apple’s purpose-built R1 co-processor, a 23 million pixel display spread across two micro-OLED panels for a total resolution of 4096 x 5464 pixels, high-dynamic range (HDR) and wide colour gamut (WCG) output, a two-hour battery life, an immersive camera for capturing spatial audio, photos, and video for peer-to-peer sharing, iPhone/iPad/Mac synchronization, a light seal, a LiDAR scanner, and a TrueDepth camera.
Despite growth and outreach plans via its Lab spaces, Apple is facing some hurdles in getting Vision Pro devices to market.
In July, reports suggested that Apple’s overseas manufacturing partner Luxshare Precision Industry Co. reduced its initial product assembly forecast to 400,000 units, down from Apple’s 1 million unit forecast and Luxshare’s internal forecast of producing 18 million units annually in the coming years.
Two of Apple’s component manufacturing partners also reduced production forecasts to roughly 130,000 to 150,000 units.
According to the July reports, Apple is facing manufacturing “complexity”, leading to “difficulties” in production stemming from the device’s micro-OLED displays and curved, outward-facing lens. The firm also expressed dissatisfaction with some of its production partners.
The news came following reports highlighting how the production of Apple’s Vision Pro device will only cost its manufacturers roughly $1,590 – far less than the device’s $3,499 market price.