The Apple Vision Pro headset is coming, adding a new device to an Apple ecosystem that has repeatedly upended the technology world with culture-changing products.
Apple looks to repeat that paradigm-shifting success with its immersive headset and to integrate the Vision Pro into its broader product portfolio.
Most recently, rumours suggest that Apple will introduce AR features in the next iteration of its AirTag product – a device that helps individuals locate misplaced items.
Trusted Apple analyst Ming-Chi Kuo noted that the AirTag 2 item tracker will debut in the fourth quarter of 2024, and that its launch will bring compatibility with the Vision Pro, which is also due next year.
Kuo explained:
I believe that spatial computing is a new ecosystem that Apple wants to build, using Vision Pro as the core to integrate other devices, including AirTag 2.
Moreover, reports from last year highlighted code that Apple added to iOS 13, suggesting that users will eventually be able to find AirTags via AR visualizations.
An AirTag 2 with AR location features could integrate easily into the Vision Pro’s MR framework. However, the AR features may also work in tandem with iPhones, as modern iPhone cameras support AR experiences. This could mean iPhone users locate their AirTag 2 devices via AR overlays shown through the device’s camera, along the lines of the rough sketch below.
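To make the idea concrete, here is a minimal, hypothetical Swift sketch of an iPhone AR view that marks an item’s estimated position with a virtual sphere. It is not Apple’s Find My implementation: the class name and the hard-coded item position are invented for illustration, and a real app would feed the position from a ranging source such as Ultra Wideband.

```swift
import ARKit
import RealityKit
import UIKit

// Hypothetical sketch: overlaying a marker for a tracked item in an iPhone
// AR session. Requires a physical device with an A-series chip and camera.
final class ItemFinderViewController: UIViewController {
    private let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Standard world-tracking AR session using the iPhone camera.
        let config = ARWorldTrackingConfiguration()
        arView.session.run(config)

        // Assumed item position in metres relative to the session origin
        // (in practice this would come from UWB ranging or similar).
        let itemPosition = SIMD3<Float>(0.5, 0.0, -1.5)

        // Place a simple sphere marker where the item is believed to be.
        let marker = ModelEntity(
            mesh: .generateSphere(radius: 0.05),
            materials: [SimpleMaterial(color: .orange, isMetallic: false)]
        )
        let anchor = AnchorEntity(world: itemPosition)
        anchor.addChild(marker)
        arView.scene.addAnchor(anchor)
    }
}
```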
Details are slim. However, Apple appears to be ramping up its product ecosystem to work alongside its upcoming Vision Pro device. For example, the Vision Pro will work alongside MacBook, Apple TV, and iPhone – notably allowing users to extend physical screens into virtual ones to meet productivity goals.
More on Vision Pro
Apple’s Vision Pro headset uses a three-layered approach to spatial computing: “Windows” provide the familiar 2D user interface, “Volumes” host RT3D immersive content, and “Spaces” create the spatial computing environment in which Volumes and MR applications exist. This three-layered approach gives users an accessible, easy-to-use, and flexible spatial computing interface, and maps directly onto how developers declare apps, as sketched below.
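As a rough illustration, the following minimal visionOS sketch declares all three layers with SwiftUI scene types. The app name and scene identifiers are invented for this example, and the content is placeholder only.

```swift
import SwiftUI
import RealityKit

// Minimal, illustrative visionOS app declaring a Window, a Volume, and a Space.
@main
struct SpatialLayersDemoApp: App {
    var body: some Scene {
        // "Window": a conventional 2D SwiftUI interface.
        WindowGroup(id: "main") {
            Text("Hello, spatial computing")
        }

        // "Volume": a bounded container for RT3D content.
        WindowGroup(id: "globeVolume") {
            RealityView { content in
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.2),
                    materials: [SimpleMaterial(color: .blue, isMetallic: false)]
                )
                content.add(sphere)
            }
        }
        .windowStyle(.volumetric)

        // "Space": an immersive environment that can surround the user,
        // here using mixed immersion so virtual content blends with passthrough.
        ImmersiveSpace(id: "immersive") {
            RealityView { content in
                // Full-scale scene content goes here.
            }
        }
        .immersionStyle(selection: .constant(.mixed), in: .mixed)
    }
}
```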
Apple’s Vision Pro contains competitive hardware that powers its three-tier spatial computing structure. The device pairs a custom M2 chip with Apple’s purpose-built R1 chip, which processes input from the headset’s cameras and sensors in real time. Its roughly 23-million-pixel display is spread across two micro-OLED panels for a total resolution of 4096 x 5464 pixels, with high dynamic range (HDR) and wide colour gamut (WCG) output. The headset also offers around two hours of battery life, a camera system for capturing spatial photos and videos with spatial audio for sharing, synchronization with iPhone, iPad, and Mac, a light seal, a LiDAR scanner, and a TrueDepth camera.