AR is one of 2024’s most significant XR trends, and that looks unlikely to change heading into the new year.
As the years pass and XR technology progresses, so will AR visualisations. Advancements in displays, ecosystems, and frameworks will enable AR services to fit more naturally into workplace settings and everyday life.
Today, users can experience AR services via a smartphone or tablet thanks to advanced built-in display, tracking, and camera technology.
In the future, AR smart glasses may well become commonplace as a more natural hardware solution for interacting with 3D spatial content. Such glasses already exist today from vendors like Vuzix, XREAL, Meta/Ray-Ban, Rokid, Xiaomi, INMO, and countless other innovators, and this hardware market will only improve as the decade continues.
To discuss some of the key trends behind augmented display technology, XR Today spoke with David Weinstein, director of Virtual Reality and Augmented Reality at NVIDIA, Stefan Baumgart, Director of Product Management at TeamViewer, Kevin Joyce, CEO of Tiny Brains, as well as spokespeople from HP and Vuzix.
Advancements in 2024
Joyce
While many may argue that the Apple Vision Pro (AVP) is the biggest innovation in AR (which is AR, regardless of Apple trying to coin it as ‘spatial computing’!), personally, I feel that the real movement has been made elsewhere. New hardware is great – AVP and the XREAL Air 2 Ultra being particular standouts – but hardware is nothing without the software to support it. And that software can’t be created without new toolsets.
Since 2018, both Google and Apple have been pushing AR at a somewhat glacial pace. ARCore and ARKit were great innovations in their time, but new features saw slow rollouts.
Here in 2024, however, there’s now a wealth of new tools to help developers innovate within AR software, offering access to more sensors and processing power than ever before.
These new features can enhance the accuracy of AR, resulting in more realistic and/or immersive experiences.
Then there’s the likes of 8th Wall and STYLY, offering access to AR technologies with minimal coding expertise required.
I’ve often said that low- and no-code solutions are important for immersive media, and AR certainly has a wealth of options available in this regard.
HP
In 2024, the landscape of augmented display technology saw several significant innovations, making it challenging to pinpoint just one or two as the most important. However, some key trends stood out.
On the hardware side, there was a notable focus on making augmented displays more suitable for daily wear.
This effort to improve the form factor is a significant achievement, as it enhances the practicality and usability of these devices for everyday use.
Vuzix
There are a number of technical factors driving the mainstream adoption of wearable displays and waveguides in particular, ranging from the physical limitations of various material substrates to more human-centered social factors that impact our overall willingness to wear a display.
In this latter category, Vuzix made significant strides towards industry adoption with two major innovations in 2024. The first is our ability to create waveguides with integrated prescription lenses.
When it comes to head-worn wearables, weight is a primary factor, and separate prescription lenses can add a significant burden. Integrated vision correction lenses in our ultra-thin waveguides solve the weight issue and enable system efficiencies that go well beyond comfort.
Ecosystems and AI Integration in AR Displays
HP
On the ecosystem side, the integration of AI with extended reality (XR) marked a major advancement. This merger enables more engaging and contextually relevant experiences for users. A prime example of this is HP’s initiative to bring AI capabilities within its xRServices.
This AI-enhanced mixed-reality service environment aims to elevate the entire customer journey with AI-assisted processes such as ramp-up, production, diagnostics, issue resolution, and system availability.
The initial rollout of these capabilities is expected in 2025, promising to maximize overall performance and enhance user experiences as we mature the solution.
Weinstein
As developers and hardware manufacturers lay the groundwork of the AR ecosystem, it has become increasingly evident that users desire AR interactions that mimic real-world interactions – communicating through speech and gestures, as well as grabbing and manipulating objects.
Furthermore, given that AR devices will often be used in the real world, it is crucial that these devices are capable of sensing and, to some extent, understanding the world around the user – identifying people, objects, locations, and environments.
The core functionality for enabling these modes of interaction and context-understanding will come from Artificial Intelligence (AI). AI functions, such as speech-to-text, text-to-speech, real-time translation and transcription, and object recognition, are already available to AR developers and end-users via NVIDIA Inference Microservices (NIM).
Today, we’re seeing virtual assistants that can understand and respond to natural language questions and commands; we see training use cases where AI assistants guide specialists through complex tasks, resulting in deeper understanding and better outcomes. The AI powering these experiences is built on large language models (LLMs) and visual language models (VLMs) that are typically many gigabytes in size.
Bringing together lightweight AR devices with large AI models is made possible through inference running on edge servers.
NIM microservices are available from the major Cloud Service Providers (CSPs), making it easy for AR developers to leverage voice, gesture, and more into their AR applications. With ubiquitous access and limitless AI compute from the edge, the sky’s the limit for how AI will enhance AR.
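NIM services generally expose an OpenAI-compatible chat-completion API, so wiring a voice assistant’s transcribed utterance into an LLM running on an edge or cloud server amounts to posting a standard JSON payload. The sketch below shows roughly what that payload looks like; the endpoint URL, model ID, and system prompt are illustrative assumptions, not details from the article or any specific NIM deployment.

```python
import json

# Illustrative endpoint: real NIM deployments expose OpenAI-compatible
# URLs that depend on where the microservice is hosted.
NIM_ENDPOINT = "https://integrate.api.nvidia.com/v1/chat/completions"

def build_nim_request(user_utterance: str,
                      model: str = "meta/llama3-8b-instruct") -> dict:
    """Build a chat-completion payload for a NIM-style service.

    The model ID is a placeholder; an AR app would pass whatever
    model its edge server actually serves.
    """
    return {
        "model": model,
        "messages": [
            # System prompt framing the assistant's role in the AR app.
            {"role": "system",
             "content": "You are a voice assistant embedded in an AR headset."},
            # The user's spoken question, already converted by speech-to-text.
            {"role": "user", "content": user_utterance},
        ],
        "max_tokens": 256,
        "stream": False,
    }

payload = build_nim_request("What machine am I looking at?")
print(json.dumps(payload, indent=2))
```

In a real application this payload would be POSTed to the NIM endpoint with an API key, and the response streamed back to the headset for text-to-speech playback.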
Business Hardware Optimisations
Vuzix
In addition, Vuzix unveiled its Incognito technology, which virtually eliminates forward-facing external eye glow. Current systems leak forward light, making the presence of a waveguide unmistakable and distracting to others.
In highly sensitive applications, this glow can present a security risk and endanger lives. Vuzix Incognito overcomes this critical barrier, rendering the use of a waveguide discreet and virtually invisible to others.
Both of these solutions can be produced for OEM clients in high volume and at a competitive cost by Vuzix in our USA-based waveguide facility.
Baumgart
In 2024, a significant advancement in augmented display technology is the integration of smart glasses with specialized external devices, enhancing real-time problem-solving for frontline workers in a more immersive manner.
By incorporating accessories such as thermal imaging modules and inspection tools, workers can access essential data and diagnostics on-site, which boosts situational awareness and operational efficiency.
This trend is especially vital in operational technology (OT), where swiftly connecting frontline workers to remote experts is crucial for minimizing downtime.
The capacity to provide immediate visual feedback allows these AR systems to lessen the reliance on equipment shipments or specialist visits, ensuring quicker resolutions for complex issues.
Furthermore, the expanding rollout of 5G and deeper IoT integration amplify the real-time capabilities of these devices.
The synergy of wearable technology and external devices significantly enhances productivity, especially in sectors like manufacturing and field services, addressing operational challenges with hands-free, data-driven solutions. This represents a pivotal advancement in utilizing AR to tackle real-world problems within demanding environments.