Big XR News from Sony, Microsoft, HTC VIVE, and NVIDIA

AWE 2024 proves to be a trove of announcements, as the XR ecosystem grows


Published: June 21, 2024

Rory Greener

Following the hotly anticipated Apple WWDC showcase last week, AWE 2024 came hot on its heels to inspire interest and investment in the AR/VR/MR and spatial computing space.

The event brought countless announcements. Meanwhile, beyond the show floor, the wider XR world kept moving, with innovative firms showcasing how they leverage immersive technology to gain new insights and results.

More events are coming as the year progresses. As the second half of 2024 rolls in, the market will inevitably change, a fact further cemented by the increasing number of firms entering the XR space.

Sony Provides Spatial Computing without a Headset Requirement

This week, Sony Electronics introduced new 3D visualization tools that allow users to see and interact with extended reality content without using smart glasses. These tools are made possible through the ELF-SR2 (27-inch) and ELF-SR1 (15.6-inch) Spatial Reality Displays, which are designed to support professional real-time 3D (RT3D) workflows.

The new display technology lets users avoid intrusive VR headsets, enabling extended XR content use without the perceived hardware challenges. Users can now collaborate on RT3D workflows using Sony’s enhanced display solution, which incorporates content creation, design, and visualization capabilities.

Furthermore, Sony has integrated its new spatial displays with the echo3D service, which enhances the user experience with a 3D digital asset management platform and dedicated social applications. This integration provides professional users with a secure platform to manage, store, control, optimize, and share 3D content. For example, the echo3D integration can help users seamlessly import 3D assets and prepare them for various workflows, including storage, delivery, and analysis.

Microsoft Mesh to Gain Mac Support, AI Integrations, and Content Sharing Features

At AWE 2024 this week, Microsoft announced new avatar integrations and usability enhancements for its recently launched Microsoft Mesh platform. The new features are designed to optimize enterprise-facing avatar content creation on the Teams remote communication platform.

To improve virtual collaboration on Teams and Mesh, Microsoft will include new AI integrations that optimize and refine professional avatar creation. Feedback from Mesh adopters revealed an underwhelming response to the current avatar system and its ability to reflect a user’s professional image. However, with the AI integration, Mesh can now scan a photo of a user and appropriately design an avatar based on their features and clothing. The AI algorithms can virtually replicate a user’s face shape, hair, eyewear, and facial hair, ready for workers to refine to suit their intended look perfectly.

Moreover, Microsoft plans to introduce new emoticons, giving workers a simple and uniform way to engage with a digital meeting via a reactions panel. To improve avatar-based communications overall, Mesh will soon gain improved content-sharing features that enhance the functionality of external files, such as a PowerPoint presentation, within an immersive space. This includes new user camera angles, allowing workers to observe the immersive space from first-person, third-person, and wide-angle viewpoints.

To enhance virtual events on Mesh, Microsoft is introducing new features, including improvements to Mesh’s event customization abilities that allow users to create bespoke immersive workspaces easily. Microsoft is also adding turnkey event templates that give Mesh users simple, customizable support for large-scale immersive events.

Finally, Microsoft revealed that the new and improved Mesh platform will debut for Mac in late June. Mesh-ready workforces can collaborate via various end devices, from a PC to a Mac to a Meta Quest headset.

82% of Financial Sector Businesses to Adopt XR in the Next 5 Years, HTC VIVE Report Finds

HTC VIVE’s report highlights how leading firms in the financial services industry are increasingly using XR technology and seeing positive ROI over a two-year period. The independently conducted “XR Applications in the Financial Industry” survey questioned around 400 financial services professionals. The report shows that roughly 92 per cent of financial professionals have invested in XR with favourable ROI. Additionally, 82 per cent of those surveyed believe their firms will adopt XR technology in the next five years.

The report also reveals that 84 per cent of financial professionals believe integrating XR into training programs has positively impacted their workforce’s skill development. Furthermore, 80 per cent of those surveyed believe immersive technology increases general operational efficiency.

On the customer experience side, 77 per cent of the professionals believe that XR is improving the buyer’s journey. Marketing firms increasingly use XR as an engaging and fruitful opportunity to reach users. At AWE, Draw & Code and 302 Interactive are introducing a new immersive solution known as FanPort, an MR brand engagement platform, highlighting the increasing investment in market-facing XR solutions.

Also, 59 per cent of the surveyed professionals believe that XR technology’s ability to visualize data as 3D renders aids comprehension, decision-making, and client interactions. The same proportion of financial professionals leveraging XR in this manner also state that it increases engagement and interaction during financial education programs.

NVIDIA Uses Innovative Displays and AI to Solve XR Headset Form Factor Hurdles

NVIDIA recently worked with the Stanford Computational Imaging Group, led by Professor Gordon Wetzstein, to develop smaller, lighter XR devices.

NVIDIA observed that current XR headset designs rely on bulky optics, displays, and head straps, creating a social barrier hindering XR collaboration and workplace usage. Over the past few years, NVIDIA’s research teams have focused on solving these hardware-based challenges by creating smaller prototype devices.

As a result of their research, NVIDIA has developed a pair of prototype XR glasses that eliminate the need for larger devices commonly seen in VR headsets. These design choices are closer to AR smart glasses but with higher computing and display power. NVIDIA and the Stanford team achieved this by leveraging holographic near-eye displays, enabling the creation of a device that can display VR content on lenses just 2.5 mm thick.

Furthermore, the team used AI to enhance the XR display system and achieve its design goals. By employing an AI algorithm, NVIDIA improved the performance of lightweight lenses by reducing the interference of light sources. NVIDIA claims that its AI integration makes “practical holographic displays feasible” by reducing the size of components beyond traditional expectations.
