Using XR to Manage and Collaborate Over 2D/3D Data Formats

Kevin O'Donovan and Jennifer Rogers provide expert insight into understanding and defining XR data types


Published: April 29, 2024


Rory Greener

The industrial metaverse and its related technologies, such as digital twins, shared virtual collaboration tools, and smart glasses, are set to revolutionize business operations with immersive tools providing real-time, interactive, persistent, 3D, and synchronous collaboration across the working world.

Moreover, the enterprise XR space is not exclusive to headsets. Spatial/XR data is helping to inform a range of end hardware types, such as robotics. Additionally, vendors and technology innovators are combining technologies such as AI and blockchain to further enhance XR tools.

Advancements in technology are driving the industrial revolution forward, enabling industries to solve increasingly complex problems digitally. This is breaking down traditional barriers and bringing software agility to physical operations.

Spatial data is a valuable asset that extends beyond headsets, encompassing a broad range of end devices with significant implications. In a recent conversation on the Big XR News Show, contributors Kevin O'Donovan, Co-Chair of the Industrial Metaverse & Digital Twin Committee at the VRARA, and Jennifer Rogers, Executive Officer of the Learning Technology Standards Committee at the IEEE, spoke about some of the innovations, challenges, and considerations regarding the sophisticated distribution of XR across industries.

Using XR to Manage and Collaborate Over 2D/3D Data Formats

O'Donovan noted that clients in the enterprise and industrial sectors deal with sensitive data. Moreover, industry data also moves between 2D and 3D data sets, from SAP systems to LiDAR scans and 3D CAD/CAM models.

XR can help unify and consolidate this data within a single software or hardware solution. Moreover, XR vendors state that an immersive avenue for interacting with 2D and 3D data sets has various benefits, such as AR remote guidance and increased engagement in VR.

O'Donovan noted:

What we’re now trying to do is, can we bunch all this together somehow? How I consume the content is in some ways irrelevant, whether I’m looking at it on my 2D screen on my phone and I can spin the model around, or I’m on a 2D monitor, a 3D monitor, hologram, VR, AR or whatever. However I’m consuming it, I’m collaborating with people in different places, and we’re all looking at the same data at the same time, and we can make decisions.

XR brings a wow factor that can drive interest in a device and how it could transform a business. However, as XR is still emerging, interacting with the technology still involves a learning curve; enterprise adopters should not get wowed by a device and apply it to an inappropriate use case.

Understanding and Defining XR Data Types

Rogers also explained how the XR space and its industry adopters need “common agreements around metadata and how we’re coding those objects so that we can put them in the right people’s hands and the right way,” further noting how the metaverse label doesn’t just include VR applications but also AR ones as both platforms allow for remote immersive communications and collaboration.

It’s critical as we start to look at how we layer different experiences on top of the real world for people, once again from a workforce enablement perspective.

From an IEEE LTSC perspective, where we look at the learning technology standards, we have some standards around learning object metadata, now called learning resource metadata. But even in that realm, we have to expand beyond that and collaborate with others, because when does an object in the metaverse start and stop being a learning object?

“It is really important from a standards perspective that we’re all speaking the same language,” Rogers also noted. The IEEE representative continued by explaining:

It’s really important that people understand the structures around how we’re capturing this data and that there is transparency around what’s being stored and how it’s being stored because it’s hard to regulate things when there’s no common currency with regard to the ways in which we’re collecting data on the objects and then collecting data on the ways in which we’re enabling people in the workforce, particularly from an industrial metaverse standpoint.
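To make the idea concrete, a shared, transparent metadata record of the kind Rogers describes might look like the following minimal Python sketch. All field names here are illustrative assumptions for this article, not fields from the IEEE learning resource metadata standard:

```python
from dataclasses import dataclass, field, asdict
import json

@dataclass
class XRObjectMetadata:
    """A hypothetical, platform-neutral record for an XR object.

    Field names are illustrative only; a real deployment would follow
    an agreed standard such as IEEE learning resource metadata.
    """
    object_id: str
    title: str
    media_type: str            # e.g. "model/gltf-binary" for a 3D asset
    interaction_mode: str      # e.g. "VR", "AR", or "2D-screen"
    is_learning_object: bool   # whether learning-analytics rules apply
    data_collected: list = field(default_factory=list)  # transparency: what is logged

def to_shared_format(meta: XRObjectMetadata) -> str:
    """Serialise to JSON with sorted keys so every platform reads the same record."""
    return json.dumps(asdict(meta), sort_keys=True)

record = XRObjectMetadata(
    object_id="turbine-001",
    title="Turbine maintenance walkthrough",
    media_type="model/gltf-binary",
    interaction_mode="AR",
    is_learning_object=True,
    data_collected=["gaze-duration", "task-completion"],
)
print(to_shared_format(record))
```

The point of such a record is the one Rogers raises: declaring up front what an object is, how it is consumed, and what data is collected on it, in a format every platform can parse.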

It is important to consider learning object definitions and data collection, because many high-quality enterprise XR assets and objects are triggered by, and interact with, a real business environment operated by a real worker. With such standards in place, the industry can account for those differences when implementing XR services.

Rogers also added:

It’s important that we have a common way of looking at the objects and how people act upon the objects in this space.

For us to provide that seamless experience we’ve been talking about, transparency is key. We have to understand what data is being collected and where it’s being collected, as everyone else has said, and agree upon the standards or format in which we’re collecting that data.

Moreover, as XR leaders come together to define and standardise the technology and its business applications, measuring outcomes is key to understanding “how are these 3D collaboration tools better or different than the 2D collaboration tools that we use today?” Rogers remarked.

“To answer that question,” Rogers noted, it is important to analyse and measure what a business expects to see regarding “human behaviour in the collaborative environment.” Tracking and measuring these behaviours can help companies examine the efficacy of a particular type of XR interaction.

The reason that we want to link the intent to something that can be measured or tracked is so that we can show the efficacy of the type of experience, and ask questions about which environments XR is appropriate for and which ones, to Kevin’s point, seem cool but aren’t necessarily bringing much value. Until we can define that ROI, it’s going to be difficult for people to jump in and start to utilise these tools. It’s on us to communicate to them which intents are appropriate.

Read more about enterprise XR in an extended chat with our featured XR News Show panellists: Jay Latta, Founder and Speaker, The Fusionists; Amy Peck, Founder and CEO, EndeavorXR; and Letitia Bochud, Director, Virtual Switzerland and Chair of the Board of Directors, XR4Europe.
