Qualcomm AI Research to Boost Virtual and Augmented Reality Gesture Understanding

New datasets from the leading XR chipset manufacturer enable new opportunities for human-computer interaction in VR/AR solutions


Published: February 11, 2025


Rory Greener

Recently, Qualcomm AI Research made several strides in advancing the evolution of XR by publishing new AI datasets to improve human-computer understanding across various business use cases.

Specifically, the Qualcomm AI subgroup published various research datasets on common emerging technology use cases for businesses, including virtual and augmented reality applications.

Moreover, the published VR/AR AI datasets cover various related emerging use cases, including industrial IoT, robotics, healthcare, and assistive technology.

These datasets help advance how XR devices and solutions recognize gestures, speech, and images. Notably, this can improve features seen in virtual reality applications, such as tracking user movements, and how AR smart glasses recognize a user’s surroundings.

Qualcomm notes that these datasets can help train machine learning and artificial intelligence algorithms, enabling such features in VR/AR products, as well as in other emerging technologies such as robotics and smart home products.

How Does Qualcomm AI Research Boost XR?

Qualcomm AI Research models look to solve various challenges in advancing how technology understands its users, not just environmental computer understanding for AR/VR solutions.

AR/VR-related models include AirLetters, a dataset for assisting AI in identifying and classifying articulated motions, such as humans drawing letters and digits in the air.

Similarly, Qualcomm’s Jester Dataset helps computers recognize single-frame gestures, using a thumbs-up as an example of the type of gesture the dataset can assist a solution in understanding.

The Something-Something v2 Dataset builds upon the Jester framework and aims to train machine-learning models to understand more complex human hand gestures.

Moreover, the Keyword Speech Dataset helps mobile and home devices understand keywords. Together with gesture understanding, these datasets could find a home in smart glasses or headset devices, many of which come with voice input and gesture recognition technology. Many also ship with Qualcomm-branded, XR-ready chipsets, a natural pairing.
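To make the idea of "training on a labeled gesture dataset" concrete, here is a minimal, hypothetical sketch in Python. It is not Qualcomm tooling and does not use the real Jester data: the gesture labels, the 4-dimensional "landmark" prototypes, and the nearest-centroid rule are all invented for illustration of how labeled samples become a simple classifier.

```python
# Hypothetical sketch: fitting a tiny nearest-centroid gesture classifier
# on synthetic labeled feature vectors, standing in for the kind of
# labeled examples a dataset like Jester provides as video clips.
import numpy as np

rng = np.random.default_rng(0)

def make_samples(center, n=50, noise=0.1):
    """Generate noisy feature vectors scattered around a class prototype."""
    return center + rng.normal(0.0, noise, size=(n, len(center)))

# Two invented gesture classes with made-up 4-D "landmark" prototypes.
prototypes = {
    "thumbs_up": np.array([1.0, 0.2, 0.1, 0.9]),
    "thumbs_down": np.array([0.1, 1.0, 0.9, 0.2]),
}
train = {label: make_samples(c) for label, c in prototypes.items()}

# "Training" here is just averaging each class's samples into a centroid.
centroids = {label: x.mean(axis=0) for label, x in train.items()}

def classify(vec):
    """Return the label whose centroid is closest to the input vector."""
    return min(centroids, key=lambda lbl: np.linalg.norm(centroids[lbl] - vec))

print(classify(np.array([0.95, 0.25, 0.15, 0.85])))
```

A feature vector near the thumbs-up prototype is classified as `thumbs_up`. Production gesture recognition would instead train a deep video model on the full dataset, but the pipeline shape, labeled examples in and a decision rule out, is the same.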

It should also be noted that other Qualcomm AI Research models greatly assist in the ongoing development of robotics, an area of parallel and supportive growth for XR, as RT3D immersive hardware becomes increasingly commonplace in controlling and deploying robotic solutions through concepts like telepresence.

Developers are resourceful, and their use of tools such as Qualcomm AI Research datasets will only create new opportunities for the XR market and its end users.

