What is Meta’s Interaction SDK? Smarter Controller Gestures for Meta VR

XR Today Team

Streamlining virtual reality user input and navigation

Hand controllers and ease of navigation are among the most crucial aspects of a VR experience. They determine how seamlessly users can navigate the virtual world.

Input quality removes or adds blockers to the immersiveness of VR and, ultimately, can be the make-or-break factor for long-term engagement.

For instance, poor hand-based control and navigation could deter users from trying advanced collaboration apps and games in VR, leaving them limited to content consumption alone.

In a recent survey, 12.5% of VR users said they were not confident in the intuitiveness of VR controls, and another 12.5% reported signs of discomfort and nausea.

That’s why Meta is working to streamline VR gestures and controls for its next generation of immersive reality devices.

It has announced that it will roll out Interaction SDK, which will equip developers with the tools necessary to provide an intuitive, standardised, consistent, and extensible gesture-based control experience.

Meta unveiled Interaction SDK on October 28 as part of a slew of reveals at Connect 2021. Here’s all you need to know about Meta’s Interaction SDK and the surrounding ecosystem.

Defining Interaction SDK

Interaction SDK is a VR interaction framework that includes pre-built hand- and controller-centric interactions, standardised patterns, data protections, and an integration toolkit to help developers build VR experiences (particularly for gameplay).

Interaction SDK is part of Meta’s newly launched Presence Platform.

Using the Presence Platform, developers will be able to bridge the gap between physical and virtual realities so they can “seamlessly blend virtual content in a user’s physical world.”

In other words, the Presence Platform offers tools for both mixed reality and virtual reality, aiming to create a rich immersive experience that will eventually culminate in the metaverse.

When Facebook rebranded as Meta and announced its intention to pivot towards the metaverse from social media, decentralisation was a top priority.

The Presence Platform invites independent developers and companies to leverage Meta’s core machine perception and AI technologies for VR apps.

The Presence Platform includes three software development kits to help jumpstart VR development, and Interaction SDK is one of its main components.

What Are the Key Features of Interaction SDK?

The Interaction SDK empowers developers with the following capabilities:

  • Ready-to-use interaction components – Essentially, the SDK is a set of prebuilt interaction components that are ready for use. Meta has announced that the first release will include interaction components like grab, poke, target, and select, and this list will grow with future releases.
  • Custom gesture support – An important feature of the SDK is that it goes beyond prebuilt interactions and templates. The kit has all the tooling you would need to build your own unique gestures from scratch, further extending the SDK’s capabilities.
  • Standardised interaction patterns – Meta has mentioned in the announcement that standardised interaction patterns will be included in the SDK. In other words, you may use each interaction component on a standalone basis, or together through pre-set patterns. For instance, target and grab can be used together to pick up objects in VR gameplay, while target and select could be a handy gesture when collaborating in VR.
  • Protected by Meta’s data privacy rules – Facebook had already defined a strict set of privacy guidelines for hand-tracking capabilities in Quest, and the same principles apply to the new Interaction SDK. Meta’s software analyses hand images in real time during VR interactions, but the company uses a generic hand model so that the data is anonymised. Images or data estimates specific to the user’s hands aren’t stored on Meta’s servers, and developers can use this data only for hand-tracking and nothing else.
  • Integration-ready with other technologies – Interaction SDK supports a great deal of integration flexibility. You can pick and choose specific components, use one or all of them, or even build your own. You may also integrate the SDK with other interaction frameworks from VR, gaming technology, and related providers.
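The SDK itself ships as a Unity library, but the core idea above (standalone interaction components that compose into pre-set patterns such as "target + grab") can be sketched conceptually. The sketch below is illustrative Python only: the `VRObject` class, the `target`, `grab`, and `select` functions, and `apply_pattern` are all hypothetical stand-ins, not Meta's actual API.

```python
from dataclasses import dataclass

@dataclass
class VRObject:
    """Hypothetical stand-in for an interactable object in a scene."""
    name: str
    targeted: bool = False
    held: bool = False
    selected: bool = False

# Each component works on a standalone basis...
def target(obj: VRObject) -> VRObject:
    obj.targeted = True
    return obj

def grab(obj: VRObject) -> VRObject:
    # A grab only succeeds on an object the hand is currently targeting.
    if obj.targeted:
        obj.held = True
    return obj

def select(obj: VRObject) -> VRObject:
    if obj.targeted:
        obj.selected = True
    return obj

# ...or components can be chained into a pre-set interaction pattern.
def apply_pattern(obj: VRObject, *components) -> VRObject:
    for component in components:
        component(obj)
    return obj

# "target + grab" picks up an object; "target + select" highlights one.
cube = apply_pattern(VRObject("cube"), target, grab)
print(cube.held)  # True
```

The composition is what makes the standardised patterns extensible: a custom gesture is just another function with the same shape, so it slots into `apply_pattern` alongside the built-ins.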

Benefits of Using Interaction SDK

While there are other interaction frameworks to choose from, and building from scratch is always an option, there are several reasons to opt for this SDK. You can:

1. Freely build your gesture combinations

As mentioned, developers enjoy an enormous amount of flexibility when choosing how they use the SDK’s different interaction components. You can come up with your own combinations and create brand new gestures and interaction patterns.

There are standardised patterns to save time, and you could use one gesture component from Meta together with another from a different framework.

The tooling also allows you to customise interaction components, opening up a vast number of combinations.

2. Save effort when modelling computer vision-based gestures

Meta’s Interaction SDK packages the company’s significant advancements in machine perception and AI into a convenient bundle.

This provides developers with a headstart when launching new VR experiences. As the “grunt work” is mostly done, you can focus on the user experience, gamification, and interaction design.

3. Save time by avoiding regressions

Interaction SDK will be instrumental in reducing technical debt as you ship new versions of your VR app.

Meta has announced that the SDK is designed to prevent regressions as hand- and controller-centric interaction technology evolves over time.

This will go a long way in improving the business sustainability of VR apps, both gaming and non-gaming.

3 New Software Development Kits to Use with Interaction SDK

Along with Interaction SDK, Meta has launched three associated software development kits for VR initiatives. These are:

  • Insight SDK – A set of tools and templates to enable a realistic sense of presence in MR.
  • Tracked Keyboard SDK – A trackable virtual keyboard that lets you type in VR.
  • Voice SDK (experimental) – A set of natural language capabilities for hands-free MR and VR navigation.

When Can You Start Using Interaction SDK?

Interaction SDK is part of the Presence Platform and will be launched in its entirety next year, complete with documentation.

Meta has also announced that the Unity library for Interaction SDK will be available by early 2022.

