Investigating five use cases for AI in the Metaverse
Artificial intelligence (AI) applications are now much more common than you might think. In a recent McKinsey survey, 50% of respondents said that their companies use AI for at least one business function. A Deloitte report found that 40% of enterprises have an organisation-wide AI strategy in place.
In consumer-facing applications too, AI now plays a major role via facial recognition, natural language processing (NLP), faster computing, and all sorts of other under-the-hood processes.
It was only a matter of time until AI was applied to augmented and virtual reality to build smarter immersive worlds.
AI has the potential to parse huge volumes of data at lightning speed to generate insights and drive action. Users can either leverage AI for decision-making (which is the case for most enterprise applications), or link AI with automation for low-touch processes.
The metaverse will use augmented and virtual reality (AR/VR) in combination with artificial intelligence and blockchain to create scalable and accurate virtual worlds.
The metaverse is defined as an expansive virtual space where users can interact with 3D digital objects and 3D virtual avatars of each other in a complex manner that mimics the real world.
The idea of the metaverse was first coined by science fiction writer Neal Stephenson in the early 90s and was eventually developed in parts by companies like Second Life, Decentraland, Microsoft, and most recently Meta (formerly Facebook).
Facebook, in particular, is well recognised for its work in artificial intelligence and sophisticated AI algorithms.
The company’s AI research spans diverse areas like content analysis, self-supervised speech processing, robotic interactions, computer vision, whole-body pose estimation, and much more.
All of this could inform the company’s future direction as Meta and drive the foundations of its own version of the metaverse.
While VR worlds can technically exist without artificial intelligence, the combination of the two unlocks a whole new degree of verisimilitude. This could impact the following five use cases:
Users are at the centre of the metaverse and your avatar’s accuracy will determine the quality of experience for you and other participants. An AI engine can analyse 2D user images or 3D scans to come up with a highly realistic simulated rendition.
It can then plot a variety of facial expressions, emotions, hairstyles, features brought on by aging, etc. to make the avatar more dynamic.
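To make the idea concrete, here is a minimal sketch of how detected emotion scores might be mapped onto an avatar rig's expression parameters. The `Expression` fields, the emotion presets, and the blending scheme are all illustrative assumptions, not any vendor's actual API.

```python
from dataclasses import dataclass

# Hypothetical blendshape weights an avatar rig might expose (0.0 to 1.0).
@dataclass
class Expression:
    brow_raise: float = 0.0
    smile: float = 0.0
    jaw_open: float = 0.0

# Assumed mapping from an emotion classifier's labels to rig presets.
EMOTION_TO_EXPRESSION = {
    "happy":     Expression(brow_raise=0.2, smile=0.9, jaw_open=0.1),
    "surprised": Expression(brow_raise=0.9, smile=0.1, jaw_open=0.7),
    "neutral":   Expression(),
}

def drive_avatar(emotion_scores: dict) -> Expression:
    """Blend rig parameters, weighting each preset by its emotion score."""
    total = sum(emotion_scores.values()) or 1.0
    result = Expression()
    for emotion, score in emotion_scores.items():
        preset = EMOTION_TO_EXPRESSION.get(emotion, Expression())
        w = score / total
        result.brow_raise += w * preset.brow_raise
        result.smile += w * preset.smile
        result.jaw_open += w * preset.jaw_open
    return result

params = drive_avatar({"happy": 0.8, "surprised": 0.2})
print(round(params.smile, 2))  # -> 0.74 (the "happy" preset dominates)
```

In a real pipeline, the emotion scores would come from a trained vision model analysing the user's 2D images or 3D scans; blending lets the avatar move smoothly between expressions rather than snapping between presets.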
Companies like Ready Player Me are already using AI to help build avatars for the metaverse, and Meta is working on its own version of the technology.
Digital humans are 3D versions of chatbots that exist in the metaverse. They aren’t replicas of a real person – rather, they are more like AI-enabled non-player characters (NPCs) in a video game, able to react and respond to your actions in a VR world.
Digital humans are built entirely using AI tech, and are essential to the landscape of the metaverse.
From NPCs in gameplay to automated assistants in VR workplaces, there are myriad applications, and companies like Epic Games (makers of Unreal Engine) and Soul Machines have already invested in this direction.
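At their simplest, such characters can be modelled as a state machine that maps player events to reactions. The sketch below is a toy rule-based stand-in for the learned behaviour models described here; all states, events, and lines are invented for illustration.

```python
# Minimal rule-based NPC reaction table: (current state, player event)
# maps to (next state, spoken line). Real digital humans would replace
# this table with trained language and behaviour models.
RULES = {
    ("idle", "player_approaches"): ("greeting", "Hello! Welcome to the plaza."),
    ("greeting", "player_asks_help"): ("helping", "The gallery is to your left."),
    ("greeting", "player_leaves"): ("idle", "Safe travels!"),
}

def react(state, event):
    """Return the NPC's next state and spoken line for a player event."""
    # Unknown events leave the NPC in its current state with a filler line.
    return RULES.get((state, event), (state, "..."))

state, line = react("idle", "player_approaches")
print(line)  # -> Hello! Welcome to the plaza.
```

The table-driven design makes the NPC's behaviour easy to inspect and extend; swapping the lookup for a model call is what turns a scripted NPC into a digital human.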
One of the primary ways digital humans use AI is for language processing.
Artificial intelligence can break down a natural language like English, convert it into a machine-readable format, perform the analysis, arrive at a response, convert the result back into English, and send it to the user. The entire process takes a fraction of a second – just like a real conversation.
The best part is that the results could be converted into any language the AI has been trained on, so that users from around the world can access the metaverse.
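The parse–analyse–respond round trip described above can be sketched with a toy intent classifier. The patterns, intents, and response templates below are invented for illustration; production systems would use trained language models rather than regular expressions.

```python
import re

# Toy intent patterns -- a stand-in for the "machine-readable format" step.
INTENTS = {
    "greet": re.compile(r"\b(hi|hello|hey)\b", re.I),
    "find":  re.compile(r"\bwhere is (?:the )?(?P<place>[\w ]+)\??", re.I),
}

# Response templates -- the "arrive at a response" step.
RESPONSES = {
    "greet": "Hello! How can I help?",
    "find": "The {place} is straight ahead.",
    "unknown": "Sorry, I didn't catch that.",
}

def respond(utterance):
    """Parse the utterance, classify its intent, fill a response template."""
    for intent, pattern in INTENTS.items():
        match = pattern.search(utterance)
        if match:
            return RESPONSES[intent].format(**match.groupdict())
    return RESPONSES["unknown"]

print(respond("Where is the art gallery?"))  # -> The art gallery is straight ahead.
```

Translating the final template into another language is then a matter of rendering the same structured response through a different language model or template set.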
Here’s where AI really comes into its own. When an AI engine is fed historical data, it learns from previous outputs and tries to generate its own.
The output improves with each iteration, driven by new input, human feedback, and reinforcement learning.
Eventually, the AI will be able to perform the task and provide output almost as well as human beings. Companies like NVIDIA are training AI to create entire virtual worlds.
This breakthrough will be instrumental in driving scalability for the metaverse, as new worlds can be added without the intervention of humans.
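A very small-scale analogue of "learn from examples, then generate new content" is a Markov-chain terrain generator: it counts which tile tends to follow which in a hand-made sample, then samples fresh terrain from those learned frequencies. This is a deliberately simple sketch, far from the generative models companies like NVIDIA actually use.

```python
import random
from collections import Counter, defaultdict

def learn_transitions(sample):
    """Count which tile tends to follow which in an example terrain strip."""
    counts = defaultdict(Counter)
    for a, b in zip(sample, sample[1:]):
        counts[a][b] += 1
    return counts

def generate(counts, start, length, seed=0):
    """Sample a new strip of terrain from the learned frequencies."""
    rng = random.Random(seed)
    tiles = [start]
    for _ in range(length - 1):
        options = counts.get(tiles[-1])
        if not options:
            break
        nxt = rng.choices(list(options), weights=list(options.values()))[0]
        tiles.append(nxt)
    return "".join(tiles)

# '.' = grass, '~' = water, '^' = mountain: a hand-made example strip.
sample = "....~~..^^^..~~....^.."
model = learn_transitions(sample)
print(generate(model, ".", 20))
```

The key property is that the generator was never told any rules about terrain; it inferred them from data, which is exactly what makes this approach scale to producing new worlds without human intervention.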
Finally, AI can also assist in human-computer interactions (HCI). When you put on a sophisticated, AI-enabled VR headset, its sensors will be able to read and predict your electrical and muscular patterns to know exactly how you’d want to move inside the metaverse.
AI can help recreate an authentic sense of touch in VR. It can also aid in voice-enabled navigation, so you can interact with virtual objects without having to use hand controllers.
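The decoding step — turning raw sensor readings into a movement intent — can be illustrated with a toy threshold classifier. Real electromyography (EMG) decoding relies on trained models; the thresholds and gesture names here are invented.

```python
# Toy classifier turning a wrist-sensor amplitude window into an intent.
# Real EMG decoding uses trained models; these thresholds are illustrative.
def classify_window(samples):
    """Map the mean amplitude of a short sensor window to a gesture."""
    mean = sum(samples) / len(samples)
    if mean < 0.2:
        return "rest"
    if mean < 0.6:
        return "pinch"   # e.g. select a virtual object
    return "grab"        # e.g. pick the object up

stream = [0.05, 0.1, 0.4, 0.5, 0.8, 0.9]  # simulated amplitude readings
window = stream[-3:]                       # latest three samples
print(classify_window(window))  # mean is about 0.73 -> "grab"
```

Windowing the stream rather than reacting to single samples is what makes the prediction stable enough to feel like the headset "knows" how you want to move.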
It is important to keep in mind that the metaverse is a new area of research and operation, and AI implementation could run into issues – for example, questions around data privacy, algorithmic bias, and the ownership of AI-generated content.
Ultimately, without AI, it will be difficult to create an engaging, authentic, and scalable metaverse experience. That’s why companies like Meta are working closely with think tanks and ethics groups to stem the risks of AI without curbing the technology’s potential.