Just a week after Meta announced a range of education partnerships following the release of its XR teaching solution, the firm is yet again highlighting a series of alliances with researchers at learning institutions to push XR technologies in new and interesting ways.
The most recent partnership sees Meta work with a shortlist of universities to promote research around Project Aria, an in-development, AI-powered AR smart glasses device that also aligns with the Meta Orion project.
Like the Orion device, Project Aria aims to further research into everyday AR wearables and AR-data-driven services. Meta’s current to-market AR product is its Ray-Ban smart glasses line. However, that product’s AR visualisation and AI integration are relatively basic compared to what Meta promises, with its future-looking prototype devices and frameworks offering deeper technology integrations and BCI features.
The Aria framework appears to focus on how AR environmental data can be applied to use cases in robotics, healthcare, and automotive safety. The recent university partnerships continue this company-wide research by pairing Meta with the research divisions of world-leading institutions, which leverage the Aria Research Kit (ARK) to uncover new forms of hands-free XR communication.
Meta to Assist with Paradigm-Shifting Research
Firstly, the University of Bristol is working with the ARK to advance its internal work on Ego-Exo4D, a project that uses Aria sensors to create 3D models of real-world spaces.
Ego-Exo4D looks to leverage Meta smart glasses to capture the ways in which highly skilled individuals, such as sportspersons or chefs, interact with a space, and to feed that environmental data into a robotics framework.
Dima Damen, Ph.D., professor of Computer Vision at the University of Bristol, explained:
I started the research lab focusing purely on understanding behavior from wearable cameras, from egocentric vision, and technology that has multiple sensors, like Aria, enables us to do deeper perception. Now, footage from egocentric cameras is becoming the norm for what we call pre-training, or preparing, devices. The robot understands by watching lots of recordings; it can then replicate the action successfully.
Damen noted that Aria “gives an array of sensors like gaze, for example; it enables you to know where the person is in an environment.” The researchers can then model this human-centric data to create new dynamics for robotics projects and more.
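Meta has not published the Bristol pipeline, but the core idea of pairing egocentric footage with gaze is easy to sketch. The Python snippet below is purely illustrative, with hypothetical array shapes and gaze format rather than the actual ARK API; it crops a gaze-centred patch from each frame, the kind of human-attention signal a robot policy could be pretrained on.

```python
import numpy as np

def gaze_centered_crops(frames, gaze_points, patch_size=128):
    """Crop a square patch around each per-frame gaze point.

    frames:      (T, H, W, 3) egocentric video frames
    gaze_points: (T, 2) gaze coordinates in pixels, as (x, y)
    Returns an array of shape (T, patch_size, patch_size, 3).
    """
    half = patch_size // 2
    T, H, W, _ = frames.shape
    patches = np.empty((T, patch_size, patch_size, 3), dtype=frames.dtype)
    for t, (x, y) in enumerate(gaze_points.astype(int)):
        # Clamp the window so the crop always stays inside the frame.
        x = np.clip(x, half, W - half)
        y = np.clip(y, half, H - half)
        patches[t] = frames[t, y - half:y + half, x - half:x + half]
    return patches

# Hypothetical usage: pair each attention patch with an action label
# ("whisk", "dice", ...) so a robot policy can be pretrained on what
# the skilled person was looking at while performing the action.
frames = np.zeros((100, 480, 640, 3), dtype=np.uint8)  # placeholder video
gaze = np.random.uniform([0, 0], [640, 480], size=(100, 2))
print(gaze_centered_crops(frames, gaze).shape)  # (100, 128, 128, 3)
```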
Meanwhile, the University of Iowa is leveraging the Aria framework to create healthcare services for people with hearing loss. Using the aforementioned Aria environmental data, Iowa researchers can create a device that indicates where a sound is coming from.
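Iowa’s method is not detailed in the announcement, but sound-direction estimation is classically done by comparing arrival times across microphones. The sketch below is a minimal, hypothetical Python example assuming a simple two-microphone array with known spacing, not Aria’s actual sensor layout.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second at room temperature

def direction_of_arrival(left, right, sample_rate, mic_spacing):
    """Estimate a sound source's angle from two microphone signals.

    Uses the time difference of arrival (TDOA): the lag that maximises
    the cross-correlation between the channels. Returns degrees, where
    0 is straight ahead and negative values are to the array's left.
    """
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)  # negative: left mic heard it first
    tdoa = lag / sample_rate                  # seconds
    # sin(angle) = path difference / mic spacing; clip for numerical safety.
    sin_theta = np.clip(tdoa * SPEED_OF_SOUND / mic_spacing, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))

# Synthetic test: delay the right channel by 5 samples to mimic a
# source on the left side of the array.
rate, spacing = 48_000, 0.14
signal = np.random.randn(4800)
right = np.concatenate([np.zeros(5), signal[:-5]])
print(f"{direction_of_arrival(signal, right, rate, spacing):.1f} degrees")
```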
The ARK framework also helps IIIT Hyderabad researchers develop the Driver Intent Prediction Project. The project leverages computer vision for automotive accident prevention; by using Aria gaze data (as in the Bristol research), the IIIT Hyderabad researchers can understand where drivers are looking and use that awareness to help prevent motor accidents.
Avijit Dasgupta, a PhD student in Computer Vision at IIIT Hyderabad, added:
With the help of the Aria glasses, we can develop a video understanding model that can help us in avoiding accidents. The main advantage of Aria is the eye gaze data. From eye gaze data, you can map the scene to where the driver is looking, whether they are paying attention to the cars in front of them or looking at the mirrors while taking a turn.
Dasgupta also noted that the Aria kit provides audio feedback; “that’s how all the sensors can work together,” making data collection “very easy.”
“That has significantly improved the speed of our work; every year, millions die in accidents. If we can build an efficient model for this, we can save millions of lives,” Dasgupta remarked.
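As a rough illustration of the gaze-to-scene mapping Dasgupta describes, the hypothetical sketch below checks whether a projected gaze point falls inside any detected object’s bounding box. The box format, labels, and noise margin are assumptions for illustration, not the IIIT Hyderabad team’s actual model.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box of a detected object, in pixels."""
    label: str
    x1: float
    y1: float
    x2: float
    y2: float

def attended_object(gaze_xy, detections, margin=20.0):
    """Return the detected object the driver's gaze falls on, if any.

    gaze_xy:    (x, y) gaze point projected into the scene camera frame
    detections: list of Box from an upstream object detector
    margin:     pixels of slack around each box for gaze-tracker noise
    """
    gx, gy = gaze_xy
    for box in detections:
        if (box.x1 - margin <= gx <= box.x2 + margin
                and box.y1 - margin <= gy <= box.y2 + margin):
            return box
    return None

# Hypothetical frame: the driver glances at the mirror while a car
# is ahead, and the gaze point lands inside the mirror's box.
detections = [
    Box("car_ahead", 500, 300, 700, 450),
    Box("left_mirror", 60, 200, 160, 280),
]
hit = attended_object((110, 240), detections)
print("attending to:", hit.label if hit else "nothing detected")
# A downstream model could flag frames where the gaze never reaches
# "car_ahead" shortly before, say, a braking event.
```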
Additionally, Meta is working with Carnegie Mellon University’s Robotics Institute to assist in the development of NavCog, a healthcare application for smartphones that provides audio wayfinding for visually impaired individuals.
The collaboration enables the researchers to reduce the application’s dependence on Bluetooth beacons and leverage Aria frameworks to adapt to new environments more efficiently.
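The specifics of that collaboration are not public, but the design choice it implies, visual localization first with Bluetooth beacons as a fallback, can be sketched in a few lines. Everything below, types and confidence threshold included, is hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PoseEstimate:
    x: float           # metres, in the building's reference frame
    y: float
    confidence: float  # 0.0 to 1.0

def fused_position(visual: Optional[PoseEstimate],
                   beacon: Optional[PoseEstimate],
                   min_conf: float = 0.6) -> Optional[PoseEstimate]:
    """Prefer visual-inertial localization; fall back to Bluetooth beacons.

    Rationale: camera-based tracking works in spaces with no beacon
    infrastructure, so beacons are kept only as a safety net.
    """
    if visual is not None and visual.confidence >= min_conf:
        return visual
    return beacon

# Example: a confident visual fix wins over a noisy beacon fix.
vis = PoseEstimate(12.4, 3.1, confidence=0.85)
ble = PoseEstimate(11.0, 2.0, confidence=0.40)
pos = fused_position(vis, ble)
print(f"navigate from ({pos.x}, {pos.y})")
```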
The international university research partnerships come as Meta looks to boost the transferability of AR data to new use cases beyond smart glasses hardware.
Notably, Meta adds that Aria has applications across research topics such as embodied AI, contextualized AI, human-computer interaction (HCI), and robotics.
The Aria project is still early in its lifespan, with the official website in its infancy despite Meta announcing Aria at Connect 2020. As XR technology gains a greater presence beyond hardware, namely in the application of valuable spatial data, more use cases will emerge.
Meta Brings VR/MR Headsets and the Metaverse to US/UK Education Institutions
Meta has also announced the launch of the Meta for Education beta program, a new product designed for colleges and universities that was first teased in April. This solution aims to provide education-specific XR (extended reality) services to teachers, trainers, and administrators through the Quest portfolio of VR and MR headsets.
The beta version is being rolled out in colleges and universities across the US and the UK, including Arizona State University, Houston Community College, Imperial College London, Miami Dade College, Morehouse College, New Mexico State University, San Diego State University, Savannah College of Art and Design, the University of Glasgow, University of Iowa, University of Leeds, University of Miami, and University of Michigan.
Participating universities will collaborate with Meta by providing feedback on the beta launch to help refine and improve the Meta for Education product before a wider release. This product is intended to support various subjects, such as science, medicine, history, and language arts.
Like the Meta Quest for Business model launched last year, the new educational solution aims to empower teachers, trainers, and administrators with XR applications and management features designed specifically for the academic environment.
The updated management features allow teachers to use multiple Quest devices simultaneously in a classroom setting. Additionally, advanced device management tools eliminate the need for teachers to update each device individually, streamlining the onboarding process.
Interestingly, Meta emphasized that the education package includes Metaverse capabilities, potentially revitalizing its once heavily hyped technology promise. Meta explained how these Metaverse solutions can enhance classrooms through digital field trips and training opportunities.
Meta calls the early results “promising.” Notably, Monica Arés, Executive Director of the Imperial IDEA Lab at Imperial College London, stated that “this moment is greater than any one institution or one company. We need to come together in collaboration across the creators, the developers, educational institutions, research organizations, and tech companies to build this new learning ecosystem, because it’s going to benefit every individual and industry.”
After its initial launch and feedback period, Meta plans to scale the solution in the final weeks of 2024, focusing on education and training applications for its Quest portfolio. This follows the recent launch of the Meta Quest 3S at the Connect event a few months ago, a device designed to serve as an accessible entry point for integrating mixed reality (MR) across various sectors.