This week, Meta announced the rollout of Meta AI across its Ray-Ban Meta AR-lite smart glasses in EU regions, following an initial reveal at Connect 2024.
Meta first debuted the features in the US, Canada, and Australia. However, the latest announcement signals an urgency to bring the AI features to more customers, this time in France, Italy, Ireland, and Spain.
Interestingly, this drive may stem from Ray-Ban parent firm EssilorLuxottica’s recent Q3 2024 earnings call, during which the company noted that the smart glasses are the best-selling product in 60% of Ray-Ban stores across Europe.
EssilorLuxottica CFO Stefano Grassi said at the time:
On the frame side, I would say we continue to see strong demand from Ray-Ban Meta. It’s an overall success story that we see. Just to give you an idea, it’s not just a success in the U.S., where it’s obvious, but it’s also success in — success here in Europe. Just to give you an idea, in 60% of the Ray-Ban stores in Europe, in EMEA, Ray-Ban Meta is the best-seller in those stores. So it’s something that it’s extremely pleasing.
The Meta AI rollout enables EU customers to use voice controls to receive real-time information based on their surroundings. It also means that the Meta AI integration on the consumer smart glasses now supports French, Italian, and Spanish. Meta notes that, following the current EU rollout, it will continue to debut the AI features in more countries across the region.
Meta to Leverage Augmented Reality Smart Glasses Data for Emerging Use Cases
The news of the Ray-Ban Meta smart glasses AI rollout comes as Meta publicly experiments with how the emerging technology can create new use cases for enterprise sectors. Meta leads this research by working with a shortlist of universities on Project Aria, a prototype AI-powered AR smart glasses device that also aligns with the Meta Orion project.
Like the Orion device, Project Aria aims to enhance research into everyday AR wearables and data-driven services. The Aria framework focuses on how AR environmental data can be applied across various fields, including robotics, healthcare, and automotive. Recent partnerships with universities aim to advance this company-wide research by collaborating with the research divisions of leading institutions that utilize the Aria Research Kit (ARK) to explore new forms of hands-free XR communication.
The University of Bristol is using the ARK to enhance its internal project, Ego-Exo4D, which uses Aria sensors to create 3D models of real-world environments. Ego-Exo4D seeks to leverage Meta’s smart glasses to capture how highly skilled individuals, such as athletes and chefs, interact with their surroundings and to incorporate that environmental data into a robotics framework.
Meanwhile, the University of Iowa is utilizing the Aria framework to develop healthcare services for individuals with hearing loss. Using Aria environmental data, researchers at Iowa can create devices that convey the direction from which sounds originate.
Additionally, researchers at IIIT Hyderabad are working on the Driver Intent Prediction Project, which leverages computer vision to help prevent automotive accidents. Similar to the Bristol project, the team uses Aria gaze data to understand where drivers are looking and anticipate their intent, enhancing awareness of potential accidents.
Meta is also collaborating with Carnegie Mellon University’s Robotics Institute to support the development of NavCog, a healthcare application that provides audio navigation for visually impaired individuals. The partnership allows researchers to reduce their reliance on Bluetooth technology and utilize the Aria framework for better responsiveness to new environments.
This international university research partnership comes as Meta seeks to enhance the applicability of AR data for new uses beyond smart glasses hardware. Meta indicates that Aria has potential applications in research areas such as embodied AI, contextualized AI, human-computer interaction (HCI), and robotics.
Meta first announced Aria at Connect 2020, and the project remains in its early stages, with the official website still under development. As XR technology expands beyond hardware, valuable spatial data will lead to new use cases.
Orion, a Time-Machine of AR/AI Smart Glasses
At Meta Connect 2024, Mark Zuckerberg, CEO and Founder of Meta, introduced a prototype of the highly anticipated Orion device, a pair of AR smart glasses. During the event, Zuckerberg described the device as “the most advanced glasses in the world and our first prototype of full holographic AR.”
A key breakthrough Zuckerberg unveiled was the introduction of neural interface interaction. The Orion glasses enable users to interact through a wrist-worn device that reads the electrical signals the nervous system sends to the hand muscles. Zuckerberg suggested that this technology could allow workers to type documents or control a cursor, stating, “Coupled with the glasses, this will enable a wide range of amazing use cases.” He emphasized that neural interfaces would be part of Meta’s roadmap and attract interest from other companies.
Currently, Meta has not disclosed full details about the device. However, it plans to launch Orion as a developer kit before a consumer release, collaborating internally and with select external partners to further develop and realize the futuristic prototype.
Throughout 2024, it has become increasingly clear that Meta is focused on the future of AR smart glasses. Zuckerberg has expressed a strong belief in the potential of this technology, voicing his optimism at SIGGRAPH this year.
Zuckerberg explained that the company is approaching the development of smart glasses from two angles. “On one hand, we’ve been building what we believe is the necessary technology for ideal holographic AR glasses; it’s truly impressive. We are developing custom silicon and a custom display stack, everything needed for functionality. These are glasses, not a headset. They resemble traditional glasses but are quite different from what you currently wear.”
In a strategic shift, Meta announced the discontinuation of Spark, its AR software development kit, to concentrate on advancing smart glasses. As a result, developers will no longer be able to create AR content using Spark on Meta’s smartphone social media platforms. However, AR content created by Meta will still be available on applications like Messenger, Instagram, and Facebook.
The decision to discontinue Meta Spark generated controversy; nevertheless, it appears to be part of Meta’s strategy to gain greater control over the distribution of AR content across its applications while focusing on future AR smart glasses development.
“We looked at what it would take to build great experiences in MR/AR; we realized that Spark was the wrong technical foundation for that type of work. So we gotta move to what we think is the right one to enable [that],” remarked Meta CTO Andrew Bosworth.