At Meta Connect 2024, Mark Zuckerberg, CEO and Founder of Meta, unveiled a prototype for the much-hyped Orion device, a pair of breakthrough AR smart glasses.
During the event, Zuckerberg called the device “the most advanced glasses in the world and our first prototype of full holographic AR.” Zuckerberg also unveiled a significant breakthrough: neural interface interaction. Orion lets users control the smart glasses through a wrist-worn device that reads the electrical signals the brain sends to the hand muscles.
Following the reveal, Meta CTO Andrew Bosworth released an interview explaining Orion’s journey. He stated:
We just revealed Orion, our AR glass prototype. Building full AR glasses like these with a wide field of view display, wireless AI built-in, running all the core experiences you’d expect, has been something we’ve been working on for nearly a decade now. When we first started this project, teams crunched the numbers, and they thought our chances of building one of these were 10%. So we overcame a lot of odds to build these glasses.
Developmental Hurdles
According to Bosworth, Orion’s display and the field of view “were the biggest challenges by far.” He also stated that to enable high-quality AR displays, Reality Labs engineers had to find ways to “bend beams of light in ways that frankly they don’t like to bend.”
Bosworth added:
We had to build an entirely new display architecture, so miniature projectors in the frame of the glasses shoot light into the waveguides, with nanoscale 3D structures printed into the lens that diffract light, showing holograms at different depths and sizes in your environment. We’ve bent this light in a power envelope measured in milliwatts.
Bosworth explained that Meta achieved the feat by using silicon carbide rather than glass for the display: “it doesn’t create weird optical artifacts, or catch stray light, and it has a really high index of refraction, which is the key to a large field of view and an efficient use of photons.”
However, despite the abilities of silicon carbide components, the process of manufacturing the displays is “incredibly complex,” Bosworth explained.
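As a rough, back-of-envelope illustration of why a high index of refraction matters (using generic textbook values, not Meta’s own figures), a waveguide can only carry light that strikes its surfaces beyond the critical angle arcsin(1/n); a higher index shrinks that critical angle and widens the band of angles the lens can guide, which is what ultimately limits field of view. The ~75-degree upper bound below is an assumed practical limit on grazing propagation, not a published Orion parameter:

```python
import math

def guided_band_deg(n, max_internal_angle_deg=75.0):
    """Rough angular band (degrees) a waveguide of index n can carry.

    Light propagates by total internal reflection only at internal angles
    steeper than the critical angle arcsin(1/n); an assumed ~75 degree
    practical ceiling keeps bounce spacing manageable. The wider this band,
    the larger the field of view the combiner can support.
    """
    critical = math.degrees(math.asin(1.0 / n))
    return max_internal_angle_deg - critical

for label, n in [("glass (n ~ 1.5)", 1.5), ("silicon carbide (n ~ 2.6)", 2.6)]:
    critical = math.degrees(math.asin(1.0 / n))
    print(f"{label}: critical angle {critical:.1f} deg, "
          f"guided band ~ {guided_band_deg(n):.1f} deg")
```

Under these illustrative numbers, silicon carbide offers roughly half again more angular headroom than ordinary glass, which is consistent with Bosworth’s point that a high index of refraction “is the key to a large field of view.”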
Bosworth said that another challenge in developing Orion was heat dissipation: “You can’t stuff a fan into a pair of glasses, so the only way to get rid of heat is by radiating it away. But this is a really small device, and it’s right next to your head, so it can’t draw away massive amounts of heat.”
To dissipate that heat without compromising form factor and wearability, Meta “chose magnesium, which is also used in spacecraft for the same reason,” Bosworth remarked.
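A quick Stefan–Boltzmann estimate shows how tight that constraint is. Using assumed, illustrative numbers (roughly 100 cm² of exposed frame surface held near a skin-safe temperature in a room-temperature environment), passive radiation can only shed on the order of a watt:

```python
# Back-of-envelope radiative cooling estimate for a glasses-sized surface.
# All numbers below are illustrative assumptions, not Orion's actual specs.
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/(m^2 * K^4)
emissivity = 0.9         # typical for painted or anodized metal surfaces
area_m2 = 0.01           # ~100 cm^2 of exposed frame surface (assumed)
t_surface_k = 318.0      # ~45 C, near the limit for comfortable skin contact
t_ambient_k = 298.0      # ~25 C room temperature

radiated_watts = emissivity * SIGMA * area_m2 * (t_surface_k**4 - t_ambient_k**4)
print(f"Radiated heat: ~{radiated_watts:.1f} W")   # roughly 1.2 W
```

Convection adds some margin in practice, but not orders of magnitude, which is why spreading heat through a thermally conductive magnesium frame and keeping the compute budget in milliwatts matters so much.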
Bosworth added:
The custom wireless stack is designed to work in burst mode to reduce the amount of heat generated. We had to build really specialized custom silicon that’s extremely power efficient and optimized for our AI, machine perception, and graphics algorithms. In fact, we built over 10 custom chips and dozens of highly custom silicon IP blocks. This lets us take algorithms like hand and eye tracking and SLAM, which normally take hundreds of milliwatts of power, and run them at just a few dozen milliwatts.
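To see why that milliwatt budget matters, a simple bit of arithmetic (with an assumed, hypothetical 1 Wh glasses-class battery and per-algorithm figures taken loosely from Bosworth’s “hundreds of milliwatts” versus “a few dozen milliwatts”) shows the difference between a device that lasts about an hour and one that lasts most of a day:

```python
# Illustrative battery-life arithmetic for always-on perception workloads.
# Battery capacity and per-algorithm power draws are assumptions for this
# sketch, not Orion's actual numbers.
battery_mwh = 1000.0            # assume a ~1 Wh glasses-class battery

generic_silicon_mw = 3 * 300    # hand tracking, eye tracking, SLAM at ~300 mW each
custom_silicon_mw = 3 * 30      # the same workloads at ~30 mW each

print(f"Generic silicon: ~{battery_mwh / generic_silicon_mw:.1f} h of runtime")
print(f"Custom silicon:  ~{battery_mwh / custom_silicon_mw:.1f} h of runtime")
```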
Form Factor and Design
When it comes to form factor, it appears Meta is pushing to make the Orion end-product as small as possible. This is a key factor in use and ubiquity, as a small form factor will be less intrusive for everyday and long-term usage.
“This is the largest field of view available in the smallest glasses form factor today,” noted Bosworth, who said that Orion is “just a pair of glasses that you’d be comfortable wearing.”
Within that small form factor, Meta is packing a powerful technology stack, including “optical alignment at the accuracy of 1/10th the width of a human hair,” seven cameras and sensors within the frame rims, and EMG neural interface integration.
Speaking on the much-touted neural interface, Bosworth added:
We needed a way to interact with these devices as well, something you can do while you’re on the go and heads up. This is where EMG and gaze come in. The wristband detects neuromotor signals, so you can click with small hand gestures while your hand is resting at your side, and you can select an action just by looking at it. These are more natural, faster, more comfortable, and unobtrusive interactions.
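Meta has not published the details of Orion’s neuromotor decoding, which almost certainly relies on learned models, but the basic shape of the problem, turning a noisy wrist EMG stream into a discrete “click,” can be sketched with a classic envelope-and-threshold pipeline. All sample rates, filter bands, and thresholds below are illustrative assumptions, not Meta’s implementation:

```python
# Minimal sketch of turning wrist EMG samples into a "click" event.
# This is NOT Meta's pipeline; it only illustrates the classic
# filter -> rectify -> smooth -> threshold shape of the problem.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 2000                 # assumed EMG sample rate, Hz

def emg_envelope(raw, fs=FS):
    """Band-pass to the typical surface-EMG band, then rectify and smooth."""
    b, a = butter(4, [20, 450], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, raw)
    rectified = np.abs(filtered)
    window = int(0.05 * fs)                     # 50 ms smoothing window
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

def detect_clicks(envelope, threshold):
    """Return sample indices where the envelope first crosses the threshold."""
    above = envelope > threshold
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1

# Synthetic demo: 1 s of baseline noise with a short muscle burst at 0.5 s.
rng = np.random.default_rng(0)
emg = rng.normal(0, 0.02, FS)
emg[FS // 2 : FS // 2 + 100] += rng.normal(0, 0.5, 100)

print("Click onsets (samples):", detect_clicks(emg_envelope(emg), threshold=0.1))
```

A production system would replace the fixed threshold with per-user calibration and a trained classifier over multiple electrode channels, but this filter-rectify-smooth-detect flow is the common starting point for EMG gesture interfaces.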
Will Orion Go Mainstream?
The overall goal for Orion is to lower production costs to the point where AR smart glasses are affordable and accessible enough to drive consumer adoption on a scale similar to smartphones.
While a consumer version of Orion is still far off, Meta has found considerable success with its AR-lite Ray-Ban smart glasses, which may well serve as a test bed for features and technology destined for the upcoming Orion device. For example, at Meta Connect 2024, Meta showcased new AI integrations for the Ray-Ban smart glasses, including services like live translation.
“There have always been two major questions that needed to be answered to bring AR glasses to the mainstream,” stated Bosworth, who noted that those challenges involve achieving “the fundamental technological breakthroughs” and identifying the right use cases.
Bosworth said:
What can you do on AR glasses that you can’t do on any other kind of device? We feel like we’ve made progress on both of those questions with Orion, and the good news is, you’ll see a lot of new devices from us in the next few years that build upon this work. There’s still plenty to be done to bring this to consumers. We have to bring the cost down and make it easier to produce at scale. Right now it’s still prohibitively expensive as a consumer product, but make no mistake, this is the future.
Optimism from the Meta team is high, with the impressive showcase perhaps offering a glimpse of tomorrow’s technology. “It might be the most advanced consumer electronics device ever attempted,” proclaimed Bosworth.
He also noted that smart glasses will be the “most impactful” new technology since the smartphone. “People have been dreaming about AR glasses like this for a long time, and the dream is finally coming true,” Bosworth added.