In May 2024, the Google I/O event revealed a few interesting insights into the tech giant’s future product roadmap. The company mainly highlighted AI, introducing new versions of Google Gemini, tools for building AI assistants, and more. With Project Astra, Google introduced its vision for the future of smart assistants.
According to the company, Astra is an “AI agent” for everyday life that can leverage your phone’s camera and voice recognition software to aid you with day-to-day tasks. However, what really made the concept of Project Astra so exciting (at least for extended reality fans) was the idea that it could power a new set of Google smart glasses.
At the event itself, Google even showed the assistant in use on a pair of prototype smart glasses – but didn’t reveal much information about an impending set of specs. Now, however, we have a lot more information about Google’s XR roadmap. Here’s what you need to know about Project Astra, and the potential smart glasses Google is currently testing in real-world scenarios.
What is Project Astra? Google’s Universal Assistant
Project Astra, introduced at Google I/O in 2024, is a research prototype Google is using to create the “ultimate” AI assistant for everyday users. Astra represents just one of many artificial intelligence announcements from Google in the last year. During I/O 2024, Google also introduced some new versions of Gemini, like Google Gemini 1.5 Flash.
The company also showcased a new AI video generation tool (Google Veo), which is now rolling out to developers on platforms like Vertex AI. At a glance, Project Astra didn’t seem to introduce anything particularly new. Plenty of other companies have been experimenting with AI assistants – Microsoft with Copilot, for instance.
However, even as a standalone AI solution, Astra has some unique benefits – starting with its multimodal capabilities. Astra can understand and respond to the world much like a human being: it can “see” images and listen to spoken language, meaning people can interact with the assistant in a variety of ways.
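Project Astra itself isn’t available to developers yet, but the underlying multimodal idea (an image and a question going into the same model) can already be tried with the public Gemini API that Google offers. The short Python sketch below is purely illustrative rather than Astra’s actual implementation: it assumes the google-generativeai SDK, a placeholder API key, and a hypothetical local photo called street_sign.jpg, and simply asks Gemini 1.5 Flash to read and translate whatever is in the image.

```python
# Illustrative only: Project Astra has no public API, so this uses the
# standard Gemini API (google-generativeai SDK) to show the same
# image-plus-text "multimodal" pattern described above.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")  # placeholder, not a real credential
model = genai.GenerativeModel("gemini-1.5-flash")

# A photo standing in for what a pair of smart glasses might "see".
image = Image.open("street_sign.jpg")  # hypothetical local file

response = model.generate_content(
    ["What does this sign say? Translate it into English for me.", image]
)
print(response.text)
```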
Perhaps most impactful, however, was how Google showed people interacting with Astra. Not only could people access the assistant through a smartphone, but testers at the Google I/O event also showed Astra in action on a pair of prototype smart glasses. This led many industry analysts to theorize that Google could be working on its own answer to the Ray-Ban Meta specs – another pair of smart glasses with a built-in AI assistant.
Project Astra: Google Delays and Updates
Originally, Google promised that people would be able to start experimenting with Astra through the Google Gemini app sometime in 2024. However, during a Q3 earnings call, Sundar Pichai (Google’s CEO) said that the release timeline had been pushed to 2025.
Pichai didn’t share any specific reasons for the delay. He simply said that the company is working on making the solution more intuitive, and that it hopes to start delivering Astra experiences to users in 2025. Google’s team has already shared a few insights into how it’s improving the Astra experience.
In December 2024, the company said that Astra is now better at understanding multiple languages, such as French and Tamil, can mix those languages in a single conversation, and can handle a range of accents. The assistant can also tap into Google apps like Maps, Lens, and Search, and has ten minutes of in-session memory for more natural, context-aware conversations.
Plus, with new streaming and audio understanding capabilities, Astra can now respond at roughly the pace of a natural human conversation.
But updates to Astra’s AI capabilities aren’t the only things Google has revealed lately. The company also introduced the Android XR operating system in December – a brand-new platform designed specifically to power XR devices like headsets and smart glasses.
This operating system will be responsible for powering the AI and XR capabilities of Google’s own glasses, which could indicate that we’re closer to a new set of Google smart glasses than we thought. The question is: will Google build these glasses itself, or will it simply develop the software and development kits partners need to power third-party specs?
Project Astra: What We Know about the Prototype Glasses
We still don’t have much information about what Google’s smart glasses will look like, what they might do, or whether they’ll ever be released to the public. However, we do now know that Google is testing prototype Project Astra glasses running the Android XR operating system.
In a press briefing ahead of the Gemini 2.0 launch, Bibo Xu, a product manager on the Google DeepMind team, said that a small group would be testing Project Astra on smart glasses. This group will be made up of members of Google’s Trusted Tester program.
The glasses are likely to be similar to the ones seen at I/O in 2024. As mentioned above, they’ll run the new Android XR operating system – which will also power Samsung’s XR headset (Project Moohan), set to be released in 2025.
Google even shared a demo video showing how its prototype glasses can use augmented reality (through Android XR) and Project Astra to complete various tasks. For instance, the prototype glasses can translate posters or signs a user might be looking at. They can even help you find things you’ve misplaced around your house or allow you to read texts without looking at your phone.
When Will Google Release its New Smart Glasses?
Notably, while Google is definitely testing Project Astra smart glasses – that doesn’t necessarily mean we’re actually going to see a new set of specs from the company. Google definitely seems excited about the concept of AR smart glasses. Even the Project Astra website features a section highlighting how the AI assistant can power smart spectacles.
Creating a pair of smart glasses for its Astra assistant does make a lot of sense. Google’s own product development team has said that glasses are the perfect “hands-free wearable” for users looking to interact with artificial intelligence on the move.
However, Google’s trusted testers have experimented with plenty of the company’s prototypes that never reached the public. By testing Project Astra, Google might simply be gathering insights into how it can build better solutions for third parties that want to create new specs using its technology.
After all, Android XR, Google’s new operating system, isn’t just there to help Google enter the XR space. It’s an open platform intended to give countless partners (including Samsung) access to innovative AI and AR development tools.
Google’s spokespeople have already told various publications that the company has no timeline for a consumer launch of the Project Astra prototype glasses. We don’t have any real information about the technology in the specs (beyond the operating system they’re using), or their features.
Looking Ahead with Google and AR
The fact that Google is testing Project Astra glasses with a select group of users, and building out its XR ecosystem with Android XR, is exciting. However, it’s still uncertain whether Google will actually build and release a pair of augmented reality glasses itself.
Google has said that smart glasses and headsets are the “next generation of computing”, and smart glasses in particular have earned a lot of attention in the last year. However, while Meta’s Orion prototype glasses seem to be driving Meta’s own push toward AR hardware, Google’s Astra prototypes could serve a very different purpose.
There’s a good chance Google is using these glasses for research purposes without any intention of building any hardware itself. Astra could simply form a component of the comprehensive Android XR ecosystem that Google is building to support third-party partners.
If Google’s Project Astra ever does make its way into smart glasses, we might see it emerging in solutions created by Samsung and other companies first. After all, Samsung already seems to be working on a pair of smart glasses. It’s also one of the first companies partnering with Google to use the Android XR platform to develop its headset – Project Moohan.
It makes sense that Samsung would want to continue leveraging Google’s proprietary technology for its smart glasses too. For now, customers looking forward to a new “Google wearable” probably shouldn’t hold their breath. The good news is that Google definitely will be bringing its technology to the XR market – the company may simply be taking less of a “hands-on” role with hardware.