This week, reports from Reuters highlight how Amazon plans to use AR smart glasses to optimise its delivery operations.
Amazon is planning to deploy new bespoke smart glasses for its delivery drivers. The glasses will leverage AR visualisations to guide drivers on their routes, seemingly for on-foot navigation to a customer's drop-off point.
Internally codenamed “Amelia,” the smart glasses combine a small AR display, a built-in outward-facing camera that confirms delivered packages, Alexa voice commands, and an integrated AI chatbot.
The report states that Amazon’s smart glasses will create AR turn-by-turn navigation visualisations, including guidance for walking towards buildings and alerts that point out obstacles, in a bid to save time during deliveries.
Moreover, the AR smart glasses aim to help delivery workers operate hands-free by eliminating the need for handheld GPS devices. With guidance visualisations that quickly show workers their directions, they are also free to carry more packages.
Speaking to Reuters, an Amazon spokesperson added:
“We are continuously innovating to create an even safer and better delivery experience for drivers. We otherwise don’t comment on our product roadmap.”
What Does the Future Hold for Amazon Smart Glasses?
The AR smart glasses explicitly target the last 100 yards, or final steps, of delivering a package to a customer, an area Amazon is broadly attempting to optimise.
However, despite a promising start and vision, the AR device is not set in stone. Amazon reportedly states that the smart glasses may be delayed or even shelved if hurdles arise.
With the high-stakes, high-performance culture around Amazon deliveries, many issues could halt the prototype device, including insufficient battery life, weight, distracting visuals, and the accompanying real-world data the experience requires.
The news comes as OpenAI and Apple also work towards a smart glasses future. So, even if Amazon’s internal first-party device does not succeed, the firm could still leverage a third-party device from one of several sources.
Recently, news outlets have reported that technology leaders Apple and OpenAI are moving towards AR smart glasses, following a surge of interest in the immersive hardware sector in 2024.
Caitlin Kalinowski, Meta’s former Head of AR Glasses Hardware—who was involved with developing the Orion project—has made a significant career shift. She has left Zuckerberg’s company to join OpenAI’s technical staff, where her work will focus on robotics and consumer hardware.
Kalinowski said her move aims to use robotics to bring “AI into the physical world,” which may indicate the development of an AR device. The news follows OpenAI’s hiring of former Apple design chief Jony Ive last year, tasked with designing a new hardware product to enhance the company’s AI platform.
These key hires at OpenAI suggest that the firm may be conducting research and development on an AI-enhanced AR device or something similar.
AR smart glasses are increasingly seen as a consumer solution to enhance AI’s functionality in everyday life. The successful AR-lite Ray-Ban Meta smart glasses are an experimental platform for such technologies, and Meta presented new AI use cases for the device during Connect 2024 a few months ago.
Apple is the subject of ongoing speculation about its next move in extended reality. Most recently, Apple has reportedly been researching AR smart glasses, aligning with the trends of 2024. According to sources, Apple is running an internal project codenamed “Atlas,” which gathers feedback from Apple employees on AR smart glasses, including competitors’ devices.
Although plenty of big names could potentially provide a third-party AR option for Amazon, the firm is also working on a second generation of its Echo Frames smart glasses, poised for 2026, which could intertwine with the Amelia project.