Next month, at AWS re:Invent 2024, long-standing Amazon Web Services partner Proto will debut the next evolution of its AI-powered immersive hologram technology. The event, which runs from December 2 to 6, will give attendees a first look at the companies’ latest joint work, following a year of success for Proto’s hologram technology.
At the AWS event, Proto is showcasing a hologram experience in which visitors can observe a generative conversation between two immersive avatars on a 3D display; integrated AWS-powered AI generates the dialogue on the fly, with no pre-scripted lines.
Proto describes the debut as a “milestone” in human-AI interaction. The firm is showcasing the project in re:Invent’s main expo hall, where visitors can interact with immersive holograms of Dr. Swami Sivasubramanian, AWS VP of AI and Data, and Nandini Ramani, VP of Monitoring and Observability.
Real-time AI allows the two avatars to hold dynamic, unscripted exchanges, creating an original narrative between the characters. “This activation represents more than just a technological demonstration—it’s a glimpse into the future of human-AI interaction,” said Raffi Kryszek, Proto’s Chief Product and AI Officer.
Kryszek also said:
By combining Proto’s groundbreaking holographic technology with AWS’s advanced AI capabilities, we’re showcasing how multiple AI agents can collaborate autonomously while maintaining natural, engaging dialogue.
Moreover, the Proto solution supports multilingual capabilities for international scalability, dynamic personality adaptation, natural language processing, and voice synthesis.
Partnering with AWS gives Proto access to the firm’s deep technology portfolio and reach. It also allows Proto to use Amazon Bedrock models such as Anthropic’s Claude to drive dynamic conversations, alongside Amazon SageMaker, Rekognition, Translate, the Amazon Titan Image Generator, and Luma AI.
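The two-avatar exchange described above boils down to a turn-taking loop over a chat model. The sketch below is purely illustrative and is not Proto’s implementation: the `call_model` stub stands in for an Amazon Bedrock `InvokeModel` call to Claude (shown, commented out, in the docstring), and the persona prompts are invented for the example. It runs without AWS credentials.

```python
# Persona prompts for the two avatars (invented for this sketch).
PERSONAS = {
    "Swami": "You are a hologram avatar of an AWS AI and data executive.",
    "Nandini": "You are a hologram avatar of an AWS observability executive.",
}


def call_model(system_prompt, transcript_text):
    """Stub for a real model call. With Amazon Bedrock this would look
    roughly like (requires boto3, json, and AWS credentials):

        bedrock = boto3.client("bedrock-runtime")
        resp = bedrock.invoke_model(
            modelId="anthropic.claude-3-sonnet-20240229-v1:0",
            body=json.dumps({
                "anthropic_version": "bedrock-2023-05-31",
                "max_tokens": 256,
                "system": system_prompt,
                "messages": [{"role": "user", "content": transcript_text}],
            }),
        )
        return json.loads(resp["body"].read())["content"][0]["text"]
    """
    # Canned reply so the sketch runs offline.
    return "Let me build on that point."


def converse(turns=4):
    """Alternate the two avatars, feeding each the running transcript."""
    transcript = []  # list of (speaker, text) pairs
    speakers = list(PERSONAS)
    for i in range(turns):
        speaker = speakers[i % 2]
        context = "\n".join(f"{s}: {t}" for s, t in transcript)
        reply = call_model(PERSONAS[speaker],
                           context or "Start the conversation.")
        transcript.append((speaker, reply))
    return transcript


if __name__ == "__main__":
    for speaker, line in converse():
        print(f"{speaker}: {line}")
```

Passing the whole transcript to each avatar on every turn is the simplest way to keep the dialogue coherent; a production system would also stream responses and drive voice synthesis and facial animation from each reply.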
In addition to the AWS technology integrations, Proto is also leveraging HeyGen technology for real-time avatar facial expressions and ElevenLabs’ voice synthesis technology.
Beyond the upcoming showcase at re:Invent 2024, the firm notes that its avatar technology can serve various enterprise spaces, such as education, healthcare, retail, and corporate training.
Edward Ginis, Proto’s Chief Technology Officer, explained:
The future of human-AI interaction will be increasingly collaborative and multimodal. Our presentation at re:Invent shows how we can create experiences that are not just technologically advanced, but genuinely engaging and personally meaningful.
The AI features position Proto to scale its innovation for early enterprise adopters by supporting personalised integrations that are ready for varied deployments.
Proto at MWC 2024
Earlier in the year, Proto and AWS presented hologram avatars at the Mobile World Congress (MWC) 2024 conference in Barcelona, Spain, demonstrating the Amazon Bedrock-powered system with a large-scale kiosk.
During the event, AWS and Proto Holograms showcased a live broadcast featuring AWS VP Matt Wood, who could see and communicate with the MWC 2024 audience in real time, facilitating a question-and-answer session with attendees.
AWS emphasized that this demonstration highlights its “next-generation partner ecosystems” by merging the skills of emerging technology platforms with content providers. This collaboration aims to drive the next wave of AI innovation. AWS’s Bedrock service allows Proto Holograms to select from various base avatar models, customize private data sets, and integrate additional AWS services.
Amazon and XR
The Mobile World Congress (MWC) and re:Invent events highlight AWS’s efforts to enhance its spatial computing and industrial Metaverse portfolio tailored for enterprise clients. This initiative includes integrated support for a wide range of business tools, thereby strengthening the company’s immersive offerings.
In February, AWS announced that developers working on XR and spatial computing could use its Amplify backend solution to improve user experiences and support the deployment of visionOS applications. According to AWS, Amplify lets developers build Vision Pro services more efficiently, backing their XR applications with AWS APIs that provide login services and cloud storage for downloading 3D assets and querying associated metadata.
AWS also provides machine learning tools to ensure the smooth operation of extensive data sets on mobile XR devices. Additionally, it offers Identity and Federation services to secure user data and account functionalities.
In conjunction with Apple’s introduction of the Vision Pro operating system, AWS released its visionOS developer tools, giving its spatial computing developers an early advantage.
As part of its XR developer portfolio, AWS also offers partner solutions, including Mytaverse, SURREAL, and PREVU3D services. Amazon and AWS plan to significantly expand their industrial Metaverse offerings, potentially positioning the company as a market leader by leveraging its established reputation as a provider of digital workplace solutions.
Recently, Amazon announced plans to deploy its new custom smart glasses for delivery drivers. These glasses will utilize AR visualizations to assist drivers in navigating their routes, particularly for on-foot navigation to customer delivery points.
Internally known as “Amelia,” the smart glasses feature a small AR display, an outward-facing camera that confirms delivered packages, Alexa voice commands, and an integrated AI chatbot. Reports indicate that the glasses will provide AR turn-by-turn navigation visualizations, helping drivers walk toward buildings while pointing out obstacles to save time during deliveries.
The AR smart glasses are also designed to enable delivery workers to operate hands-free, eliminating the need for handheld GPS devices. By offering hands-free guidance visualizations, the glasses will let workers receive directions while carrying more packages at once.
These smart glasses specifically target the last 100 yards—or final steps—of package delivery to customers, an area Amazon is keen to optimize. However, despite an encouraging start and vision, the development of the AR device is not guaranteed. Amazon has reportedly indicated that the smart glasses may face delays or could even be shelved if significant challenges arise.
Given Amazon’s high-stakes, high-performance culture surrounding deliveries, various issues could impede the prototype’s progress, including battery life, weight, distracting visuals, and the integration of real-world data. This news comes on the heels of OpenAI and Apple working towards their versions of smart glasses. Thus, even if Amazon’s internal device doesn’t succeed, the company may use third-party devices from various sources.