At TED2025, Google showed off its refined smart glasses with a HUD, though the company described them as "conceptual hardware."
Google's Android XR lead, Shahram Izadi, appeared on the TED stage earlier this month, showing off both the HUD glasses and Samsung's upcoming XR headset, and his 15-minute talk has now been released.
Supercut of the TED2025 demo.
Like Ray-Ban Meta glasses, the device has a camera, microphone, and speakers, but it also adds a small full-color display in the lens. The display appears to be monocular: light refracting out of the right lens can be seen from certain camera angles during the demo, and the field of view looks relatively small.
The demo focuses on Gemini, Google's conversational AI system, including its Project Astra capability: the assistant remembers what it has seen by continuously encoding video frames, combining the video and speech input into a timeline of events, and caching this information for efficient recall.
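Google hasn't detailed how Astra's memory works internally, but the behavior described above (continuously encode frames, merge video and audio into an event timeline, cache for recall) maps onto a simple pattern. Below is a minimal, hypothetical Python sketch of that idea; every name in it, from `RollingMemory` to the ten-minute horizon, is an illustrative assumption rather than anything Google has published.

```python
# Hypothetical sketch of a rolling, queryable context memory in the style
# described for Project Astra. Encoders are stubbed out; a real system
# would use learned video/audio models to produce the embeddings.
from __future__ import annotations

import time
from dataclasses import dataclass, field


@dataclass
class Event:
    timestamp: float        # when the observation happened
    description: str        # e.g. a caption for the encoded video frame
    embedding: list[float]  # vector used for similarity search at recall time


@dataclass
class RollingMemory:
    horizon_s: float = 600.0  # keep roughly ten minutes of context (a guess)
    events: list[Event] = field(default_factory=list)

    def add(self, description: str, embedding: list[float]) -> None:
        now = time.time()
        self.events.append(Event(now, description, embedding))
        # Evict anything past the horizon so the cache stays bounded.
        self.events = [e for e in self.events if now - e.timestamp <= self.horizon_s]

    def recall(self, query_embedding: list[float]) -> Event | None:
        # Return the cached event most similar to the query (cosine similarity).
        def cos(a: list[float], b: list[float]) -> float:
            dot = sum(x * y for x, y in zip(a, b))
            na = sum(x * x for x in a) ** 0.5
            nb = sum(x * x for x in b) ** 0.5
            return dot / (na * nb) if na and nb else 0.0

        return max(self.events, key=lambda e: cos(e.embedding, query_embedding), default=None)
```

In a real assistant, the matched events would be fed back into the language model's context to answer questions like "where did I leave my key card?", rather than being returned directly.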
In the demo, Izadi and his colleague Nishtha Bhatia show off the following capabilities:
- Basic multimodal: Bhatia asks Gemini to write a haiku based on what she's seeing, while looking out at the audience, and it responds: "Faces all aglow. Eager minds await the words."
- Rolling contextual memory: After looking away from a shelf holding various objects, including books, Bhatia asks Gemini for the title of "the white book that was on the shelf behind me", and it answers correctly. She then tries a harder query, simply asking where her "hotel key card" is without giving any hint about the shelf. Gemini correctly replies that it's to the right of the music record.
- Complex multimodal: Opening a book, Bhatia asks Gemini what a diagram in it means, and it answers correctly.
- Translation: Bhatia looks at a Spanish-language sign and asks Gemini to translate it into English, without telling it what language it's in. It succeeds. To prove the demo is live, Izadi asks the audience to pick another language; someone chooses Farsi, and Gemini successfully translates the sign into Farsi too.
- Multilingual support: Bhatia speaks to Gemini in Hindi and it responds in Hindi instantly, without any "mode" or "setting" needing to be changed.
- Taking action (music): As an example of how Gemini on the glasses can trigger actions on your phone, Bhatia looks at a physical album she's holding and tells Gemini to play a track from it. The music starts on her phone and streams to the glasses via Bluetooth.
- Navigation: Bhatia asks Gemini to navigate her to "a nearby park with ocean views". Looking straight ahead, she sees 2D turn-by-turn directions, while looking down reveals a 3D (but fixed) minimap showing the route; a minimal sketch of this view switching follows the list.
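The look-down minimap in the navigation demo suggests the glasses switch UI layers based on head pitch. Here is a tiny, purely speculative sketch of that logic; the threshold and names are assumptions, not anything Google has described.

```python
# Hypothetical pitch-based view switching for the navigation demo: looking
# ahead shows 2D turn-by-turn directions, looking down swaps in the 3D
# minimap. The -30 degree threshold is a guess.
LOOK_DOWN_THRESHOLD_DEG = -30.0


def select_nav_view(head_pitch_deg: float) -> str:
    """Pick which navigation layer to render for the current head pose."""
    if head_pitch_deg <= LOOK_DOWN_THRESHOLD_DEG:
        return "minimap_3d"    # fixed route overview shown when looking down
    return "turn_by_turn_2d"   # heads-up directions when looking ahead
```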
Google Teased AI Smart Glasses With A HUD At I/O 2024
Google teased multimodal AI smart glasses with a HUD at I/O 2024.
This isn't the first time Google has shown off smart glasses with a HUD, nor the first time a demo has focused on Gemini's Project Astra capability. At Google I/O 2024, almost a year ago, the company showed a short prerecorded demo of the technology.
Last year's glasses were noticeably bulkier than the pair shown at TED2025, suggesting the company is actively working to miniaturize the hardware with the goal of shipping a product.
Still, Izadi described what was shown as "conceptual hardware," and the company hasn't announced a timeline for any specific product.
In October, Sylvia Varnham O'Regan of The Information reported that Samsung is working on a Ray-Ban Meta glasses competitor with Google's Gemini AI, though it's unclear whether that product will have a HUD.
Meta HUD Glasses Price, Features & Input Device Reportedly Detailed
A new Bloomberg report details the price and features of Meta's upcoming HUD glasses, and claims Meta's neural wristband will be included in the box.

If it does have a HUD, it won't be alone on the market. In addition to the dozens of startups that showed off prototypes at CES, Mark Zuckerberg's Meta plans to launch its own smart glasses with a HUD later this year.
Like the glasses Google showed at TED2025, Meta's glasses will reportedly have a small display in the right eye and a strong focus on multimodal AI (in Meta's case, the Llama-powered Meta AI).
However, while Google's glasses appeared to be controlled primarily by voice, Meta's HUD glasses will reportedly also be controllable via finger gestures, sensed by an sEMG neural wristband included in the box.
Apple is also working on smart glasses, and reportedly plans to ship a product in 2027.
Apple Reportedly Exploring Smart Glasses Release In 2027
Apple appears to be exploring producing smart glasses, and could reportedly ship a product in 2027.

All three companies hope to build on the initial success of Ray-Ban Meta glasses, which recently passed 2 million units sold and whose production is being significantly ramped up.
Expect the smart glasses competition to be fierce in the coming years, as the tech giants fight to control the AI that sees what you see, hears what you hear, and can project imagery into your view at any time.