Google introduces Android XR smart glasses prototype

Google showcased for the first time a prototype of its next-generation smart glasses built on the Android XR platform. Taking the stage at the TED2025 conference, Android XR lead Shahram Izadi and product manager Nishtha Bhatia demonstrated the device's features to the audience in real time.
The device, which resembles an ordinary pair of glasses, is distinguished by its high-resolution display, microphone, speaker, miniature camera and support for prescription lenses. The glasses work in tandem with a smartphone and are optimised for everyday use.
In the live demo, the glasses were paired with the Gemini AI assistant. Bhatia composed a haiku by voice command, had the assistant recall a book she had glanced at earlier and located her hotel key card, showcasing the device's visual memory and natural language processing capabilities.
Augmented reality navigation further expanded the user experience: by overlaying 3D maps directly onto the wearer's field of view, the glasses take the classic navigation concept a step further. They can also recognise music albums, play tracks and analyse visual content such as diagrams.
Underpinning this hardware and software is the Android XR platform, which Google introduced last December in collaboration with Samsung and Qualcomm. Popular applications such as YouTube, Chrome, Google Maps and Google Photos have been integrated into the platform and adapted for XR.
Google's new glasses are seen as the beginning of a new era in AI-powered wearable technology. So, what do you think? Share your opinions with us in the comments section below.