Google's prototype of AI-integrated smart glasses is called Project Aura. Photo: Xreal.
According to the blog post, Google officially announced two types of AI-integrated smart glasses to compete with Meta in 2026: a version with a built-in display and another that focuses solely on audio.
Specifically, during a demo at its New York office, Google showcased several AI glasses prototypes, along with an early test model developed in collaboration with Xreal, codenamed Project Aura.
Similar to Meta's popular Ray-Bans, Google's smart glasses will connect wirelessly to smartphones and rely on the phone to handle demanding tasks. Letting the phone handle most of the work allows the glasses to be thin and lightweight, much like regular eyeglasses.
Another highlight is that the glasses use Google's Gemini AI assistant to process requests, such as playing music from YouTube Music or analyzing ingredients in front of the user to suggest recipes.
In terms of display, the smart glasses prototypes being tested include a single-lens type, with a screen integrated into the right lens, and a dual-lens type with two screens.
Both models support augmented reality (AR) overlays for apps like Google Maps and Google Meet. The dual-lens design reportedly provides a larger virtual screen.
The first AI glasses produced in collaboration with Google are expected to launch in 2026. Samsung, Warby Parker, and Gentle Monster are among the company's initial hardware partners, although the final design has yet to be revealed.
These new products, along with Google's Android XR operating system, represent a more refined and calculated approach to smart glasses than Google Glass, a once-groundbreaking product that failed with consumers a decade ago due to its odd design, poor battery life, and privacy concerns.
Source: https://znews.vn/google-dua-ai-len-mat-kinh-post1609646.html