Google smart glasses just got closer to reality with huge Project Astra update
The universal AI agent gets better
While OpenAI is meting out its big AI announcements over its 12 Days of OpenAI event, Google has decided to drop the whole kit and caboodle on us by announcing Gemini 2.0. And we just got another sneak peek at Google's new prototype smart glasses.
Google claims that it is entering the "agent era" with Gemini 2.0, revealing a plethora of AI agents and the ability to create your own. One of the more interesting agentic announcements revolves around Project Astra, which was first teased at Google I/O back in May, where it stole the show.
Project Astra is meant to be a "universal AI agent" that works on everyday tasks via your phone's camera and voice recognition software, and Google plans to bring it to smart glasses a la Meta's Ray-Bans. Google CEO Sundar Pichai teased Astra in a video on X, and the company posted a YouTube video showing Astra in action.
During I/O, the company showed off potential Astra capabilities working via smart glasses. While the AI agent is supposed to come to phones first, the smart glasses teaser showed how it might work on other platforms.
Google claimed that Astra could understand and respond to the world "like people do." As with many AI reveals this year, Astra was shown understanding and responding in conversational language. The model demonstrated impressive spatial understanding and video processing during the May demo.
Initially, Google promised that Astra would be released before the end of the year. However, during an October earnings call, Pichai revealed that Astra had been delayed into 2025. So we weren't expecting much Astra news before the year was out.
As part of today's announcement, Google revealed several updates to the model:
- Better dialogue - Astra can understand multiple languages, like French and Tamil, and mix them, including understanding accents and "uncommon words"
- Google apps access - Astra can access Google Maps, Lens and Search to answer prompts
- Improved memory - The agent now has 10 minutes of in-session memory and can remember previous conversations for more personalized responses
- Lower latency - Astra can "understand language at about the latency of human conversation" thanks to new native audio understanding and streaming capabilities
Google did not provide a firmer timeline for when people might get to see Astra in the real world.
However, in Google's announcement post, the company revealed that its tester program is starting to evaluate Project Astra on prototype glasses. We wonder if those are the new Samsung smart glasses expected to be announced in January, given Samsung's partnership with Google and Qualcomm on a new XR platform.
Those glasses aren't supposed to feature a camera yet, and the glasses in the above video appear to be something else. Meta may be in the lead with smart glasses, but other companies, like Xreal with its new X1-powered AR glasses, are catching up.
Google added that it is working on bringing Astra to the Gemini app and other form factors, hinting at more than just phones and glasses.
Scott Younker is the West Coast Reporter at Tom’s Guide. He covers all the latest tech news. He’s been involved in tech since 2011 at various outlets and is on an ongoing hunt to build the easiest-to-use home media system. When he’s not writing about the latest devices, you are more than welcome to discuss board games or disc golf with him.