I just tried Ray-Ban Meta’s latest AI updates — my favorite smart glasses just got a whole lot smarter

Ray-Ban Meta Smart Glasses
(Image credit: Future)

Just over a week ago at Meta Connect 2024, the Zuck announced a bunch of new AI features coming to the Ray-Ban Meta Smart Glasses, alongside a limited edition transparent model that I immediately bought with no regrets. 

Up until this point, Meta AI’s non-existence here in the UK has meant these specs have been nothing more than a fancier-looking pair of Snapchat Spectacles — which gave me a rather interesting bargain-basement Jonathan Van Ness vibe. But not only is Meta AI now finally available in the UK and Australia, I also got to try some of the newest features coming to the best smart glasses.

And put simply, if you thought the multimodal AI features were smart before, you ain’t seen nothing yet. These updates are a quiet revolution, making the glasses less of a question-and-answer machine and more of a legitimately helpful assistant.

Ray-Ban Meta smart glasses (with transition lenses): $379 @ Best Buy

These smart glasses go far beyond their 12MP camera. Meta AI gives them the smarts to answer questions about the world around you (thanks to multimodal AI). And all of this is underpinned by a stellar-looking set of specs that feel great to wear all day.

Actual conversation

Ray-Ban Meta Smart Glasses

(Image credit: Future)

In one of my big predictions for Connect 2024, I had my fingers crossed for more natural interactions with the Ray-Bans. In fact, I laid down the gauntlet for Meta to drop the need to say “Hey Meta, look and…” followed by your question. Well, not to say I’m the Mystic Meg of AI glasses (ask your British mates), but that’s exactly what happened.

The end result is probably the biggest feature update to the smart specs that you won’t even realize is there. Getting help translating a menu, pulling recipe ideas from the ingredients in your kitchen, or finding out more about landmarks is so much more natural without needing to follow the standard prompting rubric.

There are still some moments where you’ll need to be more precise with your request. For example, when I held my Viking compass tattoo out in front of the camera and asked “Hey Meta, what does this tattoo mean,” the response was simply a dictionary definition of the word “tattoo.” Following up with the “look” prompt, I got a fleshed-out description of the details and their significance.

But for the most part, it works, and it works damn well.

Real-life recall

Ray-Ban Meta Smart Glasses

(Image credit: Future)

Ever see that episode of “Black Mirror” where everyone has that eye-based computer to be able to rewind to certain moments and conversations? Not only is that my fiancée’s dream gadget of the future to remind me when I said “yes” to doing that errand I forgot to do, but Meta’s taken a small step towards it with reminders.

There are your simpler reminders — stuff like “remind me to help find news stories for the team tomorrow at 8:30 a.m.,” which it handles with a little notification nudge. But it really gets interesting with the vision element of things.

For example, asking it to remind me to buy this book in two weeks was a seamless process, with Meta AI taking the information off the cover to provide that additional detail. Following on from this, if you have only a vague memory of something, your Ray-Bans can recall it. I asked, “What book am I supposed to buy in two weeks?” And its answer was straight to the point: “You were supposed to buy The Last of Us Part II book in 2 weeks.”

Ray-Ban Meta Smart Glasses

(Image credit: Future)

There are limitations, though. Visual reminders simply remind you to do something with the thing you capture. There’s no option to pin a location to an item, which seems to be a common misconception about the whole “remember where I parked” prompt demo at Connect 2024.

This would be another huge step forward if location data could be used alongside visual data — your glasses nudging you on what direction to walk in to find whatever it is you asked them to remember.

Also, shout-out to being able to ask Meta AI to scan a QR code and send the link to my phone. Most of the time, it’s probably a link you want to open there and then (maybe a gig poster to buy tickets), so you’re better off using your phone. But at the times when you just want to bookmark something to view later, this was immensely helpful.

Outlook

Ray-Ban Meta Smart Glasses

(Image credit: Future)

Unfortunately, not all of the features I wanted to try were there. Live voice translation will not be available until later this year, and I wasn’t able to call a phone number I saw in front of me. But based on what I’ve tested so far (and what I look forward to using more in my day-to-day), Ray-Ban Meta Smart Glasses are just that — smart. 

Not only that, but they’re getting smarter with version 186. This is legitimately the best piece of AI hardware you can buy right now. And while there are still some kinks to work out in terms of misunderstood requests, the new features make these glasses a genuinely helpful companion in your gadget arsenal.


Jason England
Managing Editor — Computing

Jason brings a decade of tech and gaming journalism experience to his role as a Managing Editor of Computing at Tom's Guide. He has previously written for Laptop Mag, Tom's Hardware, Kotaku, Stuff and BBC Science Focus. In his spare time, you'll find Jason looking for good dogs to pet or thinking about eating pizza if he isn't already.

  • Sizzlenasty
Too bad they only make these META glasses in children's & women's sizes! They need to make 58mm+ sizes for men! I know Mark Zuckerberg is the size of a child and all, but come on man... You want to sell these to adult men or not?
  • Pbaz
I love the glasses, but don't yet have confidence in the AI. For example, I asked how long it would take to run a half-marathon at a specific minute-per-kilometre pace. The answer it gave began with "At a pace of x minutes per kilometre..." but it used miles in its calculation instead, so the answer it gave was off by about an hour. While they're great to use, especially for hands-free photos on trail runs, it has a long way to go in providing information you can count on.