
Meta just gave its $299 smart glasses their biggest AI upgrade yet, and I'm beyond excited

The newest Ray-Ban smart glasses, powered by Meta AI, can now process your surroundings and answer environment-based questions.
Written by Kerry Wan, Senior Reviews Editor
[Image: A person holding up the Meta Ray-Ban smart glasses. June Wan/ZDNET]

When Meta first launched its Ray-Ban smart glasses, there was one feature that I was excited to try but couldn't. The promise of a multimodal AI device capable of answering questions based on what the user was looking at sounded like a dream wearable, but Meta wouldn't be rolling out that functionality to its $299 smart glasses until "next year." That idealized future may be closer than I anticipated.

Also: Meta's $299 Ray-Ban smart glasses may be the most useful gadget I've tested all year

Today, the company is launching an early access program that will allow Ray-Ban Meta smart glasses users to test the new multimodal AI features, all of which leverage the onboard camera and microphones to process environmental data and provide contextual information about whatever the user is looking at.

How it all works is rather straightforward. You start a Meta AI prompt by saying, "Hey Meta, take a look at this," followed by the specifics. For example, "Hey Meta, take a look at this plate of food and tell me what ingredients were used." To answer the question, the glasses capture an image of what's in front of you and then break down the various subjects and elements with generative AI.

The functionality goes beyond the usual "What is this building?" or "What's the weather like today?" prompts, of course, as Meta CEO Mark Zuckerberg demoed in an Instagram Reel. In the video, Zuckerberg asks Meta AI, "Look and tell me what pants to wear with this shirt," as he holds up a rainbow-striped button-down. Not only does the voice assistant identify the apparel, but it suggests pairing it with dark-washed jeans or solid-colored trousers. (The real question: do tech CEOs actually wear outfits beyond monochromatic t-shirts and dark-colored pants?)

(Side note: Up until today, Meta AI on the Ray-Ban glasses had a knowledge cutoff of December 2022. According to Meta CTO Andrew Bosworth, the glasses now have access to real-time information thanks to Bing.)

Also: Meta rolls out its AI-powered image generator as a dedicated website

Only a small batch of users will receive the new update at first, as Meta plans to collect feedback and refine its upcoming AI features before the official release. To participate, update the Meta View app to the latest version, tap the gear icon in the bottom right of the menu bar, swipe down to "Early Access," and tap "Join Early Access."

I'm not yet seeing anything resembling an early access program in my Android and iOS apps, but you can bet that when the update arrives, I'll be quick to download it and start testing -- because what was already one of the most useful tech gadgets I tested in 2023 is about to become even more useful.
