Meta Connect 2025 kicked off Wednesday night with Mark Zuckerberg donning Meta’s latest pair of Ray-Ban smart glasses before the broadcast switched to “Mark’s POV,” showing his view through the lens display. You could feel the confidence in his step as he walked to deliver the keynote, fist-bumping Diplo along the way. The event ended less than an hour later with Zuckerberg’s shoulders slumped after the latest Meta Ray-Bans failed every live demo.

From Meta’s Menlo Park campus, Zuckerberg revealed three pairs of Meta smart glasses that had already been accidentally unveiled on the company’s YouTube channel earlier this week.

The irony of AI proving comedy is not dead

Meta Ray-Ban smart glasses in black

Source: Meta

First up was the latest-generation Meta Ray-Ban, still following the iconic Wayfarer design. Zuckerberg said the glasses now have double the battery life and 3K video recording. They’re $379 and on sale right now.

One innovation is an accessibility-friendly feature called Conversation Focus that will come to all existing Ray-Ban Metas with a software update. Conversation Focus amplifies the voice of whoever the wearer is speaking to in a noisy space.

Zuckerberg played a video of street style photographer Johnny Cirillo of Watching New York demonstrating how it works. The preternaturally cool Cirillo strolled up to someone on a busy New York City street and, to tune in to their conversation, said, “Hey, Meta, start Conversation Focus.” The words of his companion grew louder as the street sounds faded into the background.

There was no such smoothness when things shifted to a live demo of a new Live AI feature. Zuckerberg said Live AI is designed to provide AI assistance in nearly every aspect of the lives of those who wear Meta’s Ray-Bans. He did hedge, saying the feature is “not there yet” because it cannot run all the time, though it can currently be used for an hour or two straight.

LiveAI demo fails on the first prompt at Meta Connect 2025. #Meta #AI #LiveAI — Shacknews (@shacknews.com), Sept. 18, 2025

To show off its current capabilities, Zuckerberg chatted with a chef appearing on a screen and asked him to use Live AI to make a Korean steak sauce with the ingredients laid out before him.

“Hey, Meta, start Live AI,” the chef said and then asked it to help create the condiment. Live AI drifted off into a description of what is generally in the recipe and the nervous-looking chef bounced from foot to foot, and asked, “What do I do first?”

After a very long pause, Live AI said, “You’ve already combined the base ingredients.” The chef tried his request again and Live AI, like a culinary HAL 9000, repeated, “You’ve already combined the base ingredients.” The chef quickly tried blaming the Wi-Fi, a cue Zuckerberg picked up on and repeated over and over again as he grew visibly anxious.

Sports saves the day

Oakley Meta Vanguard in color Prizm

Source: Meta

Zuckerberg moved on to more stable territory with a pair of glasses that did not rely on a live demo, the new Oakley Meta Vanguard. The shield-style glasses are designed for performance.

Zuckerberg said that someone can run two marathons in them without running out of battery. The camera is centered for shot alignment, there’s a 122-degree field of view, and video capture is 3K, with slow-motion and hyperlapse features that will roll out to all of Meta’s smart glasses. There’s also wind noise reduction for calls, so surfing shouldn’t stand in the way of handling meetings. New partnerships with Garmin and Strava let users capture workout data and share it with those communities. Zuckerberg said the Oakleys are the most water-resistant glasses Meta makes, with an IP67 rating, and that he has taken them out surfing.

The Oakley Meta Vanguard sells for $499. It can be pre-ordered now and ships on Oct. 21.

It’s all in the wrist

pair of gray Meta Ray Ban Displays with gray wristband

Source: Meta

Last up was the Meta Ray-Ban Display, the glasses Zuckerberg strutted to the stage in. They have the look of a chunkier Wayfarer and feature new technologies, including a high-resolution display on the right lens and a whole new way to interact with glasses: a wristband called the Meta Neural Band. He said the set is $799 and will be available only in stores starting Sept. 30.

Zuckerberg looked proud again as he hyped the glasses’ display, which is large enough to watch a video or read a few messages and disappears when not in use. What’s on screen is sharp, at 42 pixels per degree, and bright, at 5,000 nits.

But Zuckerberg really wanted to show off what he was wearing on his wrist: the Neural Band, which he called “a huge scientific breakthrough” that can control what’s on the display through barely perceptible movements. He said it has 18 hours of battery life and is water-resistant.

And then came the moment Zuckerberg undoubtedly looked forward to for perhaps years but now seemed to dread. “We’ve got the slides,” he said and paused. “And the live demo.” The audience cheered the words “live demo” like the prospect of lions in the Colosseum and in a shaky voice, Zuckerberg said, “Let’s do it live.”

The demo kicked off with a notification from Meta Chief Technology Officer Andrew Bosworth. “Boz is messaging me,” Zuckerberg said. He typed back a message with a few subtle taps on a surface and told the audience that he is now up to 30 words a minute. Over text, the two agreed to a video call.

A loud ring echoed through the room and Zuckerberg repeatedly tried to answer it. Finally, he said to the audience, “That’s too bad. I don’t know what happened.” He tried again and again. “This is uh. It happens,” he said. “Let’s go for a fourth.” The ringing continued. And continued. “Five times,” he said. “I don’t know what to tell you guys.”

Boz then emerged from backstage, and the two pivoted to showing off how the glasses can essentially put subtitles on in-person conversations. After a glitchy start, the feature kicked in and transcribed everything Zuckerberg was saying onto the display of Boz’s glasses. Boz said the glasses could also handle live language translation in text, but there was no demonstration.

Zuckerberg then took a few photos of Boz to show how the display feature lets people preview shots before they take them and go through them after.

To end things, a pre-taped video of friends bumping into each other played, showing how Live AI on the Ray-Ban Display could be used agentically but naturally to follow up on everything they discussed. With Ray-Ban Display, you will never again be able to say “let’s have coffee” to that person you ran into and have things just end there.

All kidding aside

While the demos failed, reviews of Meta’s smart glasses are out, and they are largely positive. The Verge’s senior wearables reviewer Victoria Song titled her review of the Meta Ray-Ban Display, “I regret to inform you that Meta’s smart glasses are the best I’ve ever tried.”

There is tremendous value for many in the accessibility features that Meta offers with its standard Ray-Ban model and the Display. Conversation Focus and transcription are a huge help for those with hearing impairment or those who have difficulty focusing. And Live AI could presumably assist those with visual impairment in navigating, interpreting, and interacting with the world around them.

Meta’s smart glasses are currently the most popular of their kind and these latest offerings look like they’ll cement that status.