These things are less likely to show up at Meta Connect 2025, but they’re still possibilities:
- A third-party Horizon OS headset from ASUS or Lenovo gets a launch date
- Meta shows off its latest VR headset prototypes with next-gen specs
- A new AAA VR title gets announced, similar to Batman or Deadpool
To recap everything I’ve covered so far, here’s what we fully expect Meta to cover at Connect 2025:
- An in-depth look at the new Meta Hypernova, or Celeste, smart glasses with a built-in display
- The Ray-Ban Meta 3 smart glasses, demoing more advanced multimodal AI capabilities
- Meta will talk up its Oakley smart glasses, which just launched
- Meta should announce that it’s opening its smart glasses OS to developers
- New features and demos for Meta AI and/or Llama 4
- New Meta Horizon announcements
Meta Connect 2025 starts today at 8 p.m. Eastern, while folks on the west (best) coast can watch at a more reasonable 5 p.m. Are you getting excited for the event?
I (Michael) will be driving down to Menlo Park, California, today for the keynote, so uber-VR nerd Nick will be taking over the blog in a bit to handle things while I get settled at the event venue.
Feel free to hang out here with us as we cover our final thoughts about the event, or bookmark the page and come back tonight to follow along!
That wraps up our second day of pre-Connect 2025 coverage! Hopefully you have a pretty clear idea of what to expect from the event.
But we’re not done! We’ll keep covering Meta’s Connect plans right up to and through the keynote, so come back soon to hear our live thoughts on every Connect 2025 reveal.
We’ve gone over almost everything that Meta will, might, or won’t cover at Connect 2025 this year. My last guess is that Meta might ape Google — which always invites a celebrity on stage or in prerecorded skits to drum up excitement — and have one of its athletic partners show up wearing the new Oakley glasses.
Likewise, Meta will reportedly spend $65 billion total on AI upgrades in 2025, so it’s an absolute certainty that Zuckerberg will trot out numbers on how much smarter Meta AI has become and how it beats competitors like Gemini and GPT-5.
We may see more info on current AI tricks like personalized Meta AI avatars, but most likely we’ll see unannounced features, both for the mobile Meta AI app and for Ray-Ban/Oakley glasses.
Most of the above info is educated guesswork. But I would bet any amount of money that Zuckerberg will spend at least a few minutes talking about Horizon Worlds.
He’ll bring up any major changes coming, plug the Creator program for developers, mention a couple of viral success stories, and emphasize that Horizon Worlds is available on mobile, not just VR.
Could we see an appearance from James Cameron? The legendary filmmaker has teamed up with Meta to make “world-class 3D entertainment experiences spanning live sports and concerts, feature films, and TV series featuring big-name IP on Meta Quest,” and he’s a huge fan of VR films in general.
We could see Meta showcase a big-name film IP on stage as a Quest exclusive to drum up interest, as it does with AAA VR games like Batman: Arkham Shadow and Deadpool VR. It also recently revamped its TV app to focus more on pro 3D content.
But again, Meta may wait until after it announces “Puffin” for major 3D film reveals.
Another longshot possibility is that Meta will show off its new prototype Quest headsets, either during the keynote or elsewhere at the conference.
During SIGGRAPH 2025, Meta showed off a Boba 3 headset with a 180-degree field of view — the Quest 3 hits 110 degrees — and a Tiramisu prototype that offers an insane 90 pixels per degree, more than enough to deliver 20/20-level clarity.
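For context, 20/20 vision is usually benchmarked at resolving about one arcminute of detail, which works out to roughly 60 pixels per degree. Here’s the back-of-the-napkin math behind that comparison (my own illustration, with a ballpark Quest 3 figure, not numbers Meta has published):

```python
# Back-of-the-napkin check: how many pixels per degree (PPD) does 20/20
# vision imply, and how does the Tiramisu prototype's 90 PPD compare?
# The 1-arcminute acuity figure is the standard optometry benchmark and the
# Quest 3 number is a rough ballpark -- neither comes from Meta.
ARCMIN_PER_DEGREE = 60
acuity_arcmin = 1.0                      # 20/20 vision resolves about 1 arcminute

retina_ppd = ARCMIN_PER_DEGREE / acuity_arcmin   # ~60 pixels per degree
tiramisu_ppd = 90
quest3_ppd = 25                          # roughly where the Quest 3 sits today

print(f"20/20 threshold: ~{retina_ppd:.0f} PPD")
print(f"Tiramisu: {tiramisu_ppd} PPD ({tiramisu_ppd / retina_ppd:.1f}x the threshold)")
print(f"Quest 3: ~{quest3_ppd} PPD")
```

In other words, Tiramisu crams in half again as many pixels per degree as a 20/20 eye can resolve, while today’s headsets sit well below that line.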
Meta will never sell these, but maybe it’ll want to show impatient VR fans that it’s still developing cutting-edge tech and wants to make VR that’s “indistinguishable from the physical world.”
Even if the Quest 4 or Puffin doesn’t show up, we could still see a new VR headset on stage.
Meta announced last year that it would open up the Horizon OS to other brands, with ASUS and Lenovo signing on as partners who could make their own VR headsets with Quest software and games. ASUS’s headset is supposedly codenamed Tarius and could have key upgrades like local display dimming, eye tracking, and extra RAM.
We’re expecting this gaming headset sooner rather than later, and while Meta isn’t obligated to give stage time to its partners, it may do so in order to make Horizon OS look successful.
We’re seeing a rise in popularity for XR smart glasses from brands like XREAL and Viture that plug into a Steam Deck, Switch, or phone so you can stream movies or game with a 100-inch virtual screen without having to buy a TV that size.
How will Puffin stack up? It’ll have the upside of independent gaming experiences, powered by the external puck, as well as superior FoV in VR mode for a theater-sized screen. But it’ll also be about an ounce heavier, and Meta will need plenty of 3D content to make it compelling.
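If you’re wondering how a pair of glasses conjures a “100-inch screen,” it’s all about angles. Here’s some rough geometry of my own (not XREAL, Viture, or Meta marketing math) showing how wide a 100-inch 16:9 screen looks from typical couch distances:

```python
# Rough geometry behind the "100-inch virtual screen" pitch: how wide a
# field of view does a 100-inch 16:9 screen occupy at couch distance?
# My own ballpark math, not a spec from XREAL, Viture, or Meta.
import math

DIAGONAL_IN = 100
ASPECT_W, ASPECT_H = 16, 9

diag_m = DIAGONAL_IN * 0.0254                                # inches -> meters
width_m = diag_m * ASPECT_W / math.hypot(ASPECT_W, ASPECT_H)  # screen width

for distance_m in (2.5, 3.0, 4.0):
    fov_deg = math.degrees(2 * math.atan((width_m / 2) / distance_m))
    print(f"100-inch screen at {distance_m} m spans ~{fov_deg:.0f} degrees horizontally")
```

That lands in the roughly 30-to-50-degree range, which is about where current XR glasses’ fields of view top out; a headset-class FoV like Puffin’s would blow well past it.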
Either way, we’ll have to wait until Connect 2026 to see Puffin.
Why would Meta fundamentally change its Quest 4 design? At GDC earlier this year, Chris Pruett, Meta’s Director of Games and head of Oculus Publishing, held a Quest panel where he explained that the next generation of Quest gamers would be “mainstream adults” for whom gaming is a “secondary pastime,” and who would rather have a headset that acts as a “really high-end TV.”
Most people don’t want to wear a one-pound headset unless it’s for gaming. To target this “30-something dads” demographic, Meta needs a lighter headset like Puffin that sits closer to the upper weight range of smart glasses, but with the option for immersive VR, not just mixed reality.
In fact, the latest Quest 4 leak suggests Meta has canceled its two “Pismo” prototypes and won’t sell its next major Quest headset until 2027.
Instead, Meta has pivoted to focus on “Puffin,” an XR device with Horizon OS that’ll look closer to chunky glasses than a VR headset. It would weigh about the same as Meta Orion — less than 100g — and even use an external, wireless pocket puck like Orion to power things.
Allegedly, this new device will arrive in 2026. Whether it’s a “Quest” or something else, we probably won’t see it this year at Connect 2025, unfortunately.
We’ve talked a lot about smart glasses, but what about VR and Quest headsets? Meta revealed the Quest 3 and Quest 3S during the last two Connect keynotes; could we see a Quest 4 this year?
I’ll be truly shocked if the answer is yes. Last year, Meta leaks suggested that we’d see a Quest 4 and 4S in 2026, making Connect 2025 a premature time to reveal anything.
How Meta prices its new Ray-Ban smart glasses will impact their success. The current models start at a reasonable $299, while the Oakley Meta glasses start at $399. My guess is that inflation and new tech will bring the price up to $399, or higher with prescription or transition lenses.
We may not see these new glasses in 2025, so close to the Oakley release, but I think Zuckerberg could still choose to tease the new Ray-Bans rather than wait another year. They’re Meta’s most successful product since the Quest 2, and he won’t want to lose momentum on making smart glasses more ubiquitous.
We can look at Meta Oakleys vs. the current Meta Ray-Bans for insight into other changes Meta could make. The most obvious one is that these new Ray-Bans should upgrade from 1080p to 3K cameras with better image stabilization, plus smoother videos on the move.
Whether Meta pulls off “Super Sensing” or not, it will undoubtedly improve on the current four-hour estimate for Ray-Ban battery life. Oakleys hit eight hours with “typical use,” and the Ray-Ban 3s should at least match that.
I assume the only way Meta can pull off more efficient AI is with an upgraded processor. Most likely, it would use custom silicon capable of on-device AI, with enough parameters and a large enough context window to handle things locally without a phone or internet connection.
The Snapdragon AR1+ Gen 1 chip announced earlier this year offers an example of this: It’s capable of running Meta Llama 3.2 with one billion parameters and 128K token context, and of interpreting your surroundings via cameras with a mere 1.2-second delay, much faster than cloud AI.
I also assume this chip is too expensive for the affordable Ray-Ban series — Meta might even use it for the Ray-Ban Meta Display glasses instead — but it’s an example of the type of upgrade Meta would need to make AI less of a battery-killer.
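To make the on-device AI idea a little more concrete, here’s a minimal sketch of what running a roughly 1B-parameter Llama model locally looks like, using the open-source llama-cpp-python bindings. The model filename, context size, and prompt are placeholders of mine, not anything Meta or Qualcomm has shared:

```python
# Minimal sketch of on-device inference with a ~1B-parameter Llama model,
# using the open-source llama-cpp-python bindings. The model file and the
# settings below are illustrative placeholders -- not Meta's or Qualcomm's
# actual glasses stack.
from llama_cpp import Llama

llm = Llama(
    model_path="llama-3.2-1b-instruct-q4_k_m.gguf",  # quantized 1B model, hypothetical filename
    n_ctx=8192,       # context window; the AR1+ Gen 1 reportedly supports up to 128K tokens
    n_threads=4,      # modest CPU budget, as you'd expect from wearable silicon
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a hands-free assistant on a pair of smart glasses."},
        {"role": "user", "content": "What kind of plant am I looking at, and is it safe for pets?"},
    ],
    max_tokens=128,
)
print(response["choices"][0]["message"]["content"])
```

The exact code isn’t the point; the constraint is. The weights, the context window, and the inference loop all have to fit into a few watts and a gigabyte or so of memory living in the frames themselves.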
One additional image showing all the new Meta smart glasses styles seems to include a potential new Optical version of Oakley Meta HSTN smart glasses. The transparent frames on the bottom-left side look quite similar to standard (non-Meta) Oakley HSTN Optical glasses.
Rumor has it that Meta will be splitting the third-generation Ray-Ban Meta Smart Glasses into separate Optical and Sunglasses product lines, so it would make sense for the company to launch additional styles of its other frames with this same naming convention. It’s thought that Optical versions of Meta smart glasses will house additional prescription options, as the current iterations are a bit limited in that regard.
In addition to new Ray-Ban Meta Display smart glasses, we’re also getting a pair of Oakley Meta Sphaera smart glasses! This screenshot comes from the leaked unlisted promotional video from Meta’s own YouTube channel, so there’s little likelihood that these images are incorrect or fake.
Back in early June, I made a mock-up of Oakley Meta Sphaera smart glasses based on descriptions from a leak. Turns out, the final product looks almost identical, sporting the classic Oakley Sphaera visor look with a camera centered in the nose bridge. We don’t have any specs on these…ahem…specs, but we can probably assume they’ll sport ones nearly identical to Oakley Meta HSTN smart glasses that just launched last month.
Based on this short section of the promotional video, Ray-Ban Meta Display smart glasses look incredibly sleek and not at all like the chunky frames of Meta Orion, which I tried last year. That’s a great thing for anyone looking to pick up a pair of impressively functional smart glasses without worrying about looking out of place in public.
Not pictured here is the sEMG gesture band expected to ship with every pair of Ray-Ban Meta Display smart glasses. This gesture band (or bracelet, if you prefer) helps deliver accurate hand tracking input even when the glasses’ cameras can’t see your hand. That’s great for public use, since waving your hands around like Tom Cruise in Minority Report would look a bit strange to anyone who can’t see your glasses’ display.
The display appears to be embedded in the center of the right lens, based on the leaked promotional video from Meta’s channel. Previous leaks told us the display would be in the right lens but framed closer to the bottom half for visibility reasons. Clearly, something changed during development, and Meta decided this was the better location.
In the video, Meta shows off Meta AI’s capability to visually identify plants and give recipe suggestions, as well as the live translation capabilities, overlaying an English translation on a French sign. Lastly, turn-by-turn directions are presented on the transparent display so you can look forward instead of having to peer down at your phone.
Someone uncovered an unlisted video on Meta’s official YouTube channel, revealing what appears to be the entire new smart glasses lineup Meta is set to reveal on stage tomorrow.
First up are the Ray-Ban Meta Display smart glasses, seemingly the final product name for Meta Hypernova. The frames carry the iconic Ray-Ban appearance, along with cameras embedded on the outside edge of each lens. They appear a bit thicker than the existing Ray-Ban Meta Smart Glasses, but not offensively so.
This is the first time we’ve seen the frames sporting the Ray-Ban branding in a high-quality video, which makes this a major leak.
An earlier report from The Information goes into more detail on Meta’s Super Sensing AI plans for AI glasses. Meta AI would “see what its user does and respond in real-time, for hours,” remembering details and recognizing faces while draining much less battery life than the current-gen glasses lose from AI commands.
If the Ray-Ban Meta 3s get this tech, it would be a significant smarts upgrade. But The Information’s report claims we won’t see these glasses until 2026. Meta could still choose to tease them on stage, however, as it did with Orion last year.
You can see “Bellini” above, showing a distinct design to go with the tinted lenses.
According to this report, the third-generation Ray-Ban Meta smart glasses will have improved battery life and smarter on-device AI for things like object detection and scene recognition.
It also claimed that these new glasses would launch by the end of 2025, making a Connect 2025 appearance likely.
Meta Connect 2025 starts tomorrow evening, but we still have plenty of ground left to cover. Let’s talk Ray-Bans!
That image above is an (alleged) official render of the Ray-Ban Meta 3 glasses, with a new design. According to the report, Meta will sell the prescription-lens variant above (Aperol) and a sunglasses version (Bellini) with a different style.
The 2nd-gen Ray-Ban Metas have multiple lens types for each style, so this would be a notable shift.
As you can see, Meta has a lot of ground to cover, and we’ve only talked about the Hypernova smart glasses! Next, I’ll focus more on the prototypes we may see at Meta Connect 2025, from the likely (Ray-Ban Meta Gen 3) to the unlikely (Meta Quest 4).
Our last question may be more important than all the technical ones: Will Meta Celeste have alternate colors, lenses, and designs?
My guess is “no” to all three points; Meta may offer prescription lenses and larger or smaller head sizes, but otherwise, an expensive device like this will probably stick to a single black color and style, with no sunglasses or transition lens option. But that may disappoint anyone used to the more stylish Ray-Ban or Oakley smart glasses.
Will Hypernova offer an alternative option for left-eye dominant users?
Monocular smart glasses have obvious advantages over binocular glasses — they’re lighter, thinner, and cheaper — but Meta CTO Andrew Bosworth has explained the challenges of “binocular rivalry,” where “one eye is seeing something, the other eye isn’t seeing it, and your brain has to reconcile that.”
Some people may not like having a single display, whichever lens it’s in. But others will prefer having it match their dominant eye, and since about 3 in 10 people are left-eye dominant, Meta should cater to them, too.
Back on the topic of the display, what resolution and brightness will the display hit, and how big will it be?
Meta can’t make its display too large or centered without obstructing your vision, which could be dangerous for outdoor use. But Meta will also need to make its postage-stamp-sized display large and clear enough that you can read notifications or Meta AI replies, particularly in sunny outdoor weather.
On a similar note, how heavy and thick will these glasses be?
Meta’s smart glasses weigh about 50g or 1.77oz, very reasonable but still notably heavier than normal glasses. Its Orion prototype weighs twice as much (98g), and many XR glasses like the XREAL One (83g) or Viture Luma Pro (80g) are on the heavy side.
We can assume the monocular Celeste will weigh less, based on the leaked render, but it may still be slightly too heavy to be comfortable, or too thick to properly blend in as normal glasses.
That’s what we “know” about Meta Hypernova, aka Celeste, from leaks. But there remain plenty of questions, starting with battery life.
Ray-Ban Meta glasses last about four hours per charge (and lose capacity over time), while Oakley Meta glasses last up to eight hours. Those estimates fall dramatically when you use Meta AI frequently, and that’s without any display tech to power.
How long will Hypernova glasses last with active use? Meta needs to impress people with the answer before they spend $800+ on them, as there’s no easy way to recharge them on the go. Once the battery dies, they just become heavy glasses that prescription wearers can’t simply take off.
Meta’s AI glasses will reportedly cost $800. According to the report, Meta originally targeted a $1,000 price point, but lowered the price by $200 to make them more appealing to consumers.
The company only expects to sell about 200,000 pairs in two years, well below the millions of Ray-Ban AI glasses sold in the past few years. There are concerns that the “slightly heavier and thicker” build will turn off consumers who like the more stylish Ray-Ban and Oakley models, though that’s a necessary side effect of fitting display tech and a larger battery into the frames.
Meta’s glasses sound quite similar to Google and Samsung’s Android XR prototype glasses, with built-in support for Google apps like Maps, Messages, Calendar, and Gemini visible in a monocular display. Of course, Meta will actually sell its glasses, while Google developed these glasses as a reference design for developers and a proof of concept for the media.
We do expect Samsung to release consumer XR glasses eventually, but it may start with display-free smart glasses in 2026, competing against Ray-Ban and Oakley Meta glasses. So Meta Celeste shouldn’t have much direct competition for a while.
Staying on the topic of Snap Specs for now, Jason England also got to play the popular VR fitness rhythm game Synth Riders on the Snap Specs. It’s an impressive showcase: VR headsets like the Meta Quest 3 pack far more processing power, yet Snap’s smart glasses can run some of the same games.
Synth Riders can be thought of as a “groovy Beat Saber” where you’re essentially dancing with your hands to match lines that appear in your view. It’s less slicing and more grooving, which gives the game its unique hook and a big community, plus lots of DLC support.
Bringing us back to Meta Connect, we’re fully expecting to see games make their way to Meta Hypernova smart glasses. One leak points to a game called Hypertrail that reportedly looks like the arcade classic Galaga. Given the size of the display and the fact that it’s only in one lens, we don’t expect gaming to be a major app category on Hypernova, but that won’t stop plenty of game developers from testing the waters!
Snap (yes, the Snapchat folks) is looking to get ahead of Meta this week by announcing Snap OS 2.0 for its soon-to-be consumer-ready smart glasses. These chunky frames have been around since last year (I saw someone at Meta Connect 2024 wearing them), but the operating system they ran was pretty early.
That changes drastically with the latest update, and our sister site, Tom’s Guide, just got a hands-on with them. In short, these operate similarly to Meta Orion, with displays in both lenses, full spatial computing capabilities, and cameras that give you a unique look at the world around you.
Jason England, who is wearing Snap Specs in the photo above, asked the Specs how to do an ollie on a skateboard and was given a visual, step-by-step guide on where to place his feet and what to do. It’s an impressive view of how spatial computing can enhance what you’d normally type into a Google search, and how having displays on your face frees up your hands to do more.
What exactly will you do with Meta Celeste glasses? According to leaks, the built-in display will default to showing six app icons, including Meta AI, WhatsApp, Facebook Messenger, a camera app, and a maps app. You’ll theoretically be able to see in-depth conversations, camera controls, or turn-by-turn directions floating in your vision.
We can also assume they’ll have the same Meta AI features as Ray-Ban or Oakley AI glasses, but that you’ll be able to see text responses in the display, instead of simply hearing them.
Meta has also publicly discussed its Reality Labs work on custom silicon and chips, and it seems highly likely that Hypernova will use a custom chip to power the display, UI, and on-device multimodal AI processing.
Unlike Meta Orion, which has thick frames and an external puck to power things, Meta Hypernova will have a smaller design with less room for heat dissipation and battery capacity. It’ll be interesting to see how Meta balances power and efficiency — and how much processing is done on your phone, or in the cloud.
Meta’s Hypernova display could theoretically use silicon carbide waveguides, which Meta chose for its Orion AR glasses because the material is a “total game changer” for visual clarity.
Normal glass lenses create a rainbow “disco” of color and must be so thick to work that glasses become “prohibitively large and ugly,” Meta’s engineers say.
This new display tech reduces “ghost images,” offers a wider FoV with better light spread, and conducts heat better. But it’s also prohibitively expensive, so I don’t know if Meta will find an alternative for Hypernova or eat the cost.
An sEMG (surface electromyography) band reads the electrical signals your brain sends to the muscles in your wrist and hand. If you make a pinching gesture to select something on a HUD display, the sEMG band will interpret the motion immediately, whereas camera-based gesture tracking will sometimes miss the input.
This band also creates haptic feedback, so if you’re interacting with a floating UI, Meta can simulate the feeling of “touching” something in mid-air with vibrations.
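To give a flavor of how a muscle signal becomes an input event, here’s a toy sketch that flags a “pinch” when the short-window energy of a simulated EMG channel spikes. Real wristbands run trained machine-learning models across many electrodes; the fake signal, window size, and threshold below are all made up for illustration:

```python
# Toy illustration of turning a muscle-activity signal into a "pinch" event.
# Real sEMG bands run learned models over many electrode channels; the fake
# signal, window size, and threshold here are purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
SAMPLE_RATE = 1000                 # samples per second (assumed)
WINDOW = 50                        # 50 ms analysis window
PINCH_THRESHOLD = 0.25             # RMS level that counts as a pinch (made up)

# Simulate 1 second of baseline noise with a burst of muscle activity in the middle.
signal = rng.normal(0.0, 0.05, SAMPLE_RATE)
signal[400:600] += rng.normal(0.0, 0.5, 200)

for start in range(0, len(signal) - WINDOW, WINDOW):
    window = signal[start:start + WINDOW]
    rms = np.sqrt(np.mean(window ** 2))   # short-window energy of the signal
    if rms > PINCH_THRESHOLD:
        print(f"pinch detected at ~{start / SAMPLE_RATE:.2f}s (RMS={rms:.2f})")
        break
```

The appeal for glasses is that this kind of detection works with your hand resting at your side or in a pocket, somewhere no camera can see it.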
What are the Meta Hypernova glasses, you may ask? According to several credible leaks, Meta will sell monocular smart HUD glasses, codenamed “Hypernova” but possibly named “Meta Celeste.”
Like other Meta glasses, they’ll have built-in cameras for photography and data processing, a mic for voice commands, and speakers for music and AI responses.
As for what’s new, these glasses will have one built-in display in the right lens and Android-powered computational hardware built into the frames. Both Meta AI voice commands and gesture controls, tracked by the sEMG band pictured above, will control the device.
This year, we can expect Zuckerberg to showcase new updates for the Meta AI app, Llama 4, and Horizon Worlds, as well as whatever new AI features Meta has cooked up for its Ray-Ban Meta and Oakley HSTN glasses. Last year’s demos were a bit unpolished with technical hiccups, so we’ll see if Meta can do better.
His major focus, however, should be on the Hypernova glasses, while the Ray-Ban Gen 3s could also make an appearance, even if they don’t launch until 2026.
At Meta Connect 2024, CEO Mark Zuckerberg gave a 45-minute keynote. He introduced the new Quest 3S and discussed Horizon Worlds for about 10 minutes, then spent twice as long demoing new Meta AI and Ray-Ban AI features like live translation, Live AI, or remembering where you parked. He closed out with another 10 minutes exhibiting the Meta Orion prototype and its unique features.
Other Connect developer sessions will focus on creating Meta Horizon worlds in the VR and mobile space, including new generative AI tools to automate the process and how to access Meta’s Creator Fund.
I’m personally curious about the session “Your Android app, now in VR: Meta Horizon OS integration.” Meta’s toughest XR rivals are Google and Samsung, which will offer Gemini and Android apps on Project Moohan when it launches in 2025. It makes sense that Meta would try to court Android devs, but it could be a struggle.
Anyone invited to attend Meta Connect in person will also get to attend developer sessions. Of these, we’re particularly interested in the session on the “new developer toolkit from Meta to be announced September 18.”
While developers can access the open Meta Horizon OS for Quest headsets, Meta’s AI glasses remain closed (for now). Meta could open up its current and future smart glasses’ OS, so third-party devs can have their apps appear on the Hypernova display, or integrate with Meta AI for voice commands.
Opening up its system will help Meta compete with Google’s open Android XR platform for smart glasses and XR headsets.
Meta has already shared the Connect 2025 programming agenda. These are the highlights:
- 9/17, 5 p.m. PT: Connect Keynote. “Join Mark Zuckerberg as he shares the latest on AI glasses and lays out Meta’s vision for artificial intelligence and the metaverse.”
- 9/18, 10 a.m. PT: Developer Keynote. “Hear from executives across Meta on how our latest technologies are creating opportunities for developers to build new experiences for people.”
- 9/18, 10:45 a.m. PT: The future of computing. “As Meta announces its latest line of new products and updates that take the next steps on the journey to the next computing platform, we look even further into the future. Join visionaries Michael Abrash and Richard Newcombe as they reveal the exciting future of glasses with contextual AI, and how Meta is poised to transform the future of computing.”
This agenda makes it clear that we’ll see “new products,” confirming what we already expected. You may only care about the keynote, but “The future of computing” could show smart glasses fans what Meta’s vision is for future hardware generations.