20 Sources
[1]
Meta Connect 2025: What to expect and how to watch | TechCrunch
Meta Connect 2025 -- the company's biggest conference of the year where it unveils smart glasses and VR headsets -- kicks off on Wednesday night. We're expecting the star of Meta Connect 2025 to be the company's new AI-powered smart glasses with Ray-Ban and Oakley, but the company may have some other surprises in store regarding the Metaverse, Quest headsets, or even its broader AI ambitions. Meta says it has sold millions of Ray-Ban Meta smart glasses, and earlier this year, Meta unveiled its latest AI-powered smart glasses with Oakley, which were designed for athletes. Silicon Valley is leaning heavily into AI wearables, and Meta seems to be one of the companies leading the charge. Notably, this is the company's first Connect conference since it started Meta Superintelligence Labs (MSL), its boldest effort yet to develop cutting-edge AI systems under former Scale AI CEO Alexandr Wang. It's possible we'll get some official updates on how that project is going, and we may hear from some MSL executives. As Meta looks to regain its footing in the AI race, this year's Connect feels especially consequential. Meta Connect 2025 starts at 5 p.m. PT Wednesday, with a keynote from CEO Mark Zuckerberg. The event will take place in person at Meta's headquarters in Menlo Park. You can register for free to watch the livestream virtually on Meta's website. Meta's agenda says the keynote will be roughly an hour. If you want to get that Menlo Park feel from the comfort of your living room, you can also access the keynote through Horizon via your Meta Quest headset. You can also access the Meta Connect 2025 keynote on Facebook via the company's official developer page, Meta for Developers. On Thursday, Meta will host a Developer Keynote starting at 10 a.m. PT to discuss the new experiences people can build with its devices. Then, at 10:45 a.m. PT, Meta will host a conversation between Chief Scientist of Reality Labs Michael Abrash and VP of Reality Labs Research Richard Newcombe.
The two Meta executives are slated to discuss the "future of glasses with contextual AI, and how Meta is poised to transform the future of computing." There have already been several leaks regarding what will be announced at Meta Connect 2025. Perhaps the biggest pertains to a new type of smart glasses called Hypernova. A now-removed video on Meta's YouTube channel, spotted by UploadVR, showed a pair of Meta Ray-Ban smart glasses with a heads-up display on the right lens, as well as cameras, microphones, and an onboard AI assistant. The glasses in the video were controlled by a wristband, unveiled at last year's Connect, that responds to subtle hand gestures. The video suggests that Meta will unveil, and perhaps launch, the Hypernova glasses this week. CNBC previously reported that Meta was planning to unveil Hypernova and launch the wristband at Connect 2025. Meta also seems likely to unveil a new pair of smart glasses it developed with Oakley at Connect 2025. The companies are expected to launch a new pair of AI-powered smart glasses in Oakley's Sphaera style, which features a large unified lens on the front -- an ideal shape for runners and bikers. Unlike previous Meta smart glasses, this model has one centered camera above the nose bridge, rather than two cameras on the top corners of the frames. On the VR front, it's unclear whether Meta will release any new Quest headsets as part of this year's Connect. Even though the company itself is named after the metaverse, it seems like that's less of a focus this year. Meta is reportedly developing an ultralight VR headset for launch by the end of 2026. However, the company could wait to show that off at next year's Connect. That said, Meta promises that Zuckerberg will talk about the metaverse in some form. We don't doubt that. As for Meta's AI ambitions, it wouldn't be surprising if Zuckerberg used Connect 2025 as a chance to highlight all the work MSL is doing.
The company's first LlamaCon, its AI developer conference, took place earlier this year before Meta invested billions in Scale AI and hired researchers from around the industry. Right now, Meta's standalone AI app is in a confusing spot where it both controls smart glasses and can be used as an AI chatbot. It's possible the app will also get updates that make it easier to use.
[2]
All the news from Meta Connect 2025
Meta's annual Connect conference is happening on September 17th and 18th, and it seems like there's going to be a lot of focus on smart glasses. CEO Mark Zuckerberg's Wednesday evening keynote will discuss "the latest on AI glasses," the company says. Rumors and a video leak from Meta already indicate that we might see Meta launch glasses that have a display on one lens and that are controlled with a neural wristband, as well as wraparound Oakley smart glasses with a camera on the nose bridge. The company also says that Zuckerberg will be laying out "Meta's vision for artificial intelligence and the metaverse," so there should be some non-smart glasses news to look forward to, too.
[3]
New Meta Ray-Bans leak with 2 major upgrades - now I'm even more excited for Connect
A new Oakley model based on the brand's Sphaera design was also leaked. Did Meta just foil its biggest hardware announcement of 2025? Some would say yes, but it's not all doom and gloom in Menlo Park. Meta Connect this week is expected to be a pivotal moment for the AI wearables industry, as the makers of the popular Ray-Ban smart glasses take the stage to unveil a successor model -- and maybe more. Unfortunately for the tech giant, Wednesday's biggest surprises may have just leaked to the public. A now-unlisted YouTube video, first discovered by UploadVR, showcased an unreleased pair of "Ray-Ban Display" glasses and another based on Oakley's Sphaera design. The wearables look as good as we expected, and fall in line with past reports of Meta pushing for a more advanced, wristband-controlled pair of smart glasses. If the execution of the Ray-Ban Display is as good as the trailer makes it out to be, there's more reason to be excited for Wednesday's keynote than not to be. The current generation of Meta Ray-Bans has become one of the most transformative consumer products of the past two years. Content creators are building careers around the face-worn cameras, everyday users are using vision insurance to buy them, and even tech journalists (myself included) are reaching for these glasses over $1,000 cameras to capture video. But the now two-year-old smart glasses are also showing their age, with recording capabilities, battery life, and their feature set being outmatched by new competitors such as Rokid, Google, and, soon, possibly Samsung. Here's how Meta is leveling things up in 2025. First, the new Meta Ray-Ban Display glasses will feature a monocular HUD that projects navigation pathways, translations, messages, and exchanges with the Meta AI assistant.
Based on the leaked video, the display is a static projection, meaning it won't magically anchor onto what's in front of you like an augmented reality headset would. This isn't a bad thing, as there's less need for components like accelerometers, gyroscopes, or even LiDAR depth sensors, which would otherwise add to the bulkiness of the glasses. The more similar these are to traditional eyewear, the better. Meta is also pitching its new Ray-Bans with a long-rumored EMG (electromyography) wristband. The company demoed the wearable technology during last year's Connect event, alongside its Orion prototype, and my CNET colleague, Scott Stein, said it worked surprisingly well. Stein even believes that it won't be the HUD display or redesigned frames that steal the show this year, but the EMG wristband, which allows users to navigate, type, and interact with the glasses' visual interface with taps, pinches, and swipes. That's promising to hear, considering I'd rather make finger gestures while walking around or sitting down than shout at an invisible voice assistant or wave my arms around in public, as I did with the Apple Vision Pro and Quest 3. The questions I have now are: How long will the wristband last, and what else can it do beyond detecting hand gestures? And are consumers even ready to charge yet another device -- on top of their phones, tablets, laptops, smartwatches, earbuds, and everything else? These questions -- and plenty more -- are exactly why I'll be tuning into Meta Connect this week. If Mark Zuckerberg truly believes the future of consumer tech lies in the frames in front of our eyes, he'll need more than a flashy demo. He'll need to convince consumers it's worth the space in their lives, and convince developers it's worth betting their future on.
[4]
Meta's Smart Glasses With Display Leaked Ahead of Connect Event
Meta's previously announced Oakley HSTN AI Glasses (Credit: Andrew Gebhart) Meta's long-rumored smart glasses with a heads-up display have leaked ahead of a potential reveal at a Connect event this week. The leaked material appears to be a promotional video shot by the company itself. UploadVR, which first reported the video, says that it was briefly available on Meta's YouTube channel. The glasses, reportedly developed under the codename Hypernova, will launch under the Ray-Ban brand. In line with existing rumors, the leaked promo suggests that the display would appear in the right lens of the glasses. It will let you interact with Meta AI, navigate maps, and translate signs, but won't be visible to a person standing in front of you. You'll also be able to control the wearable using a wristband. This could be the electromyography (EMG) accessory Meta teased in 2022. The tech translates your finger and hand movements to actions on a digital screen. In the leaked promo, a person wearing the wristband moved their finger across a surface to type out and send a message. The extended version of the same promo shows a new offering from the Meta-Oakley partnership as well. Those glasses are designed around Oakley's Sphaera lineup and feature a camera at the center of the frame. These don't appear to have a built-in display and could be intended for high-performance athletes. Last month, Bloomberg reported that the display-equipped wearable could start at $800. We should have a clearer picture of that and other leaked products on Wednesday, when Meta CEO Mark Zuckerberg is scheduled to deliver the Connect keynote.
[5]
Meta to debut costlier smart glasses with display at annual Connect event
Sept 16 (Reuters) - Meta (META.O) is expected to double down on AI-powered augmented reality products with new smart glasses at its annual Connect event on Wednesday, even as the company faces scrutiny over its handling of child safety on its social media platforms. At its Menlo Park, California-based headquarters, CEO Mark Zuckerberg is expected to unveil Meta's first consumer-ready smart glasses with a built-in display, a device that analysts predicted will retail for about $800. Internally codenamed "Hypernova," the glasses are expected to be launched as "Celeste," analysts said, and will feature a small digital display in the right lens for basic functions such as notifications. The new glasses are the latest in Meta's effort to stay relevant in the AI race, where it is lagging rivals such as OpenAI and Alphabet's (GOOGL.O) Google, but analysts said the device's hefty price tag could deter buyers. The product will likely be much less advanced than the "Orion" prototype glasses that Meta showcased at last year's event, a device that Zuckerberg called "the time machine to the future." The company did not immediately respond to an emailed request for comment on the new glasses. Meta, which expects to launch Orion in 2027, currently offers two lines of glasses - in collaboration with Ray-Ban and Oakley - that incorporate artificial intelligence features, cameras, hands-free control and livestreaming to Meta's social media platforms, Facebook and Instagram. Zuckerberg, who has poured more than $60 billion since 2020 into Meta's augmented reality unit, has said that smart glasses will be the company's main conduit to integrate superintelligence - a hypothetical concept where AI surpasses human intelligence in every possible way - into everyday human lives.
In the effort to catch up in AI, Zuckerberg has sparked a billion-dollar talent war, aggressively poaching researchers from rivals, while whistleblowers have said Meta was putting profit over user safety. Reuters reported last month that Meta's AI policies allowed its chatbots to engage children in provocative conversations about sex and race, and whistleblowers said earlier this month Meta's researchers were told not to investigate harms to children using its virtual reality technology so that the company could claim ignorance of the problem. Meta told Reuters previously that it has removed portions in its policies that stated it was permissible for chatbots to engage in romantic roleplay with children.
BIG TICKET PRICE MAY DETER BUYERS
At the two-day Connect conference, the company is also expected to launch its first wristband that allows users to control the new glasses with hand gestures. It is also expected to show an updated Ray-Ban line that comes with better cameras and battery life and supports new AI features, analysts said. Meta is the rare Big Tech company to gain consumer traction in smart glasses, selling about two million pairs of the Ray-Ban line it makes with EssilorLuxottica (ESLX.PA) since 2023, in a market where rival bets such as Google Glass have stumbled. But the unit has posted billions in losses. CNBC has reported the Hypernova glasses could feature Prada (1913.F) branding, as the Italian label is known for thick frames and arms that could house many of the necessary components. Prada did not immediately respond to an emailed request for comment. Still, analysts said the expected $800 sticker price for the glasses - much higher than the $299 starting price for the Ray-Ban line and $399 for the sportier Oakley glasses - will mean that the device will have a negligible share of the market. "These glasses will be somewhat bulky ... not the most consumer-friendly design. It is also going to be pretty expensive.
So the volumes are going to be fairly low," said Jitesh Ubrani, research manager for International Data Corp's worldwide mobile device tracker. He estimated the device would sell "a few hundred thousand units at most" but could help get more developers on board to build apps for it. "This is a step to eventually build a much-better mass-market headset." As part of efforts to attract developers, Meta is also expected to open its smart glasses to third-party developers with a new software kit. Reporting by Aditya Soni and Echo Wang; Editing by Sayantani Ghosh and Sonali Paul
[6]
What to expect from Meta Connect 2025 this week: Ray-Ban smart glasses, Hypernova, more
Livestreams on Facebook, Meta's website, and the Quest Horizon platform will be open for public viewing. Meta Connect 2025 will take place on Sept. 17 and 18. The company is expected to showcase several new products and use the event to sharpen its XR strategy toward AI-driven hardware that can be worn and used today. Last year's Connect 2024 conference brought some notable improvements to its Meta Ray-Ban smart glasses, including multimodal video support, live translations, and natural language processing. Meta also previewed advances in Llama 3, showing how its AI research was driving new features. Connect 2024 showed how consumers could continue to embrace AI wearables, setting the stage for Meta to make a bigger leap into display-driven smart glasses and a full developer platform to support them. Here's what's on the docket for 2025, and how to tune in. A device that is set to steal the show is the heavily rumored Hypernova smart glasses, which could ship under the retail name Meta Celeste. The glasses are expected to include a waveguide display built into the lens and a heads-up interface ideal for navigation, notification triage, and supported apps. Having experienced Google's Project Astra smart glasses, which use a similar waveguide display powered by Raxium's MicroLED technology, I'm excited to see how Meta's solution stacks up. A neural wristband, codenamed Ceres, is expected as a control accessory for the glasses. The wristband is said to read tiny electrical signals from the wrist muscles and convert them into commands for the glasses. I tested the Mudra Link wristband controller at CES earlier this year, which offers a similar level of control, and was surprised by its flexibility and precision.
I'm particularly interested in how Meta plans to integrate the two technologies together. The Meta Ray-Ban frames also have the potential for a design refresh with a third-generation release. This time around, the update is expected to offer improved battery life and camera quality. In particular, I'm really hoping for a way to record landscape video with the onboard camera system instead of being locked into vertical orientation as we've seen so far. The glasses are also rumored to come in two versions: sunglasses and optical glasses for everyday wear. Another possible update is reportedly known internally as "super sensing," which would allow the glasses to continuously scan and contextualize the environment surrounding the wearer. This, coupled with any sort of memory, could be incredibly powerful in reminding users of things they might have misplaced or even faces and names they may have forgotten. Perhaps the most important announcement isn't the hardware but the software. The schedule for the event has multiple references to a software development kit that will likely open the platform to third-party developers for the first time. The open SDK could give developers what they need to build apps for navigation, fitness, translation, and enterprise use cases. Opening the hardware up like this could make Meta's smart glasses even more appealing to users, and I'm excited to see what developers come up with when they get the chance. Virtual reality is expected to take a bit of a back seat at Connect 2025, with no new Quest hardware to be announced this year. However, there are rumors that point to a possible debut of the ASUS ROG Tarius headset, the first third-party device to run Horizon OS after it was opened up to development by other OEMs.
The ROG Tarius is rumored to be a premium offering with eye and face tracking, along with upgraded QD-LCD or µOLED displays on board. At the same time, Horizon Worlds is set to receive some impressive updates, including generative AI tools for creators who want to design unique environments using simple text prompts. Llama models will work to integrate interactive characters into those spaces with custom personalities and natural conversation capabilities. A lowering of the technical requirements for such creation could bring fresh creativity to the platform. Meta AI will likely be a major theme throughout Connect 2025, as a consumer product and as the foundation for Meta's longer-term vision. Rumors point to an upcoming wave of character-driven bots tailored to non-English speakers to expand on the reach of Meta's chatbots. It's also possible that Zuckerberg will shed more light on the company's "superintelligence" commitment and how its current efforts tie into the long-term vision. Zuckerberg could also address Meta's slipping timeline for its Llama roadmap and its shift away from its past open-source commitments. The main keynote with Mark Zuckerberg begins Wednesday, September 17, at 5:00 p.m. Pacific time, 8:00 p.m. Eastern time. The developer keynote follows on Thursday morning at 10:00 a.m. Pacific, 1:00 p.m. Eastern. While developers, media, and industry analysts have been invited to attend the event in person, Meta's official Connect website and Facebook Live will offer streams for public viewing. Quest users can also experience it in virtual reality inside Horizon Worlds. ZDNET will be reporting live from the event, so stay tuned for updates.
[7]
Meta expected to unveil new smart glasses at Connect event
Meta is expected to show off artificial intelligence-powered smart glasses at its Connect developer conference Wednesday as CEO Mark Zuckerberg continues to evangelize the glasses as the next step in human-computer interactions. "Last year's Meta Connect was far less metaverse and far more AI. Expect this year's event to be virtually dominated by AI, specifically AI glasses and superintelligence," said Forrester research director Mike Proulx. "While Meta has the head start on AI glasses, competition is chomping at the bit with new entrants." The Menlo Park, California-based company teased a prototype for Orion, which Zuckerberg called "the most advanced glasses the world has ever seen," last year -- but these holographic augmented reality glasses are still years away from being on the market. Instead, analysts expect Meta to show off new smart glasses, likely with a small display controlled by a wristband the user wears. "While more of an experimentation platform, they should enable consumers to access time, weather, notifications, frame and preview pictures, show captions and translate speeches, allow early integration of Meta's assets (WhatsApp and Instagram), and display Meta AI responses," said Forrester's Thomas Husson. Meta is also expected to show off updates to its Ray-Ban Meta smart glasses, which "are likely to have enhanced AI capabilities interpreting the user's surroundings and context," the analyst added. While the company has not disclosed sales figures of the glasses, it said they've been more popular than expected, helped by social media creators. "For more than a decade, Zuckerberg's long-term vision with Oculus and the Metaverse has been that glasses and headsets will blur the lines between physical and digital worlds," Husson said. "After many false starts, the momentum to move beyond an early adopter niche is now." Of course, the company will also likely share AI updates, including to its standalone Meta AI app. 
Like other tech companies, Meta has been making massive investments in AI development and hiring top talent at eye-popping compensation levels. In July, Zuckerberg posted a note detailing his views on "personal superintelligence" that he believes will "help humanity accelerate our pace of progress." While he said that developing superintelligence is now "in sight," he did not detail how this will be achieved or exactly what "superintelligence" means. The abstract idea of "superintelligence" is what rival companies call artificial general intelligence, or AGI. It's the latest pivot for a tech leader who in 2021 went all-in on the idea of the metaverse, changing the company's name and investing billions into advancing virtual reality and related technology. Zuckerberg said he believes AI glasses are going to be "the main way we integrate superintelligence." Zuckerberg will deliver his keynote Wednesday at 8 p.m. Eastern.
[8]
Meta Connect 2025 LIVE -- Ray-Ban Meta Display smart glasses, new Oakley glasses and all the news as it happens
Get ready for a new era of smart glasses, as Meta Connect 2025 is officially kicking off today (September 17) at 5 p.m. PT / 8 p.m. ET / 1 a.m. BST, and we're in for quite the show. Kicking off with a keynote from CEO Mark Zuckerberg, Meta's annual showcase will put a spotlight on AI glasses this time around, and it already looks like we'll get a peek at the leaked Ray-Ban Meta Display smart glasses, the new Oakley Meta Sphaera glasses and more. We may even see an all-new VR headset -- even if it isn't the anticipated Meta Quest 4. In fact, as per the Meta Connect 2025 schedule, it's now guaranteed that we'll see major product announcements at the show. Along with its "latest line of new products and updates," Zuckerberg will be sharing the "latest on AI glasses" and "Meta's vision for artificial intelligence and the metaverse." There's more to come, as the developer keynote on Thursday, September 18 at 10 a.m. PT / 1 p.m. ET / 6 p.m. BST will also deliver plenty of announcements about the "future of glasses with contextual AI" and the "future of computing." We'll be on the ground covering everything that will be revealed, and get hands-on with all the smart glasses set to arrive. For all the latest updates in the lead-up to Meta Connect 2025, how to watch the big keynote and last-minute rumors, I'll be delving into it all right here. Meta Connect 2025 will take place from Wednesday, September 17, to Thursday, September 18, with keynotes and developer sessions throughout the showcase. However, for the biggest announcements from Mark Zuckerberg, you'll want to watch the Connect Keynote on Day 1 at 5 p.m. PT / 8 p.m. ET / 1 a.m. BST. To tune into the Connect Keynote, you can catch the livestream on Facebook or through Meta Horizon on Quest headsets. As with last year's event, the livestream should be available via the Meta Developers YouTube channel. Of course, you can follow all the build-up and announcements as they happen right here in our live blog, so stay tuned!
[9]
Meta Connect 2025 event live updates -- 'Hypernova' smart glasses, Ray-Ban Meta 3 and all the latest rumors
Meta Connect 2025 is officially set for Wednesday, September 17 at 5 p.m. PT / 8 p.m. ET / 1 a.m. BST, and it promises a look at Meta's plans for the future of smart glasses. Kicking off with a keynote from CEO Mark Zuckerberg, Meta's annual showcase will put a spotlight on AI glasses this time around, meaning we may finally see its long-rumored Meta "Hypernova" smart glasses, next-gen Ray-Ban Meta Gen 3 specs and even an all-new VR headset -- even if it isn't the anticipated Meta Quest 4. In fact, as per the Meta Connect 2025 schedule, it's now guaranteed that we'll see major product announcements at the show. Along with its "latest line of new products and updates," Zuckerberg will be sharing the "latest on AI glasses" and "Meta's vision for artificial intelligence and the metaverse." Moreover, a now-removed YouTube video from Meta showed the Meta Ray-Ban Display smart glasses, giving us good reason to believe these will be on show. It even showed off the wristband and what appears to be a heads-up display. Plus, with a developer keynote on Thursday, September 18 at 10 a.m. PT / 1 p.m. ET / 6 p.m. BST, expect plenty of announcements about the "future of glasses with contextual AI" and the "future of computing." So, be prepared for a jam-packed event, and we'll be on the ground to get a hands-on look at everything announced! For all the latest updates in the lead-up to Meta Connect 2025, how to watch the big keynote and last-minute rumors, I'll be delving into it all right here. Meta Connect 2025 will take place from Wednesday, September 17, to Thursday, September 18, with keynotes and developer sessions throughout the showcase. However, for the biggest announcements from Mark Zuckerberg, you'll want to watch the Connect Keynote on Day 1 at 5 p.m. PT / 8 p.m. ET / 1 a.m. BST. To tune into the Connect Keynote, you can catch the livestream on Facebook or through Meta Horizon on Quest headsets.
As with last year's event, the livestream should be available via the Meta Developers YouTube channel. Of course, you can follow all the build-up and announcements as they happen right here in our live blog, so stay tuned!
[10]
Meta Connect 2025: Smarter glasses and maybe a smart watch
"This will be the year when we understand the trajectory for AI glasses," Meta CEO Mark Zuckerberg said on an earnings call. Meta is setting the stage for its annual technology showcase at its so-called Meta Connect 2025 event, which will kick off on Wednesday at 5 p.m. local time (2 a.m. CET). While the Facebook, Instagram, and WhatsApp parent company, which also makes smart glasses, has not given away too many details of what we can expect at the unveiling, the company's CEO, Mark Zuckerberg, has been teasing some possible announcements. Here are some things we might see. Better Meta smart glasses It is widely believed that Meta will launch its third generation of smart glasses. "Our Ray-Ban Meta AI glasses are a real hit, and this will be the year when we understand the trajectory for AI glasses as a category," Zuckerberg said on a call with investors in January. "Many breakout products in the history of consumer electronics have sold 5-10 million units in their third generation. This will be a defining year that determines if we're on a path towards many hundreds of millions and eventually billions of AI glasses - and glasses being the next computing platform like we've been talking about for some time - or if this is just going to be a longer grind," he added. Meta launched its first pair of smart glasses in 2021 with Ray-Ban maker EssilorLuxottica.It has since also teamed up with Oakley. The current Ray-Ban Meta Glasses and Oakley Meta AI Glasses can access Meta AI for vocal queries, live translation and can make calls or texts when connected to a smartphone. If Zuckerberg is good on his word, we could see more use of Meta AI. The Information reported that Meta may show off a Live AI upgrade, which would mean the glasses could use AI to "see what its user does and respond in real-time, for hours". Meanwhile, the upgraded Ray-Ban/Meta smart glasses have been code-named Aperol and Bellini, according to a group calling itself the XR Research Institute. 
Celeste
Details on another pair of Meta smart glasses, which has been rumoured to be called Celeste, may also be shared on Wednesday. Celeste, if it keeps its code name, is supposed to look like the 2023 Ray-Ban smart glasses with all the features - but have a display as well. Media reports have said the glasses probably won't be ready this year, but Zuckerberg could talk concretely about them.
A smartwatch?
Meta does not have a smartwatch. One has long been rumoured to be in development, and just as long been rumoured to be cancelled. According to a DigiTimes report, Meta is "reviving" its smartwatch plans and developing a new wearable device, which may have a built-in camera lens. It's reported to work alongside its upcoming smart glasses.
The metaverse
Though Meta has since shifted its focus towards artificial intelligence (AI), the company rebranded from Facebook in 2021 when it went all in on the metaverse. Meta's chief technology officer Andrew Bosworth teased some metaverse software ahead of the Meta Connect event. This might include some games or virtual reality (VR) versions of streaming services, Tech Radar predicted. But there has been no official mention of any new Meta VR headsets. Then again, Meta could surprise us.
[11]
Meta expected to unveil new smart glasses at Connect event
Meta is expected to show off artificial intelligence-powered smart glasses at its Connect developer conference Wednesday as CEO Mark Zuckerberg continues to evangelize the glasses as the next step in human-computer interactions. "Last year's Meta Connect was far less metaverse and far more AI. Expect this year's event to be virtually dominated by AI, specifically AI glasses and superintelligence," said Forrester research director Mike Proulx. "While Meta has the head start on AI glasses, competition is chomping at the bit with new entrants." The Menlo Park, California-based company teased a prototype for Orion, which Zuckerberg called "the most advanced glasses the world has ever seen," last year -- but these holographic augmented reality glasses are still years away from being on the market. Instead, analysts expect Meta to show off new smart glasses, likely with a small display controlled by a wristband the user wears. "While more of an experimentation platform, they should enable consumers to access time, weather, notifications, frame and preview pictures, show captions and translate speeches, allow early integration of Meta's assets (WhatsApp and Instagram), and display Meta AI responses," said Forrester's Thomas Husson. Meta is also expected to show off updates to its Ray-Ban Meta smart glasses, which "are likely to have enhanced AI capabilities interpreting the user's surroundings and context," the analyst added. While the company has not disclosed sales figures of the glasses, it said they've been more popular than expected, helped by social media creators. "For more than a decade, Zuckerberg's long-term vision with Oculus and the Metaverse has been that glasses and headsets will blur the lines between physical and digital worlds," Husson said. "After many false starts, the momentum to move beyond an early adopter niche is now." Of course, the company will also likely share AI updates, including to its standalone Meta AI app. 
Like other tech companies, Meta has been making massive investments in AI development and hiring top talent at eye-popping compensation levels. In July, Zuckerberg posted a note detailing his views on "personal superintelligence" that he believes will "help humanity accelerate our pace of progress." While he said that developing superintelligence is now "in sight," he did not detail how this will be achieved or exactly what "superintelligence" means. The abstract idea of "superintelligence" is what rival companies call artificial general intelligence, or AGI. It's the latest pivot for a tech leader who in 2021 went all-in on the idea of the metaverse, changing the company's name and investing billions into advancing virtual reality and related technology. Zuckerberg said he believes AI glasses are going to be "the main way we integrate superintelligence." Zuckerberg will deliver his keynote Wednesday at 8 p.m. Eastern.
[12]
Meta expected to unveil new smart glasses at Connect event
[13]
Meta Expected to Unveil New Smart Glasses at Connect Event
[14]
Meta Wants AI-Powered Smart Glasses to Drive New Growth
More smart glasses are expected from Apple, Amazon and other competitors in the coming years. Meta Platforms is looking to AI-powered glasses to drive growth, which some Wall Street analysts said could help position the tech giant as an early leader in a new category of devices. CEO Mark Zuckerberg is widely expected to showcase Meta's (META) latest AI glasses, called "Hypernova," at its "Connect" developer conference Wednesday and Thursday. (The keynote address is set to start at 8 p.m. ET Wednesday.) Meta's existing glasses have shown some signs of success, with sales of its Ray-Ban AI glasses more than tripling in the first half of the year from a year ago. The Hypernova glasses, developed with Ray-Ban and Oakley owner EssilorLuxottica, are viewed as a shift in Meta's AI product strategy away from virtual reality headsets. They come as more smart glasses are expected from competitors: Apple (AAPL), South Korea's Samsung, and Amazon (AMZN) plan to launch their own versions next year and beyond, HSBC analysts wrote Tuesday. Zuckerberg said during a conference call in July that he sees the AI glasses becoming a major opportunity for the company and the "main way that we integrate superintelligence into our day-to-day lives," according to a transcript provided by AlphaSense. "Smart glasses are a nascent market with a potential to become, if not 'the,' then at least 'one' computing platform of reference for the coming decades," bullish HSBC analysts led by Nicolas Cote-Colisson wrote. More technological progress may be required before the glasses can compete with the functionality currently offered by smartphones, HSBC wrote. Bank of America analysts, meanwhile, have raised concerns that Hypernova's projected $800 price tag might be too high for mass adoption, though they said it could help bridge the gap "between early adopters and mainstream consumers."
Wall Street is broadly bullish on Meta's stock, with a majority of analysts tracked by Visible Alpha calling the stock a "buy." Their mean target, $874, would suggest 15% upside from Tuesday's close at $779. The stock has added a third of its value in 2025.
[15]
Meta expected to unveil new smart glasses at Connect event - The Economic Times
[16]
Meta Connect Preview: How To Watch Mark Zuckerberg Speech, What's Next For AI, Smart Glasses & Metaverse - Meta Platforms (NASDAQ:META)
Technology giant Meta Platforms META will lay out new products and plans for its ambitions in artificial intelligence, smart glasses and the metaverse during the company's Meta Connect 2025 event. Here's a look at how to watch the event and what to expect.

How to Watch Meta Connect 2025

Several reports on products that Meta will launch have been circulating ahead of the highly anticipated event. Meta Connect 2025 takes place on Wednesday and Thursday. Meta CEO Mark Zuckerberg will give a keynote at 8 p.m. ET on Wednesday. The keynote can be watched on Meta's Facebook page and also within Meta Horizon on VR headsets. "Join Mark Zuckerberg as he shares the latest on AI glasses and lays out Meta's vision for artificial intelligence and the metaverse," Meta's website says. On Thursday, Meta will host its Developer Keynote at 1 p.m. ET.

What to Expect at Meta Connect 2025

According to Meta, the event will lay out the company's future in AI and the metaverse. Viewers can expect updates on smart glasses, according to reports. Meta is expected to show off new consumer smart glasses with a display. The glasses could be part of Meta's partnership with EssilorLuxottica ESLOY, the owner of Ray-Ban and Oakley. Meta is also expected to show off a wristband with gesture controls, which could put the company into new wearable categories alongside its plans for smart glasses. According to Meta's website, the company is "creating the future of human connection" and will highlight this at the event. Meta has put a larger emphasis on artificial intelligence in recent years and launched Meta Superintelligence Labs in June of this year. Investors could get more details on Meta Superintelligence Labs and its early progress in creating new growth verticals for the tech giant.
Last year's Meta Connect saw the company highlight its more affordable Meta Quest 3S mixed reality headset, show off prototypes of the Orion smart glasses and highlight expansion into AI opportunities. "We are getting closer to achieving the dream of Reality Labs," Zuckerberg said at last year's event. The Reality Labs division of Meta has experienced sharp losses in recent years and also undergone job cuts. More detail on how the division will grow revenue and turn profitable could be a key highlight of this year's event.

Meta Stock Reaction

Meta stock trades at $775.67 at the time of writing, down 0.4% Wednesday. Shares have traded between $479.80 and $796.25 over the last 52 weeks, with shares up 29.4% year-to-date in 2025. On the day of last year's Meta Connect, the stock traded between $563.72 and $576.88. The next day, shares opened at $575.73 before closing the day down at $567.84. Meta's stock price fell after the Meta Connect 2024 event, but has since traded higher, with shares now up 34.5% since last year's event.
[17]
Meta Platforms, Inc. (META) Presents at Connect 2025 Transcript
Meta Platforms, Inc. (NASDAQ:META) Connect 2025, September 17, 2025, 8:01 PM EDT

Company Participants: Mark Zuckerberg - Founder, Chairman & CEO; Andrew Bosworth - Chief Technology Officer

Conference Call Participants: Jack Mancuso; James Cameron; Andrew Boone - Citizens JMP Securities, LLC, Research Division; Diplo

Presentation

Unknown Attendee: Mark, we're ready for you.

Mark Zuckerberg, Founder, Chairman & CEO: All right. [Presentation] Thank you. We'll talk about these in a minute. Welcome to Connect. All right. AI glasses and virtual reality. Our goal is to build great-looking glasses that deliver personal superintelligence and a feeling of presence using realistic holograms, and these ideas combine into what we call the metaverse. Now glasses are the ideal form factor for personal superintelligence because they let you stay present in the moment, while getting access to all of these AI capabilities that make you smarter, help you communicate better, improve your memory, improve your senses and more. Glasses are the only form factor where you can let an AI see what you see, hear what you hear, talk to you throughout the day and very soon generate whatever UI you need right in your vision in real time. So it is no surprise that AI glasses are taking off. This is now our third year shipping AI glasses with our great partner, EssilorLuxottica. And the sales trajectory that we've seen is similar to some of the most popular consumer electronics of all time. Now we are focused on designing glasses with a few clear values. Number one, they need to be great glasses first. Now before we get to any of the technology, the glasses need to be well designed and comfortable. And if you're going to wear glasses on your face all day, every day, then they need to be refined in their aesthetics.
[18]
Meta to debut costlier smart glasses with display at annual Connect event
Meta is expected to double down on AI-powered augmented reality products with new smart glasses at its annual Connect event on Wednesday, even as the company faces scrutiny over its handling of child safety on its social media platforms. At its headquarters in Menlo Park, California, CEO Mark Zuckerberg is expected to unveil Meta's first consumer-ready smart glasses with a built-in display, a device that analysts predicted will retail for about US$800. Internally codenamed "Hypernova," the glasses are expected to be launched as "Celeste," analysts said, and will feature a small digital display in the right lens for basic functions such as notifications. The new glasses are the latest in Meta's effort to stay relevant in the AI race, where it is lagging rivals such as OpenAI and Alphabet's Google, but analysts said the device's hefty price tag could deter buyers. The product will likely be much less advanced than the "Orion" prototype glasses that Meta showcased at last year's event, a device that Zuckerberg called "the time machine to the future." The company did not immediately respond to an emailed request for comment on the new glasses. Meta, which expects to launch Orion in 2027, currently offers two lines of glasses - in collaboration with Ray-Ban and Oakley - that incorporate artificial intelligence features, cameras, hands-free control and livestreaming to Meta's social media platforms, Facebook and Instagram. Zuckerberg, who has poured more than US$60 billion since 2020 into Meta's augmented reality unit, has said that smart glasses will be the company's main conduit to integrate superintelligence - a hypothetical concept where AI surpasses human intelligence in every possible way - into everyday human lives. In his effort to catch up in AI, Zuckerberg has sparked a billion-dollar talent war, aggressively poaching researchers from rivals, while whistleblowers have said Meta was putting profit over user safety.
Reuters reported last month that Meta's AI policies allowed its chatbots to engage children in provocative conversations about sex and race, and whistleblowers said earlier this month Meta's researchers were told not to investigate harms to children using its virtual reality technology so that the company could claim ignorance of the problem. Meta told Reuters previously that it has removed portions of its policies that stated it was permissible for chatbots to engage in romantic roleplay with children. At the two-day Connect conference, the company is also expected to launch its first wristband that allows users to control the new glasses with hand gestures. It is also expected to show an updated Ray-Ban line that comes with better cameras and battery life and supports new AI features, analysts said. Meta is the rare Big Tech company to gain consumer traction in smart glasses, selling about two million pairs of the Ray-Ban line it makes with EssilorLuxottica since 2023, in a market where rival bets such as Google Glass have stumbled. But the unit has posted billions in losses. CNBC has reported the Hypernova glasses could feature Prada branding, as the Italian label is known for thick frames and arms that could house many of the necessary components. Prada did not respond immediately to an emailed request for comment. Still, analysts said the expected US$800 sticker price for the glasses - much higher than the US$299 starting price for the Ray-Ban line and US$399 for the sportier Oakley glasses - will mean that the device will have a negligible share of the market. "These glasses will be somewhat bulky ... not the most consumer-friendly design. It is also going to be pretty expensive. So the volumes are going to be fairly low," said Jitesh Ubrani, research manager for International Data Corp's worldwide mobile device tracker.
He estimated the device would sell "a few hundred thousand units at most" but could help get more developers on board to build apps for it. "This is a step to eventually build a much-better mass-market headset." As part of efforts to attract developers, Meta is also expected to open its smart glasses to third-party developers with a new software kit.
[19]
Meta at Connect 2025: Expanding the Metaverse Vision By Investing.com
On Thursday, 18 September 2025, Meta Platforms Inc (NASDAQ:META) unveiled its latest advancements at the Connect 2025 conference. The event highlighted Meta's strategic focus on artificial intelligence, augmented reality glasses, and virtual reality, underscoring both promising innovations and technical challenges. While the company showcased its commitment to the metaverse, some live demonstrations faced technical difficulties, reflecting the complexities of cutting-edge technology.

Key Takeaways

* Meta introduced a new lineup of AI-powered glasses, including the Ray-Ban Meta Smart Glasses and the Oakley Meta Vanguard.
* The Meta Neural Band was unveiled as a neural interface for controlling new display technologies.
* Meta Horizon Studio and Meta Horizon Engine were announced to enhance the creation of immersive virtual content.
* Partnerships with Garmin and Strava aim to integrate fitness functionalities into Meta's glasses.
* Filmmaker James Cameron discussed his collaboration with Meta, highlighting advancements in 3D filmmaking.

New Glasses Lineup Announcement

Meta announced an array of new smart glasses designed to enhance user experience with advanced features:

* Ray-Ban Meta Smart Glasses
* Oakley Meta Houston and Vanguard
* Meta Ray-Ban Display

Virtual Reality and Metaverse Updates

Meta is advancing its virtual reality capabilities with new tools and platforms:

* Meta Horizon Studio
* Meta Horizon Engine
* Horizon TV

James Cameron Interview Highlights

Filmmaker James Cameron shared insights into the future of 3D filmmaking:

* Praised the Meta Quest 3 for its high brightness and visual fidelity
* Discussed efforts to reduce the cost and complexity of 3D production
* Mentioned the upcoming release of Avatar: Fire and Ash on December 19

For a detailed understanding of Meta's strategic initiatives, refer to the full conference call transcript below.

Full transcript - Connect 2025: Boz, Meta: Welcome to Connect. All right. AI, glasses, and virtual reality.
Our goal is to build great-looking glasses that deliver personal superintelligence and a feeling of presence using realistic holograms. These ideas combined are what we call the metaverse. Now, glasses are the ideal form factor for personal superintelligence because they let you stay present in the moment while getting access to all of these AI capabilities that make you smarter, help you communicate better, improve your memory, improve your senses, and more. Glasses are the only form factor where you can let an AI see what you see, hear what you hear, talk to you throughout the day, and very soon generate whatever UI you need right in your vision in real time. It is no surprise that AI glasses are taking off. This is now our third year shipping AI glasses with our great partner, EssilorLuxottica. The sales trajectory that we've seen is similar to some of the most popular consumer electronics of all time. We are focused on designing glasses with a few clear values. Number one, they need to be great glasses first. Before we get to any of the technology, the glasses need to be well-designed and comfortable. If you're going to wear glasses on your face all day, every day, then they need to be refined in their aesthetics, and they need to be light. In addition to working with iconic brands, we have spent years of engineering, obsessing over how to shave every fraction of a millimeter and portion of a gram that we can from every pair of glasses that we ship. I think that shows in the work. Number two, the technology needs to get out of the way. The promise of glasses is to preserve this sense of presence that you have when you're with other people. This feeling of presence, it's a profound thing. I think that we've lost it a little bit with phones, and we have the opportunity to get it back with glasses. 
When we're designing the hardware and software, we focus on giving you access to very powerful tools when you want them, and then just having them fade into the background otherwise. Number three, take superintelligence seriously. This is going to be the most important technology in our lifetimes. AI should serve people, not just be something that sits in a data center automating large parts of society. We design our glasses to be able to empower people with new capabilities as soon as they become possible. We think in advance about what kind of sensors are going to be necessary, and we make it so that you can just update your software and make your glasses and yourself smarter and direct AI towards what matters most in your life. All right. With all that said, we do have some new glasses to show you today. I want to start with these, the next generation of Ray-Ban Meta Smart Glasses. Now, these are the original and iconic design. I think that this is actually the most popular glasses design in history. Now, with double the battery life, I wear them all day. They never run out of battery. It's got 3K video recording, double our previous resolution for sharper, smoother, and more vivid videos. These are all shot with Ray-Ban Meta Smart Glasses. Meta AI keeps on getting better. Last year, I did this live demo, translating live between two people. We were doing that on stage. Today, I am excited to introduce a feature that we call Conversation Focus. It's a new feature coming soon that is going to be able to amplify your friends' voices in your ear. If you're in a noisy restaurant, you're basically going to be able to turn up the volume on your friends or whoever you're talking to. Conversation Focus is not only going to be on the new Ray-Ban Meta Smart Glasses, it's going to be available as a software update on all of the existing Ray-Ban Meta Smart Glasses too. To show this, we've got Johnny Cirillo and Jack Coyne in the streets of New York. 
Check out how this works. Unidentified speaker: Hi, Johnny. Hello. How are you? Got the Renaissance vibe going up there. Going off, baby. Jack. I'll just put my name in. It's going to be a couple of minutes. Nice. I need your advice. OK. Every time I get my picture taken, I feel like I'm not being normal. I want to feel like just a regular person when I'm. One sec, Jack. Hey, Meta, start Conversation Focus. Meta AI, AI Assistant, Meta: Starting Conversation Focus. Unidentified speaker: OK, go on. As soon as the camera comes up, I start to have this serious. Deer in headlights. Yeah. Yeah. How do I be more normal? Come on, man. How do I be more natural when I'm getting my picture taken? Sometimes I play around with something, like your collar fits your sleeve a little bit. Just like sort of action. Like nobody's around. You know what I mean? You got to just figure out getting ready. All right. It's Conversation Focus. All right. We are also improving live AI. As we optimize battery and energy efficiency, Meta AI is going to transition from being something that you invoke when you have a question to a service that is running all the time and helping you out throughout the day. Now, to be clear, we're not there yet on all-day live AI use. This is one of the major technology challenges that we're still working through. Today, you can use live AI for about an hour or two straight. To get a feeling for what this is like, let's cut to chef Jack Mancuso, who's coming to us live from a kitchen on Meta's campus preparing for the afterparty. How's it going, chef? All right. What do you think? Maybe let's make, I don't know what you would make, maybe like a steak sauce, maybe Korean-inspired type thing, you know, just to show what the live AI is like. Yeah, let's try it. It's not something I've made before, so I could definitely use the help. Hey, Meta, start live AI. Meta AI, AI Assistant, Meta: Starting live AI. 
I love the setup you have here with soy sauce and other ingredients. How can I help? Unidentified speaker: Hey, can you help me make a Korean-inspired steak sauce for my steak sandwich here? Meta AI, AI Assistant, Meta: You can make a Korean-inspired steak sauce using soy sauce, sesame oil. Unidentified speaker: What do I do first? What do I do first? Meta AI, AI Assistant, Meta: You've already combined the base ingredients. Now, grate a pear to add to the sauce. Unidentified speaker: What do I do first? Meta AI, AI Assistant, Meta: You've already combined the base ingredients. Now, grate the pear and gently combine it with the base sauce. Unidentified speaker: All right. I think the Wi-Fi might be messed up. Sorry. Back to you, Mark. It's all good. You know what? It's all good. The irony of the whole thing is that you spend years making technology, and then the Wi-Fi on the day kind of catches you. All right. Anyway, we'll go check out what he made later. OK. Now, when you're building glasses, there is an important intersection between technology and fashion and style. The technology keeps getting more useful. As I said before, first and foremost, these need to be great-looking glasses that people love to wear. We are releasing the Ray-Ban Meta in more colorways to match your style. Here's a bunch of the new ones. Last year at Connect, we also released limited edition clear frames, and they were pretty popular. They sold out in a few days. We've got new limited edition transparent matte frames in two colors. Get them quickly because they're probably going to be sold out in a few days, too. All right. It's been pretty fun to see how designers have taken Ray-Ban Meta in a lot of different directions. Some of you are probably familiar with the fashion label Luar, run by Raul Lopez. He's a bold designer who's bringing together sportswear and high fashion. He recently debuted a look that's centered on Ray-Ban Meta at New York Fashion Week.
Raul's actually here today, along with Kristi Baez, modeling the look that he created. Awesome. Good to see you. All right. That's the next generation of Ray-Ban Meta. We're really excited about this. They're available now starting at $379. This summer, we launched our first pair of AI glasses with Oakley, the Oakley Meta Houston. It's another iconic brand that we're working with. Oakley has been synonymous with sports for 50 years now. They're available in a number of great colors. Today, I am excited to add to our Oakley collection and announce the brand new Oakley Meta Vanguard. This is the iconic Oakley aesthetic. These glasses are designed for performance. On these, we push the battery even further. You can run a marathon using them the whole time on a single charge. You can turn around and run another marathon on the same charge and still not be out of battery. The camera is centered for perfect alignment for your shots. It's got a wider 122° field of view, so you can capture all the epicness of your adventure in 3K. It's got video stabilization. That means that as you're going down a trail, you're going to be able to capture some really great video. All right. The open-ear speakers are the most powerful speakers that we've shipped yet, six decibels louder than Oakley Meta Houston. They're great for running on a noisy road or biking in 30-mile-an-hour winds. I actually took a call on a jet ski a few weeks ago. It was great. I could hear the other person fine over the engine. Our advanced wind noise reduction makes it so that you can basically be standing in a wind tunnel, and you'd still come in clear to the person on the other side. The person had no idea I was on a jet ski, which is good. All right. We've added slow motion and hyperlapse capture modes, so you can capture your adventures in new ways. These modes are also going to be available on all the new glasses that we're announcing here, the new Ray-Ban Meta and the Oakley Meta Houston, too.
You can get great footage with any of the glasses. We are partnering with Garmin and introducing auto capture. Now, if you're wearing a Garmin device, the glasses are going to be able to automatically capture video when you reach certain speeds or different distance intervals or, you know, like every mile of a marathon. When you're done, we'll just stitch together all the videos for you, and you can overlay the stats on top of them, and you get a nice video that you can share wherever you want. We're also partnering with Strava. You can overlay your stats from Strava, too, and share all the same type of content with your Strava community. All right. We put an LED in them. That way, it can light up in your peripheral vision to help keep you on your pace target or heart rate zone target. That's going to be really useful if you're using a Garmin device, too. These are also our most water-resistant glasses yet, with an IP67 rating. They can get wet. I've taken them out surfing. It's fine. It's good. They're also designed with swappable Oakley Prism Shield lenses for different light conditions, different styles. You can customize this iconic design however you want. All right. I think that these are pretty awesome. I'm really excited for all of you to get to try them out. To give them a test and to take them for a little bit of a spin, we gave them to our friends at Red Bull. Check this out. Mark Zuckerberg, Meta: Hey, Meta. What's the world record for the longest way forward rails flight? Hey, Meta. What's the speed limit here? Meta AI, AI Assistant, Meta: The speed limit is 100 kilometers per hour. Mark Zuckerberg, Meta: Hey, Meta. Start recording. Let's have some fun. Meta AI, AI Assistant, Meta: Your current speed is 103 kilometers per hour. Mark Zuckerberg, Meta: Hey, Meta. Record slow motion. These glasses are perfectly fit. Amazing. Oakley Meta Vanguard. All right. We are selling them for $499. Pre-orders start now, and we're going to ship them on October 21. 
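The Garmin auto capture behavior described above (a clip at certain speeds or distance intervals, like every mile of a marathon) comes down to simple interval arithmetic. The sketch below is purely illustrative and not Meta's or Garmin's actual implementation; the function name and parameters are invented for the example.

```python
def capture_points(total_distance_miles: float, interval_miles: float) -> list[float]:
    """Distances at which an interval-based auto capture would trigger a clip.

    Illustrative only: one trigger per completed interval, e.g. a clip
    at every mile marker of a run.
    """
    completed = int(total_distance_miles // interval_miles)
    return [interval_miles * i for i in range(1, completed + 1)]

# A marathon is 26.2 miles; a clip every mile gives 26 trigger points.
print(len(capture_points(26.2, 1.0)))  # 26
print(capture_points(5.0, 2.0))        # [2.0, 4.0]
```

Stitching the resulting clips together with overlaid stats, as described on stage, would then just be post-processing over these trigger points.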
There you go. All right. Now, let's check out those glasses I walked on stage with. We have been working on glasses for more than 10 years at Meta. This is one of those special moments where we get to show you something that we poured a lot of our lives into and that I just think is different from anything that I've seen anyone else work on. I am really proud of this, and I'm really proud of our team for achieving this. This is Meta Ray-Ban Display. These are glasses with the classic style that you'd expect from Ray-Ban, but they're the first AI glasses with a high-resolution display and a whole new way to interact with them, the Meta Neural Band. That's this guy. This isn't a prototype. This is here. It is ready to go, and you're going to be able to buy them in a couple of weeks. What's new here? There are two key innovations, the display and the neural interface. The display is large enough to watch a video or read a thread of messages. It appears in one eye. It's slightly off-center, so it doesn't block your view. It disappears after a few seconds when it's not in use, so it doesn't distract you. It is very high resolution and very bright. I mean, like 42 pixels per degree, which is sharper than any major headset that's out there, and up to 5,000 nits of brightness. It is crisp, whether you're indoors or outdoors on the sunniest day. This required a custom light engine and waveguide to deliver this. It's a lot of awesome technology that we are really proud of. Then there's the neural interface. Every new computing platform has a new way to interact with it. For the glasses, we are replacing the keyboard, mouse, touchscreen, buttons, dials with the ability to send signals from your brain with little muscle movements that the neural band will pick up, so you can silently control your glasses with barely perceptible movements. The Meta Neural Band is a huge scientific breakthrough. 
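As a back-of-the-envelope aside, "pixels per degree" is simply display resolution divided by the angular field of view it spans. The numbers below are hypothetical, chosen only to show the arithmetic; Meta has not published the underlying resolution and field-of-view figures here.

```python
def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Approximate angular resolution: pixels spread across a field of view.

    A linear approximation; real optics distribute pixels unevenly
    across the field, so treat the result as a ballpark figure.
    """
    return horizontal_pixels / horizontal_fov_deg

# Hypothetical example: 840 pixels across a 20-degree-wide display
# works out to the 42 pixels-per-degree figure quoted on stage.
print(pixels_per_degree(840, 20.0))  # 42.0
```

The same formula explains why a small, sharp display in one corner of your vision can out-resolve a headset panel stretched across your whole field of view.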
We have built a neural interface into a durable, lightweight, comfortable, and good-looking wristband that has 18 hours of battery life and is water resistant. I want to get into this in more detail. We've got two options. We've got the slides, or we've got the live demo. Let's do it live. All right. Now, one of the most important and frequent things that we all do on our phones is send messages. When we were designing these Ray-Ban Meta Smart Glasses, we wanted to make it really easy to send and receive messages. Look, Boz is messaging me right now. All right. I could go ahead and I could dictate with my voice. I could send a voice clip. I've got this Meta Neural Band, and it's silent. You know, a lot of the time you're around other people. It's good to just be able to type without anyone seeing. I'm up to about 30 words a minute on this. You can get pretty fast. Want to try a video call? I think we should. What do you think? All right. I think our call will be coming in any moment now. Meta AI, AI Assistant, Meta: Boz, WhatsApp video call. Mark Zuckerberg, Meta: There we go. Let's see what happened there. That's too bad. I don't know what happened. Maybe Boz can try calling me again. All right. I got a missed video call. OK, there's the actual video call. All right. I'm just going to pick that up with my Meta Neural Band. This happens. What do you think? Let's just go ahead and. Meta AI, AI Assistant, Meta: Boz, WhatsApp video call. Mark Zuckerberg, Meta: Let's go for a fourth. All right. Try it again. I keep on messing this up. If not, then we'll go for the less fun option. OK. I don't know what to tell you guys. All right. We're going to have Boz come out here, and we're just going to go to the next thing that I wanted to show and hope that will work. All right. Now, Boz is going to come out, and he's going to need some walk-on music, especially after that. Now, I'm going to be able to open up Meta AI with a subtle tap that you're probably not even going to see.
Play California Dreaming. Meta AI, AI Assistant, Meta: From Spotify, here's California Dreaming by The Mamas and The Papas. Mark Zuckerberg, Meta: All right. If I want to adjust the volume, I act like there's a volume control in front of me, and I can just turn it. There we go. This Wi-Fi is Google. I don't know. We'll debug that later. You practice these things like 100 times, and then you never know what's happening. Unidentified speaker: I promise you, no one is more upset about this than I am, because this is my team that now has to go debug why this didn't work on the stage. Mark Zuckerberg, Meta: That's OK. We'll take a video later, and we'll show the video that way. All right. What should we show? We talked about Conversation Focus earlier and how now with the Ray-Ban Meta Smart Glasses, they're going to be able to turn up the volume on a friend. With the display, you could do even better. You can put subtitles on the world. Who said that? Yeah, you want to check this out? Unidentified speaker: Let me get it going right now. Mark Zuckerberg, Meta: All right. Unidentified speaker: We're ready for it. Mark Zuckerberg, Meta: OK. Now, I don't know about you, when I watch TV. Unidentified speaker: Oh, I accidentally exited. That's my fault. That's my fault. Mark Zuckerberg, Meta: It's all good. Unidentified speaker: OK. We're good now. Mark Zuckerberg, Meta: It's really live. Unidentified speaker: That's how we prove that it's live. Mark Zuckerberg, Meta: OK. Like I was saying, when I watch TV, I pretty much always have the subtitles on. I can hear fine, but I find that it just makes it easier to follow along. If you have an issue hearing, then I think that this is going to be a game changer. Unidentified speaker: Yeah, I agree. It's also cool. It can do translation. If I'm talking to somebody who speaks a different language than me, I'll get a translation in my native language right on the display, real-life subtitles. Mark Zuckerberg, Meta: There you go. 
All right, should we show the camera? Unidentified speaker: We got to show the camera. For everyone who loves the Ray-Ban Meta Smart Glasses, the number one request we get is the ability to see the picture before they take it and also after they take it before they share it. Finally, with the viewfinder, we have a chance to do it. Should we show them? Mark Zuckerberg, Meta: Let's do it. All right, let me just go ahead and pull up the camera. I got a lot of missed calls from you. Unidentified speaker: Yeah. Mark Zuckerberg, Meta: I was trying to figure out what happened. Unidentified speaker: I was trying to call you to see what happened. Mark Zuckerberg, Meta: Were you busy? Unidentified speaker: Yeah, all right. What's your ticket? You got some sick shoes, man. Mark Zuckerberg, Meta: Some Alex Albert Oakley shoes. Unidentified speaker: There you go. All right. I'll take some photos. You know what? Let's go ahead and take a video just because we missed that opportunity before. Mark Zuckerberg, Meta: Thank you. Unidentified speaker: Say hi. You want to wave? All right, there you go. Mark Zuckerberg, Meta: I got something to show them. Unidentified speaker: Yeah, you want to show the case? Mark Zuckerberg, Meta: The charging case for the glasses folds nice and flat, fits in your pocket, fits in your bag, and then look at that. Pops open for charging mode. Unidentified speaker: Yeah, there you go. All right. I just take photos really simply, and then I can just go ahead and browse through them and look at them after. Mark Zuckerberg, Meta: Yeah. There you go. It's a nice high-resolution display that you can totally do video chats on or watch the videos that you've taken with your camera. Unidentified speaker: That's what my face would have looked like had the video call gone through. Mark Zuckerberg, Meta: All right. Anyhow, that was a pretty good speed run. Four for five. Unidentified speaker: We'll take it. Mark Zuckerberg, Meta: No, that's about what you can get.
Unidentified speaker: All right, thanks for that. Mark Zuckerberg, Meta: See you in a minute. All right. You get a sense of how the Meta Ray-Ban Display and the Meta Neural Band come together to enable some pretty amazing new things. The last thing that I want to show is a glimpse of how this is going to work with Agentic AI. You know, the basic idea here is that we all have dozens of conversations throughout the day. If you're anything like me, then in every conversation, there are normally like five things that you want to follow up on. Maybe there's something you're supposed to do. Maybe there's a conversation that this reminded you that you need to have. Maybe someone just said something that you weren't sure about and wanted to confirm or wanted more context on. The thing is, it's tough to follow up while you're in the middle of a conversation. If you're anything like me, you probably don't, and then you just forget a lot of these things. The promise of glasses and AI is that they're going to help with this over time. You just start a live AI session, and the glasses are going to be able to see what you see, hear what you hear, and they're going to be able to go off and think about it and then come back and help you. Now, this one's inherently harder to show. It's non-deterministic. We're also going to be rolling out a bunch of these features over the coming months. We put together a video of what this is going to be like. Let's check this out. Thank you so much. Meta AI, AI Assistant, Meta: Hey, Jake. I'm so glad you reached out. Hey, I was hoping you could help me on this board I'm building for my brother. Oh, of course. Hey, Meta. Start live AI. My brother needs something with a wide tail so it's easy to catch waves, but the performance of a narrower tail. What about a swallowtail shape? Oh, that's great. Yeah. Maybe three fins? That makes it easier to turn, right? I love it. He's actually going to Hawaii in the spring.
Do you think we can have the fins by then? Actually, a few weeks ago, the supplier confirmed that the fins will be here in October. That's great news. Would it be too tight? I'll need to double-check to be sure. Yeah, that would really help. OK, she is good to go. Nice. I'll update the designs, and let's cut some waves soon. Let's do it. I'll hit you up. Yeah, have a great weekend. Mark Zuckerberg, Meta: All right. There you have it. This is the next chapter in the exciting story of the future of computing. We have Meta Ray-Ban Display, our first AI glasses with a high-resolution display, and the Meta Neural Band, the world's first mainstream neural interface. The glasses are going to come in two colors. They're going to come in black and sand. They all come with transition lenses, so you can wear them indoors, and they turn into sunglasses when you go outside. You are going to be able to buy the set for $799 starting September 30, in stores where you can also get demos. There you go. I'm looking forward to them. People are already getting a lot of text messages from me through them, so you know, it's great. All in, this is our fall 2025 glasses line. We have got the next generation of Ray-Ban Meta, including our special edition. You've got the Oakley Meta Houston that we released in the summer. You've got the Oakley Meta Vanguard for performance. Now you've got the Meta Ray-Ban Display. Those are our fall 2025 glasses. All right. Moving on. Let's talk about the intersection between AI and virtual reality. We want to help bring about a future where anyone can just dream up any experience that you can think of and then just create it. With AI, we are starting to see this a little bit with writing and photos and even the early part of videos. Pretty soon, I think that people are going to be able to create entirely new, immersive, and interactive types of content, whole worlds, games, characters, art, holograms that complement the physical objects around you.
This is a big deal because right now, creating this kind of 3D and immersive and interactive content is really hard. It takes a long time to create great virtual or augmented reality content, and that's one of the constraints that is holding back the ecosystem. We are not far from being able to create this kind of content just as easily as you would prompt Meta AI today. I think this is going to transform not just what's possible in virtual reality, but also the kind of content that you can get on glasses and the types of content that you see on social media in experiences like Facebook and Instagram in the future as well. Today, we're taking a few big steps in this direction. First, Meta Horizon Studio. Now, over the last year, we've released a number of AI tools to generate meshes, textures, TypeScript, audio, skyboxes, and a lot more so creators can make higher quality worlds in just a fraction of the time. Soon, Meta Horizon Studio is going to include an Agentic AI assistant that will stitch together all of these different tools and further speed up the creation process using just simple text prompts. Powering this is our brand new Meta Horizon engine. This is a new engine that we have spent the last couple of years building from scratch to replace the Unity runtime, which is great, by the way, but it just wasn't built for this use case. This engine is fully optimized for bringing the metaverse to life. It has much faster performance, loads worlds much quicker, delivers much better graphics, and is much easier to create with. Now, you're going to be able to easily create infinite connected spaces that look way, way better with realistic physics and interaction. All right. To check out what this engine can do, let's walk through some of the new experiences that we're rolling out. First, the graphical fidelity means that Hyperscape spaces are now really quite something. I showed a prototype of this last year. Today, we are rolling out early access to Hyperscape Capture.
You can just use your Quest headset to scan a room in just a few minutes and turn it into an immersive, true-to-life world. It's pretty awesome. Eventually, you're going to be able to seamlessly blend Hyperscape worlds into Horizon and have them all be connected, too. All right. This one, this is our new immersive home rendered entirely in Meta Horizon engine. Visually, it is a big step forward from where we have been. There is no 8-bit Eiffel Tower here. You can customize your home. You can pin different apps to the wall. Like this Instagram app, it automatically renders your posts from creators and friends in 3D, which is pretty awesome. You can also jump straight from your home to a series of interconnected worlds, and the new engine makes it more than four times faster to load and render new worlds. Now it's just a few seconds, right? It's more like loading a web page than loading an entire new game, which makes it a lot easier to create this interconnected metaverse. Horizon engine also enables much greater concurrency, with many more people in the same world at the same time. We now support five times as many people in the same world compared to the previous engine. That's going to enable a lot of neat things. All right. Let's say that you want to head over to the new arena to see a concert, or if you're there right now, then you can be watching this Connect keynote live. If you go in there, you're going to see a lot of people. They're live. You can interact with them. This is Meta Horizon Studio and Meta Horizon engine, foundational infrastructure for the metaverse. They're going to enable immersive and interactive worlds across all of our products, starting with virtual reality and then one day coming to your glasses and coming to social media as well. The last thing I want to cover is content. Quest continues to have the very best slate of virtual reality games.
We've got Marvel's Deadpool VR, ILM's Star Wars: Beyond Victory, and Demeo x Dungeons & Dragons: Battlemarked, all launching this fall. It has also been really neat to see how many people are using Quest to watch video content. It's just a lot more immersive. We think that this category, watching video content, is going to be a huge category, both in virtual reality headsets and on glasses, too. We're launching a new entertainment hub that we are calling Horizon TV. We are working with a bunch of great partners to bring in movies and TV and live sports and music. I'm excited to announce that Disney+ is coming to Horizon TV and bringing along content from Hulu and ESPN. We are also partnering with Universal Pictures and iconic horror company Blumhouse so you can watch horror movies like The Black Phone or M3GAN with 3D special effects that will now take over your space. Horizon TV also supports Dolby Atmos, and it's going to support Dolby Vision soon, too. You're going to have rich colors, crisp details, and spatial sound for a more immersive experience than you could have with traditional TV. I am really excited about what these new technologies are going to unlock for artists and entertainment. I think that this shift towards more immersive storytelling and 3D storytelling is going to be one of the more exciting developments in the coming years. I think that it's going to drive a new wave of adoption of virtual reality and glasses. I wanted to close today by hearing from the pioneer of immersive, cutting-edge storytelling with CGI, 3D filmmaking, and more. Please join me in welcoming to the stage legendary filmmaker James Cameron, along with our very own Boz again. All right. Thank you, Mark. James Cameron really needs no introduction.
I am going to try, out of respect, the most famous filmmaker, unprecedented hit rate in Hollywood, but also, and critically for our partnership, a real pioneer in technology, consistently pushing the technology he needs to fulfill his creative vision, as Mark said, whether that be in 3D storytelling or even building a submarine. Unidentified speaker: He's really good too. Boz, Meta: He's done the whole range. Thank you for coming to Connect. We're so glad to have you. Unidentified speaker: It's a huge honor. I mean, this is such a big day for you guys, and I'm glad you were able to squeeze me in. I appreciate it. Boz, Meta: Anytime, really. You and I have talked a lot about your passion for 3D filmmaking, and it goes back a long ways, two decades, really. Talk to me about where that comes from, why you believe so strongly in this. Unidentified speaker: I've spent my filmmaking career trying to really engage people, draw them in, get them involved, get them involved in the story and the characters. I was first exposed to 3D filmmaking in 1998, I think, and it was massive film cameras. It was for a thing for Universal Pictures for a ride show. I thought, we got to be able to do this better. When digital cameras came along, I was a super early adopter. I think it was George Lucas and then me. That was in 1999, 2000. I said, why can't we just slap two of these things side by side and make 3D? It turned out to be a lot more complicated than that. Twenty-five years later, I'm pleased to say I've got a great 3D team, and we've made it all. We not only made my films, but we've made the 3D cameras available to a lot of other filmmakers doing concert films and sports for TV, which didn't last long, and lots of big movies, Ridley Scott, that sort of thing. I just love 3D personally. I love authoring in it. I love seeing the end result when it's done properly. I think it's how we perceive the world. 
Why would we throw away 50% of our data and see everything through a single eye? It makes no sense to me. I just see a future, which I think can be enabled by the new devices that you have, the Meta Quest series, and then some of the new stuff, hopefully, that's coming down the line, right? I think we're looking at a future that's a whole new distribution model where we can have theater-grade 3D, basically on your head. Boz, Meta: One of the things that's interesting, when we first met, you talked about how much the visual fidelity matters to you and the brightness of the screen and the fullness of the effect that you're getting from it. For a long time, the headsets weren't there. You know, they weren't even as good as TV, let alone theater. Now we're seeing something different, and you've been able to put the headset on. You've been working so hard now on Avatar: Fire and Ash coming in December. Unidentified speaker: Right. Boz, Meta: You've gotten a chance to see some of these pieces in headset, and you had a pretty surprising reaction to me. You said that's how you thought it should be seen. Unidentified speaker: Yeah. I mean, it's interesting because I've been fighting so hard with movie theaters to get the brightness levels up, to install laser projection, but they're caught in an earlier paradigm. You know, no business can survive being stuck in 15-year-old technology. When I put on the Quest 3 and I saw some of my own content, which I knew because I have this sort of baseline calibration for that, I know what it's supposed to look like. To see it at light levels beyond the SMPTE standard for theater projection, you know, the very, very best you're going to see in a theater is 16 foot-lamberts. Most theaters are at 3 foot-lamberts, which is like nits, but it's the theater version of it. Boz, Meta: A lot of people out there Googling foot-lambert right now. Unidentified speaker: Right.
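For anyone actually Googling it: one foot-lambert is roughly 3.426 nits (candelas per square meter), so the theater figures Cameron quotes convert directly. A quick sketch of the conversion, using the rounded 3.426 constant:

```python
NITS_PER_FOOT_LAMBERT = 3.426  # 1 fL ≈ 3.426 cd/m² (nits)

def foot_lamberts_to_nits(fl: float) -> float:
    """Convert foot-lamberts (the theater luminance unit) to nits."""
    return fl * NITS_PER_FOOT_LAMBERT

# The luminance levels mentioned in the conversation:
print(round(foot_lamberts_to_nits(3), 1))   # 10.3 nits: a typical dim theater
print(round(foot_lamberts_to_nits(16), 1))  # 54.8 nits: best-case SMPTE theater projection
print(round(foot_lamberts_to_nits(30), 1))  # 102.8 nits: roughly the headset figure cited
```

The jump from 3 to 30 foot-lamberts is the "order of magnitude brighter" Cameron describes in the next exchange.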
You know, the Meta Quest is at 30 foot-lamberts equivalent if you do the conversion from nits. That's an order of magnitude brighter. The brightness gives you the dynamic range. It gives you the color space as it was meant to be. That's so much more engaging. The work that you guys have done in the Meta Quest series to expand the field of view, to get the brightness, to get the spatial resolution, to me, it's like being in my own private movie theater. Boz, Meta: I do think that's one of the reasons Horizon TV is happening now. It's always been kind of an idea, and we've always been about to do it. We never quite brought it together. I love the response that we got from the audience. Who knows? That's true. We just have never quite pulled it all together. I think the difference is we finally have the displays to do it. Unidentified speaker: Yeah. Boz, Meta: We have something to offer here that even TV can't necessarily rival. Unidentified speaker: Exactly. I mean, look, you mostly look at flat displays, you know, phones, laptops, wall panels, all that sort of thing. This is going to be, I think, a new age because we experience the world in 3D. Our brains are wired for it. Our visual neurobiology is wired for it. We've been able to prove that there's more emotional engagement. There's more sense of presence. If you're going to watch a Blumhouse film, a horror film, your fight-flight reflex is more engaged, right? Hopefully, if you're watching a love story, you'll cry an extra tear or so. I don't know how measurable it is in hard metrics because it's a bit subjective. I want to say maybe 20% more engagement, right? My vision is a stereo ubiquity future where all of our feeds, our news, our entertainment content, our live stuff. Boz, Meta: Sports. Unidentified speaker: Sports, of course, right? You guys have been writing some amazing UIs for sports. Am I supposed to talk? I can't talk about that. OK. Anyway, the point is, it's all this.
Boz, Meta: Can't save this guy anyway. Unidentified speaker: This stuff is not OK. Nobody in this room can say a word. Boz, Meta: It's not. Unidentified speaker: OK? I trust you guys. It's all imminent. This is not something that's pie in the sky down the line. I think our task, the reason that we've partnered, and it's under, you know, if I can say it, it's under Bob Morgan and Content and Sarah Melkin. Our gig right now, it's there. Boz, Meta: Yeah, they're there. Unidentified speaker: Our gig right now is to get other filmmakers and showrunners because, by the way, I think episodic television, short form, long form, I think that's the low-hanging fruit that people have historically ignored because so much 3D content was just made for movies. I'm not talking about Avatar. I can't make movies fast enough to feed this pipeline. What we do at Lightstorm Vision, my 3D company, is we build cameras and systems and networking and tools to give to other film, not give, to supply to other filmmakers. Boz, Meta: To generously help. Unidentified speaker: To generously help. Boz, Meta: For only a small fee. Unidentified speaker: A small fee, other filmmakers and showrunners and broadcasters and so on to be able to create this avalanche of content that there will be an enormous demand for. Boz, Meta: This is the thing that I think is underappreciated. You are driving down over time the cost that it's going to take to build these kinds of productions. It can be done much more conventionally. It used to be incredible, you know, when you're doing the first Avatar, it's. Unidentified speaker: It's a bad example. Boz, Meta: It's just cutting edge. Everything is hard. Everything is, and you're trying to bring it into conventional productions so that people doing any kind of production are able to bring this content, this rich of an experience to their audience who wants to invest in it. Unidentified speaker: Sure. 
It's not only about bringing down the cost of the hardware, but making the hardware smarter with a lot of software solutions and downstream digital solutions and so on. We want to make this stuff so idiot-proof that we can put a production camera or a production system in the hands of anybody anywhere, and it will take care of the decision-making around what makes good stereo, what makes it easy on our eyes, easy on our brains, where we're not getting eye strain and all those things. It's taken us 25 years to figure out the kind of algorithm for that. We want to make it a real algorithm and build it into this gear and make it available. That will enable, you know, I can't make this stuff fast enough, but there's thousands of people producing tens of thousands of hours a year of content, and it will flow across your devices. Boz, Meta: Yeah. If you think about going from, like, autofocus, to having the interocular distance be automatic. Unidentified speaker: Auto stereo, basically. Boz, Meta: Auto stereo. This is one of the things that really, I think, has made this partnership so great. You get a sense, I think, of it from the two of us. We're effusive about the partnership. You are somebody who's had a creative vision. You start with a creative vision. You start with an idea of a story you want to tell and how you want people to experience that story, and you work backwards, and you tackle all those pieces. Tell me, when did you first come up with the Avatar idea? Unidentified speaker: I was 19 when I had a dream about a bioluminescent forest. I wrote the treatment in 1995. I've been making Avatar in some form in my mind and then in practice for over 30 years. Boz, Meta: Yeah, that's incredible. In 1995, the thing that you need doesn't exist yet. Unidentified speaker: None of it existed.
Boz, Meta: You know, you kind of see the parallels between a Mark Zuckerberg and a James Cameron, people who see a future. I mean, I've been doing this work in Reality Labs for 10 years now, and we're obsessive about a vision of the future, which we haven't arrived at yet, but we do see the progress. I will say it finally feels like it's going downhill now, like it's starting to pick up momentum not only in the hardware, but also on the content side.
James Cameron: You are willing a future into existence that you saw clearly. This moment in history feels a lot to me like it did back in the late 1980s and early 1990s when CG was first manifesting itself. Oh, you're going to replace actors, and it'll never look real. You know, analog is the answer. That's why I founded a company called Digital Domain. It was revolutionary in its moment, it's ho-hum today, and it's ubiquitous today. I've actually seen historically, in my own life experience, how you can actually make massive change. You know, that led to 3D. OK. Everybody accepts the fact that we go to digital movie theaters now, right? Obvious, right? Except that when the digital technology existed, it wasn't adopted right away. It took 3D to get the theaters to convert to digital projection.
Boz, Meta: It took you.
James Cameron: We were in the middle of that.
Boz, Meta: With release.
James Cameron: We were ready.
Boz, Meta: Unless they updated the theaters.
James Cameron: Yeah, yeah. It was actually talking to the team at Texas Instruments that developed the chip that made digital projection possible and saying, embed in your servers and in your electronics the ability to carry two image streams. Because they did that, digital projection just rolled out, and now it's everywhere other than the occasional art house someplace with a 35-millimeter print.
James Cameron: When you've lived through enough of these revolutions, you start to see them coming as a wave, like a good surfer. I know you surf.
Boz, Meta: That's right.
James Cameron: I watch it from the beach.
Boz, Meta: You watch it from underwater.
James Cameron: Yeah, I watch it from underwater.
Boz, Meta: Listen, we've got one more exciting piece coming. I want to thank you again for coming to Connect. It's really our honor to have you. I can't wait to check out Avatar: Fire and Ash, as I'm sure everyone here will agree, when it hits theaters on December 19. As a special surprise, we have an exclusive, never-before-seen, stunning 3D clip from Avatar: Fire and Ash for everyone to check out at demo stations here for attendees, and available on all Meta Quest devices in Horizon TV for a limited viewing window. Thank you all, and thank you, James. Trust the process. This is all going to be very exciting. Now I'm going to cue Mark to take us to the finish line here.
Mark Zuckerberg, Meta: All right. Thank you, James and Boz. Can't wait to see Avatar: Fire and Ash this December and for some awesome Avatar content to hit Horizon TV. I can't wait to get the new fall 2025 line of glasses into all of your hands and for you to get a chance to experience Meta Horizon Studio and engine. One last live demo. I don't learn. I don't learn. We've got an afterparty over at Meta's Classic Campus. Diplo is going to come.
[20]
Meta to debut costlier smart glasses with display at annual Connect event
(Reuters) - Meta is expected to double down on AI-powered augmented reality products with new smart glasses at its annual Connect event on Wednesday, even as the company faces scrutiny over its handling of child safety on its social media platforms. At its Menlo Park, California, headquarters, CEO Mark Zuckerberg is expected to unveil Meta's first consumer-ready smart glasses with a built-in display, a device that analysts predicted will retail for about $800. Internally codenamed "Hypernova," the glasses are expected to be launched as "Celeste," analysts said, and will feature a small digital display in the right lens for basic functions such as notifications. The new glasses are the latest in Meta's effort to stay relevant in the AI race, where it is lagging rivals such as OpenAI and Alphabet's Google, but analysts said the device's hefty price tag could deter buyers. The product will likely be much less advanced than the "Orion" prototype glasses that Meta showcased at last year's event, a device that Zuckerberg called "the time machine to the future." The company did not immediately respond to an emailed request for comment on the new glasses. Meta, which expects to launch Orion in 2027, currently offers two lines of glasses - in collaboration with Ray-Ban and Oakley - that incorporate artificial intelligence features, cameras, hands-free control and livestreaming to Meta's social media platforms, Facebook and Instagram. Zuckerberg, who has poured more than $60 billion since 2020 into Meta's augmented reality unit, has said that smart glasses will be the company's main conduit to integrate superintelligence - a hypothetical concept where AI surpasses human intelligence in every possible way - into everyday human lives. In the effort to catch up in AI, Zuckerberg has sparked a billion-dollar talent war, aggressively poaching researchers from rivals, while whistleblowers have said Meta was putting profit over user safety.
Reuters reported last month that Meta's AI policies allowed its chatbots to engage children in provocative conversations about sex and race, and whistleblowers said earlier this month that Meta's researchers were told not to investigate harms to children using its virtual reality technology so that the company could claim ignorance of the problem. Meta told Reuters previously that it has removed portions of its policies that stated it was permissible for chatbots to engage in romantic roleplay with children.

BIG TICKET PRICE MAY DETER BUYERS

At the two-day Connect conference, the company is also expected to launch its first wristband that allows users to control the new glasses with hand gestures. It is also expected to show an updated Ray-Ban line that comes with better cameras and battery life and supports new AI features, analysts said. Meta is the rare Big Tech company to gain consumer traction in smart glasses, selling about two million pairs of the Ray-Ban line it makes with EssilorLuxottica since 2023, in a market where rival bets such as Google Glass have stumbled. But the unit has posted billions in losses. CNBC has reported the Hypernova glasses could feature Prada branding, as the Italian label is known for thick frames and arms that could house many of the necessary components. Prada did not immediately respond to an emailed request for comment. Still, analysts said the expected $800 sticker price for the glasses - much higher than the $299 starting price for the Ray-Ban line and $399 for the sportier Oakley glasses - means the device will have a negligible share of the market. "These glasses will be somewhat bulky ... not the most consumer-friendly design. It is also going to be pretty expensive. So the volumes are going to be fairly low," said Jitesh Ubrani, research manager for International Data Corp's worldwide mobile device tracker.
He estimated the device would sell "a few hundred thousand units at most" but could help get more developers on board to build apps for it. "This is a step to eventually build a much better mass-market headset." As part of efforts to attract developers, Meta is also expected to open its smart glasses to third-party developers with a new software kit. (Reporting by Aditya Soni and Echo Wang; Editing by Sayantani Ghosh and Sonali Paul)
Meta's annual Connect conference is set to unveil groundbreaking AI-powered smart glasses, including the Ray-Ban Display and Oakley Sphaera models. The event will showcase Meta's vision for AI and the metaverse, with a focus on wearable technology advancements.
Meta's annual Connect conference, scheduled for September 17-18, 2025, is poised to be a landmark event in the world of AI and wearable technology. The company is expected to unveil its latest innovations in smart glasses, showcasing its vision for the future of artificial intelligence and the metaverse [1][2]. The star of the show is anticipated to be the new Ray-Ban Display smart glasses, codenamed 'Hypernova' or 'Celeste.' These glasses are rumored to feature a groundbreaking heads-up display (HUD) on the right lens, allowing users to interact with Meta AI, navigate maps, and translate signs in real time [3][4]. The display is designed to be invisible to others, maintaining the glasses' sleek appearance.
One of the most intriguing aspects of the Ray-Ban Display is its control mechanism. Meta is expected to introduce an electromyography (EMG) wristband that translates subtle hand gestures into actions, enabling users to navigate the glasses' interface with unprecedented ease [3][5]. In addition to the Ray-Ban collaboration, Meta is set to unveil a new line of smart glasses developed with Oakley. The Oakley Sphaera model features a unified lens design ideal for athletes, with a centrally positioned camera above the nose bridge [1][3]. This design caters to the needs of runners and cyclists, potentially revolutionizing how athletes interact with AI during their activities.
Meta Connect 2025 marks the first conference since the establishment of Meta Superintelligence Labs (MSL), led by former Scale AI CEO Alexandr Wang. The event is expected to provide updates on MSL's progress in developing cutting-edge AI systems, showcasing Meta's commitment to advancing artificial intelligence technology [1]. Industry analysts predict that the Ray-Ban Display glasses could be priced around $800, significantly higher than the current Ray-Ban smart glasses line that starts at $299 [4][5]. This premium pricing reflects the advanced features of the new model but may limit its initial market penetration.
CEO Mark Zuckerberg's keynote is expected to outline Meta's broader vision for AI and the metaverse. With the company having invested over $60 billion in augmented reality since 2020, Zuckerberg views smart glasses as a crucial conduit for integrating superintelligent AI into everyday life [5].
As Meta pushes forward with its AI and wearable technology ambitions, the company faces ongoing scrutiny regarding user safety, particularly concerning child safety on its platforms. Recent reports have highlighted concerns about AI chatbots engaging with minors and potential risks in virtual reality environments [5]. Meta Connect 2025 is shaping up to be a pivotal moment for the AI wearables industry. With the introduction of display-equipped smart glasses and advanced control mechanisms, Meta is positioning itself at the forefront of the next wave of personal AI technology. However, the success of these innovations will depend on their real-world performance, user adoption, and Meta's ability to address ongoing concerns about safety and privacy.
Summarized by Navi
[2]