44 Sources
[1]
Meta Ray-Ban Display and everything else unveiled at Meta Connect 2025 | TechCrunch
At Meta Connect 2025, the company's biggest event of the year, Mark Zuckerberg unveiled three new smart glasses: the second-generation Ray-Ban Meta, the Meta Ray-Ban Display and wristband controller, and the Oakley Meta Vanguard. Meta says it has sold two million of the first-generation Ray-Ban Meta smart glasses, and earlier this year, Meta unveiled its latest AI-powered smart glasses with Oakley, which were designed for athletes. Silicon Valley is leaning heavily into AI wearables, and Meta seems to be one of the companies leading the charge.

With Meta looking to regain its footing in the AI race and sell more hardware, the company had a lot at stake during Mark Zuckerberg's Meta Connect 2025 keynote. Overall, Meta showcased some pretty impressive technology -- the Meta Neural Band, the wristband controller that comes with the Meta Ray-Ban Display, is a particular highlight. And yet, in a twist that felt reminiscent of HBO's Silicon Valley, Zuckerberg's demo of the AI capabilities on the Ray-Ban Metas failed. Whoops!

While sharing a live video feed of the cooking content creator Jack Mancuso at Meta HQ, Zuckerberg asked the chef to demonstrate how his Ray-Ban Meta glasses could help him whip up a Korean-inspired steak sauce. "I love the set-up you have here, with soy sauce and other ingredients. How can I help?" asked the chipper Meta AI voice. Mancuso asked for a recipe for a Korean-inspired steak sauce, and the AI voice began to list the ingredients that he would need -- but Mancuso, needing to keep the demo succinct, interrupted and asked, "What do I do first?" After a moment of silence that dragged a bit too long, Mancuso repeated, "What do I do first?" "You've already combined the base ingredients, so now grate a pear to add to the sauce," the AI said. But he had not yet combined the base ingredients, because he had not started making the recipe, hence the question of what to do first. 
Mancuso asked the same question again, and the AI gave the same response. The audience laughed. "I think the Wi-Fi might be messed up -- back to you Mark!" Mancuso said. "You know what? It's all good. The irony of the whole thing is that you spend years making the technology, and then the Wi-Fi of the day kind of... catches you," Zuckerberg said. "Anyway, we'll go check out what he made later." The whole interaction was a bit awkward, especially since the issue did not seem to be with the Wi-Fi. But even when things are going according to plan, these presentations can feel a bit hokey... like the end of the keynote, when Zuckerberg and Diplo quite literally ran into the sunset together, wearing their Meta Oakley Vanguards. It had to be a busy day for Mark, so maybe he just needed an excuse to build some cardio into his schedule. Meta unveiled the second generation of its Ray-Ban Meta glasses, which first debuted in 2023. This spruced-up model features double the battery life of its predecessor, now lasting up to eight hours of mixed use on one charge. The second-generation glasses also support ultra HD 3K video recording, which the company says is twice as sharp as the last model. Meta's smart glasses are also getting some new features with the release of the second-generation Ray-Ban Metas, like conversation focus, which will be available on the Ray-Ban Meta and Oakley Meta HSTN glasses. "If you're eating at a hot new restaurant, commuting on the train, or catching your favorite DJ's latest set, conversation focus uses your AI glasses' open-ear speakers to amplify the voice of the person you're talking to," Meta said in its press release. Conversation focus isn't out just yet, so we can't say for sure if it'll be any help for your next night out. The Live AI feature -- which Meta failed to properly demo on stage -- is also on its way. But it's so energy-intensive that you can only use it for about an hour or two. 
"As we make battery and energy efficiency optimizations, Meta AI will transition from something you prompt with a wake word to an always-available assistant," the company said. The second-generation Ray-Ban Meta glasses are priced at $379.

The Meta Ray-Ban Display smart glasses are the most impressive glasses that Meta has unveiled to date, featuring a built-in display for apps, alerts, and directions on the right lens. But what sets this pair of smart glasses apart is its accompanying wristband controller, the Meta Neural Band. This wristband lets Meta show off a bit of what it's been spending so much time (and money) on in its Reality Labs division, which is notorious for losing billions of dollars a quarter. Visually, the Meta Neural Band looks like a Fitbit without a screen. It's powered by surface electromyography (sEMG), which can pick up on minute hand gestures and small movements. This is far more sophisticated than a wrist gesture on an Apple Watch. Users can write out text messages by holding their fingers together as if they were gripping a pen and "writing" out the text. This means that you can see a WhatsApp message come in on your right glasses lens, then answer it by "writing" your response.

For now, the glasses support Meta apps, but the company will have to support a wide variety of apps in the future to get the kind of adoption it's looking for. Like Apple and Google, Meta is betting that smart glasses could cut into the market share of the smartphone in the future -- but it will be a big challenge to force such a massive cultural shift. The Meta Ray-Ban Display, which comes with the Meta Neural Band, will cost $799 and launch on September 30. For the bearish among us, it seems hard to imagine wearing smart glasses and sending text messages by handwriting in the air. But the Oakley Meta Vanguard smart glasses, which are designed for athletes, offer the most coherent use case yet for this kind of technology. 
Bikers, trail runners, and skiers can photograph their adventures without pulling out their phones; the glasses' open-ear speakers can play music during your workout, and even link with apps like Strava and Garmin to relay your stats. Like the other new glasses, the Vanguard model is also AI-enabled. Unlike other models of Meta smart glasses, the Oakley Meta Vanguards have just one unified front lens with a camera in the middle, rather than two lenses with cameras on either side -- it's a design that makes more technical sense, and it's a fashion statement that you can pull off in athletic eyewear, but not in eyeglasses (prove me wrong).

The new glasses can capture video in up to 3K resolution and feature a 12-megapixel camera with a 122-degree wide-angle lens. The glasses have an IP67 dust and water resistance rating for use during intense workouts. Meta says the wraparound design of the glasses features Oakley PRIZM™ Lens technology, which is designed to block out sun, wind, and dust. Unless you're an ultramarathon runner, these glasses will easily last throughout your workout. The glasses can stay on for nine hours, or six hours with continuous music playback. But the charging case that the glasses come with can provide an additional 36 hours of charge on the go. Meta claims that the charging case can quickly get the glasses to a 50% charge in 20 minutes. The Oakley Meta Vanguard glasses retail for $499 and go on sale on October 21.

On the VR front, Meta did not release any new Quest headsets as part of this year's Connect. Even though the conference and company are named after the metaverse, we learned about just a small number of updates on the VR side, such as Hyperscape, which will allow developers and creators to build photorealistic spaces in virtual reality. Meta is reportedly developing an ultralight VR headset for launch by the end of 2026, so maybe we'll see that come to fruition at the next Meta Connect event.
[2]
I tried the Meta Ray-Ban Display glasses, and they got me excited for the post-smartphone era
I got to wear the new Meta Ray-Ban Display glasses at Meta Connect 2025 on Wednesday. And while they are a long way from replacing your smartphone, they are good enough to make it clear that smart glasses are a lot better when they include a heads-up display. From what I experienced on Wednesday, we're definitely taking a step toward a world where we spend less time with our heads buried in our smartphones.

Also: Meta Connect 2025 live updates: Ray-Ban Display, Oakley Vanguard smart glasses, more

In the Meta Connect keynote on Wednesday evening at the company's headquarters, Meta CEO Mark Zuckerberg said we've lost a little something as a society with the proliferation of smartphones. Meta's hope is that smart glasses can help restore some of what we've lost by allowing people to stay in the moment more often. Meta is attempting to do that by releasing its next-generation glasses with a bright, full-color display and a smart wristband that reads subtle gestures from your hand and can do several new things not possible in a pair of smart glasses until now. Zuckerberg was clearly more giddy about the launch of the Meta Ray-Ban Display glasses than anything else announced at Meta Connect 2025. "We've been working on glasses for over 10 years at Meta, and this is one of those special moments," Zuckerberg said. "I'm really proud of this, and I'm really proud of our team for achieving this."

Also: I tried the Meta Oakley Vanguard smart glasses, and they're cooler than they look

The glasses will retail for $799, which includes both the glasses and the wristband. They come in two colors, shiny black and a transparent light brown, and all pairs have transition lenses. They will support prescription lenses from +4.00 to -4.00, but it's unclear how much prescription lenses will add to the cost. In order to buy these, you'll need to go into a store to get fitted for the wristband. 
The glasses have 6 hours of mixed-use battery life and the wristband has 18 hours of battery life, according to Meta. The glasses are water-resistant while the wristband has an IPX7 rating for temporary immersion in water. The product arrives in stores on September 30, and at that time you'll be able to go and get a demo at Best Buy, Sunglass Hut, LensCrafters, and Ray-Ban Stores. They will eventually be available at some Verizon stores as well.

These Meta Ray-Ban Display glasses weigh 69 grams, so they are a little larger than the audio-only Meta Ray-Bans that weigh 49 grams. However, when wearing them I didn't find them to be heavy, and I barely noticed any difference from the regular Meta Ray-Bans, which I wear almost daily. These Meta Ray-Ban Display glasses have a full-color screen in the right eye, and the display sits slightly off to the side so that it doesn't distract you. It outputs up to 5,000 nits of brightness -- brighter than any smartphone or even the Apple Watch Ultra -- so that you can see the display whether you're inside or outside. Meta used a liquid crystal on silicon (LCOS) display, and one of the most remarkable things about it is that the person you're looking at can't see it when they are looking at you, so it doesn't become a distraction. The glasses have a full operating system to take advantage of the display by allowing you to capture photos and videos, listen to music, invoke live captions during a conversation, get visual feedback for questions you ask AI, engage in WhatsApp messaging, and use AI to capture notes on important conversations.

Also: I tried smart glasses with a built-in display, and they beat my Meta Ray-Bans in key ways

The primary way you interact with the new smart glasses is by using an EMG neural wristband that is as much of a breakthrough as the smart glasses display. 
Within 5-10 minutes of using the wristband, I was pinching with my thumb and middle finger to go back, swiping across screens with my thumb, pinching with my forefinger and thumb to select things, and pinching with my forefinger and thumb and twisting my wrist to turn a setting up or down (such as volume). Zuckerberg called the wristband "the world's first mainstream neural interface." The wristband itself is made of a sturdy cloth material and uses a combination of a metal cinch and magnets to fit snugly around your wrist. I found it to be pretty comfortable, and within a few minutes I didn't think about it being on my wrist.

The display itself is fairly clear and easy to read, although at times I found myself closing my left eye in order to see the screen more clearly through my right eye. But I especially loved being able to frame photos and videos with the screen before shooting or to adjust the framing during a video recording. You can even use the pinch-and-wrist-twist gesture to zoom up to 3x. You can then preview the photos or videos on the glasses and text them to someone or post them on social media without having to use your phone.

Also: Meta Ray-Bans vs. Oakley: I tested both smart glasses, there's a clear winner

One of the smartest things I saw the glasses do in my time trying them was the live captioning feature, where I identified the person standing right in front of me as the one I wanted it to focus on, and it transcribed only their words while ignoring the words in conversations of other people in the same room. You could easily imagine being in a noisy place and having the device only pay attention to the person you were talking to, transcribing their words on the screen while drowning out the words of others around you. In fact, this would be super helpful when having an important conversation with someone in a loud room so that you could catch everything they were saying. 
This is related to a new feature called Conversation Focus, which is also on the audio-only smart glasses. It amplifies the audio of someone you're speaking with directly. Similarly, the Meta Ray-Ban Display glasses seemed to be better at acting only on my voice when using voice commands and not being distracted by the voices of people around me who were also talking. That is a much smarter feature than the standard Meta Ray-Bans and Meta Oakley glasses are currently capable of.

Another one of the smart things I saw the glasses do came when prompting the AI. I asked for a recipe for banana bread, and the AI went out to the internet, found one, and then organized it into a series of cards that I could swipe through to view step-by-step. This would have been a lot nicer to use hands-free in the kitchen rather than getting food on my phone, tablet, or laptop. Something else that looked interesting but I didn't get to try was the ability to respond to a short text message by using the hand with the wristband to write the letters and then allow the device to turn them into words in a message. It's like using an invisible stylus -- definitely looking forward to testing that. Other things the glasses can do include video calling your friends to let them see what you see and getting turn-by-turn walking directions (in 28 cities to start, because Meta is doing its own mapping).

Also: Are smart glasses with built-in hearing aids viable? My verdict after months of testing

Lastly, another one of the features that Meta talked about on stage but I didn't get to try was Live AI. The idea with this feature is that the glasses can see what you see and hear what you hear. So you can start a Live AI session when someone is about to give you some important information that you need to remember, like instructions, explanations, sharing an important idea, or a message to give to someone else. 
Then, it can take notes, make a summary, record steps to remember, and generally make sure you get it right without missing important details. I'm looking forward to trying the Meta Ray-Ban Display glasses in the weeks and months ahead and reporting back on what I learn. You can expect to see a lot more hands-on perspectives, reviews, and feature breakdowns here on ZDNET. And keep an eye out, because we're also expecting a lot more smart glasses to be released over the next 6-12 months, including Google's re-entry into the space with new glasses powered by Android XR.

The Meta Ray-Ban Display glasses are an early adopter product, bordering on a dev kit. Let's be honest that $800 is a ton to spend on a pair of glasses. But wow, what a futuristic pair of glasses they are. We have to be honest that they are going to give you a taste of the future for the next year, but they could quickly get superseded by other new products from Meta, Google, and others within the next 12-18 months. So if you buy these, you probably shouldn't think of this as an investment in a pair of glasses that you'll be wearing for 2-3 years. You should be an early adopter who lives at the cutting edge of tech and either wants or needs to stay out front on the latest innovations because it's your work or your No. 1 hobby. If you do buy the Meta Ray-Ban Display, my prediction is that you'll probably wear them for a year before upgrading to another pair of smart glasses that leapfrog these. Then you'll potentially pass them off to a family member, friend, or colleague. If you have never tried a pair of smart glasses before, then I'd recommend getting a demo of the Meta Ray-Ban Display but starting with the solid Meta Ray-Ban Gen 2 glasses or the new Meta Oakley Vanguard glasses (if you're more sporty) to get a first look at the smart glasses experience.
[3]
I sat down with Mark Zuckerberg to try Meta's impressive new Ray-Ban Display glasses
Mark Zuckerberg mostly uses the new Meta Ray-Ban Display glasses to send text messages. Lots of them. He has been wearing the glasses around the office, firing off WhatsApp pings to his execs throughout the day. "I run the company through text messages," he tells me recently. "Mark is our number one heaviest user," Alex Himel, the company's head of wearables, confirms. Zuckerberg is known for sending lengthy, multi-paragraph missives via text. But when he's typing on the glasses, Himel can tell because the messages arrive faster and are much shorter. Zuckerberg claims he's already at about 30 words per minute. That's impressive considering how the glasses work. The heads-up display isn't new; Google Glass tried it more than a decade ago. What's new is the neural wristband Meta built to control the interface and type via subtle gestures. Instead of tracking your hands visually or forcing you to type out into empty air, the band picks up signals from your arm's muscular nervous system. "You can have your hand by your side, behind your back, in your jacket pocket; it still works," Zuckerberg says. After trying it, I can confirm he's right. It feels like science fiction come to life. "Glasses, I think, are going to be the next computing platform device," says Zuckerberg during our recent conversation, which airs in full on the Access podcast Thursday, September 18th. He argues they're also the best hardware for AI: "It's the only device where you can basically let an AI see what you see, hear what you hear, talk to you throughout the day, and then once you get the display, it can just generate a UI in the display for you." While Zuckerberg has been advocating for this idea about the next major platform for a while, numbers -- not just hype and flashy demos -- are now beginning to support his theory. Sales of Meta's existing Ray-Bans have reached the single-digit millions and increased by triple digits from last year. 
The broader market for tech-enabled eyewear is projected to reach tens of millions soon. Google is releasing AI glasses next year, and Snap has a consumer pair of AR glasses shipping then as well. I expect Apple to release its own glasses as soon as 2027 -- the same year that Meta is targeting to release its much pricier, full-fledged AR glasses. For Zuckerberg, the prize is enormous. "There are between 1 to 2 billion people who wear glasses on a daily basis today for vision correction," he says. "Is there a world where, in five or seven years, the vast majority of those glasses are AI glasses in some capacity? I think that it's kind of like when the iPhone came out and everyone had flip phones. It's just a matter of time before they all become smartphones."

Meta's CTO Andrew Bosworth recalls how EssilorLuxottica initially thought the display glasses would be too big to sell as Ray-Bans. "Then, last year, we showed it to them. They were like, 'Oh, you did it. Let's put Ray-Ban on it.'" They're still chunky, but less noticeable than the Orion AR prototype Meta showed off last year. With transition lenses, the new display Ray-Bans start at $800 before a prescription. Bosworth says the target customers are "optimizers" and "productivity-focused people." Meta isn't making many of them -- reportedly a couple hundred thousand -- but Bosworth predicts "we'll sell all of the ones that we build." When I ask Zuckerberg about the business potential, he hints that the real margin will come later: "Our profit margin isn't going to come from a large device profit margin. It's going to come from people using AI and the other services over time." The hardware feels surprisingly refined for a first version. The geometric waveguide display sits to the side of the right lens, clear enough to use in sunlight, with a 20-degree field of view and crisp 42 pixels per degree. The neural band lets you pinch to bring up the display and dismiss it again. 
You can't see the display from the front at all, even when it's turned on. The glasses last up to six hours on a charge, with multiple recharges from the case. The core software still relies on your phone, but it's more than a notification mirror. You can send texts, take audio or video calls, show what you're listening to via the speakers in the frame, get turn-by-turn walking directions, see what your camera is capturing, and run Meta AI to recognize what's in front of you. Bosworth calls crisp text rendering the key to making AI useful. "If the AI has to read it back to you verbally, you're not getting the most information," he says. "Whereas you can just ask a question and it shows you the answer. It's much better. It's more private, too."

While the AI features took more of a backseat in my demo, I did use them to recognize a painting on a wall and generate a table setting out of thin air. I made sure to ask things that were off the script I was given, and the AI still performed as expected. The display also shows AI-suggested prompt follow-ups you can easily select via the neural band. The most striking demo was live captions. In a noisy room, I could look at someone several feet away, and what they were saying appeared in real time in front of me. It feels like super hearing. Language translation is next, with Spanish and a few others supported at launch. Meta is also working on a teleprompter feature. Still, Meta thinks this is just the start. "If you look at the top 10 reasons you take your phone out of your pocket, I think we knocked out five or six of them," Bosworth says. The long-term bet is that the glasses eventually let you leave your phone behind. The neural band may be the bigger unlock in the near term. Bosworth admits Meta only committed to it after the Orion AR glasses prototype proved its usefulness last summer. Now, it's advancing faster than expected. A handwriting mode initially thought to be years away is already working. 
Zuckerberg envisions it going even further. "It's basically just an AI machine learning problem. The future version of this is that the motions get really subtle, and you're effectively just firing muscles in opposition to each other and making no visible movement at all." In addition to enabling personalized autocomplete via thought, the neural band may also become a way to control other devices or even a smart home, according to Zuckerberg. "We invented the neural band to work with the glasses, but I actually think that the neural band could end up being a platform on its own." For now, the first generation of these Ray-Ban Display glasses is clearly a device for early adopters. The AI features are limited, the neural band takes practice, and the software needs polishing. But Zuckerberg seems more convinced than ever that glasses are the future, and after trying his new glasses, it's hard to disagree. "It's 2025, we have this incredibly rich digital world, and you access it through this, like, five-inch screen in your pocket," he says. "I just think it's a little crazy that we're here." If history repeats, Meta may finally be on the cusp of the new platform Zuckerberg has been dreaming about for years. "The first version of the Ray-Bans, the Ray-Ban Stories, we thought was good," he says. "Then, when we did the second version, it sold five times more, and it was just refined. I think there's going to be a similar dynamic here. The first version, you learn from it. The second version is a lot more polished. And that just compounds and gets better and better." This is Sources by Alex Heath, a newsletter about AI and the tech industry, syndicated just for The Verge subscribers once a week.
[4]
Meta Connect 2025: AI Glasses Make Their Mark
This year's Meta Connect picked up where last year's left off: AI glasses. Mark Zuckerberg kicked off the event with a keynote outlining his vision for Meta's expanding suite of AI glasses, powered by personal superintelligence. In addition to announcing a new entry into Meta's Oakley lineup (Vanguard), the next generation of Meta Ray-Ban glasses have double the battery life, shoot 3K video, and come with a new feature, "conversation focus," which amplifies friends' voices amid noisy environments. But the most hype was given to what had been codenamed "Hypernova" but was revealed as the Meta Ray-Ban Display glasses.

Meta Introduces A New Computing Platform

As the first Meta glasses with a high-resolution display, the Meta Ray-Ban Display glasses (available on September 30 for $799) set the stage for what the future of computing could be. Why? It's all about a new wristband-controlled neural interface that Zuckerberg described as "a huge scientific breakthrough." Although Meta's live on-stage demos were plagued by fails, audiences got a taste of how the combined computing platform could provide everyday utility. In a way, this reminds me of when the Apple Watch first hit the market as an alternative to the smartphone.

Will Consumer Intrigue Translate To Mass Adoption?

It's been four years since Meta entered the market with the company's first-generation Ray-Ban "Stories" glasses. Still, the smart glasses category remains niche. According to Forrester's Consumer Technology Insights Survey, 2025, just 17% of US online adults (and 15% in the UK) indicate they've used smart glasses. That's up from 4% (US) and 3% (UK) in 2024. Today, Forrester ran a quick "pulse check" poll in its ConsumerVoices Market Research Online Community* to gauge interest in what Meta was likely to announce. Just over 400 people, across generations, responded from the US, UK, and Canada.

* 20% are interested to learn more about Meta's new "Hypernova" AI glasses.
* 6% will likely buy Meta's new "Hypernova" AI glasses.
* 3% currently own Meta's Ray-Ban AI glasses.
* 1.5% currently own Meta's Oakley AI glasses.
* 1.5% currently own a competitive pair of AI glasses (not from Meta).

*Note: This poll was administered to a random sample of 406 online adults in the US, UK, and Canada in Forrester's qualitative ConsumerVoices online community. This data is not weighted to be representative of total country populations.

Watch Out Meta, The Competition Heats Up In 2026

While Meta has the head start on AI glasses, the competition is champing at the bit. The race is on in 2026, with Samsung, HTC, and rumors of Apple and others hitting the market. This means Meta must continue to invest in innovation with its AI glasses lineup -- prioritizing demonstrable consumer utility. It's that very innovation that our poll respondents are drawn to -- describing the glasses as groundbreaking and fun, emphasizing their appeal as the "next big thing" in technology. But they also touted their potential utility -- mentioning translation, navigation, recording, studying, travel assistance, and enhancing everyday convenience. The glasses are seen as tools for accessing information quickly and efficiently.

To Win Over Skeptics, Benefit Must Trump Cost

The vast majority (79%) of our poll respondents do not currently own AI glasses and are not interested in owning or buying any. Why?

* Lack of need or interest: A significant number of respondents indicated that they have no practical use for AI glasses, feel their current lifestyle does not require them, or believe they are unnecessary compared to existing devices like smartphones.
* Skepticism toward AI: Many expressed distrust or disinterest in AI technology, citing concerns about privacy, security, and the broader impact of AI on society. Some believe AI is overhyped or unnecessary.
* Concerns about cost: The perceived expense of AI glasses was a recurring theme, with respondents feeling the cost outweighs the benefits or believing the glasses are simply an expensive gimmick.

Forrester clients: Let's chat more about this via a Forrester guidance session.
[5]
Meta Launches Smart Glasses With In-Lens Display: Here's What They Can Do
Meta CEO Mark Zuckerberg unveiled the company's first-ever smart glasses with a built-in display at its Connect conference on Wednesday. The Meta Ray-Ban Display glasses take Ray-Ban's wayfarer design, infuse it with smartphone-like AI features, and add a screen in the right lens to provide an augmented reality experience without the bulk of a headset. The glasses weigh just 69 grams. The high-resolution display can be controlled using a wrist accessory called the Meta Neural Band, which relies on surface electromyography (sEMG) to detect hand gestures such as scrolling and tapping. The frames also have cameras, speakers, and microphones baked in. "You can check messages, preview photos, see translations, get help from Meta AI, and more -- all without needing to pull out your phone," Meta says.

Unlike the regular Ray-Ban Meta glasses, when you take pictures or videos using the Meta Ray-Ban Display, you can see the camera view on the built-in display. With the added assistance of the Meta Neural Band, you can not only capture media using hand gestures but also select and share it. The camera view also comes in handy with Meta AI. You can ask the assistant questions about what's in front of you. In one demo, a user is seen asking if the tomatoes in their view are ripe and then requesting recipe suggestions. In another demo, a user is seen translating a sign. The smart glasses also let you receive video calls from WhatsApp and Messenger, with the other person seeing your camera view. Additionally, you can view text and multimedia messages from WhatsApp, Messenger, Instagram, and your phone. Another great addition is live captions and translation. 
To demo the feature during the Connect keynote, Zuckerberg invited Meta CTO Andrew 'Boz' Bosworth to the stage. When Zuckerberg started talking, Boz's glasses began displaying subtitles. This could be helpful for those with hearing impairments or when you are in loud surroundings. The press release doesn't state which languages will be supported at launch. Zuckerberg also demonstrated how the glasses can play music, with volume adjustments made by mimicking the turning of a circular dial. You can also change the track by pinching and swiping your thumb along your index finger.

The glasses also provide "phone-free" walking directions. Just enter your destination, and you'll see a visual map on the display. This feature will get a limited release at launch. It will be in beta for select cities, with more to be added at a later date. Another feature that will be added soon is the ability to type messages by mimicking your handwriting on a surface.

The Meta Ray-Ban Display glasses, with the accompanying Meta Neural Band, cost $799 and will be available at Best Buy, LensCrafters, Sunglass Hut, and Ray-Ban Stores in the US starting Sept. 30. The glasses come in two sizes, large and standard, and two colors: black and sand. All models feature Transitions lenses that automatically adjust to different lighting. The wristband comes in three sizes, so you can choose what fits you best. It also has an IPX7 water-resistance rating. With regular use, Meta promises six hours of battery life for the glasses and 18 hours for the wristband. Meta also unveiled two other smart glasses yesterday: the Ray-Ban Meta (Gen 2), a successor to the highly successful Ray-Ban Meta Smart Glasses, and the new Oakley Meta Vanguard with a camera at the center of the frame. Some of these announcements were leaked earlier this week.
[6]
Meta Launches $799 Glasses With Screen and AI Integration
Meta Platforms Inc., seeking to turn its smart glasses lineup into a must-have product, on Wednesday unveiled its first version with a built-in screen. The latest model, the $799 Meta Ray-Ban Display, features a screen in the right lens. It can show text messages, video calls, turn-by-turn map directions, and visual results from queries to Meta's AI service. The subtly integrated display can also serve as a viewfinder for the glasses' camera or surface music playback. Bloomberg News Managing Editor for Global Consumer Tech Mark Gurman joins Bloomberg Businessweek Daily to discuss. (Source: Bloomberg)
[7]
Meta launches new smart glasses with built-in display
MENLO PARK, California, Sept 17 (Reuters) - Meta Platforms (META.O) on Wednesday launched its first consumer-ready smart glasses with a built-in display, seeking to extend the momentum of its Ray-Ban line, one of the early consumer hits of the artificial intelligence era. Chief Executive Mark Zuckerberg showed off what he called Meta Ray-Ban Display, although some demos of the new technology did not go as planned, with a call to the glasses failing to go through, for instance. "I don't know what to tell you guys," Zuckerberg said. "I keep on messing this up." The crowd applauded. The glasses have a small digital display in the right lens for basic tasks such as notifications. They will start at $799 and be available on September 30 in stores. The launch at Meta's annual Connect conference for developers, held at its Menlo Park, California, headquarters, is its latest attempt to catch up in the high-stakes AI race. While the social media giant has been at the forefront of developing smart glasses, it trails rivals such as OpenAI and Alphabet's (GOOGL.O) Google in rolling out advanced AI models. To catch up, Zuckerberg has kicked off a Silicon Valley talent war to poach engineers from rivals and promised to spend tens of billions of dollars on cutting-edge AI chips. He has also touted smart glasses as the ideal device for superintelligence - a concept where AI surpasses human intelligence in every possible way - because they serve as a personal, always-on interface that can see, hear and interact with the world through the user's perspective. Working toward that vision, Meta also unveiled on Wednesday a new pair of Oakley-branded glasses called Vanguard aimed at athletes and priced at $499. The device integrates with fitness platforms such as Garmin and Strava to deliver real-time training stats and post-workout summaries and offers nine hours of battery life. It will be available starting on October 21. 
Meta also updated its Ray-Ban line, now offering almost twice the battery life of the previous generation and a better camera at $379, higher than the previous generation's $299 price. All the devices have existing features such as Meta's AI assistant, cameras, hands-free control and livestreaming to the company's social media platforms including Facebook and Instagram. The new glasses come as Meta is facing scrutiny over its handling of child safety on its social media platforms. Reuters reported in August that Meta chatbots engaged children in provocative conversations about sex and race, while whistleblowers said earlier this month that researchers were told not to study the harmful effects of virtual reality on children. While analysts do not expect the new Display glasses to post strong sales, they believe they could be a step toward the planned 2027 launch of Meta's "Orion" prototype, unveiled last year and described by Zuckerberg as "the time machine to the future." "It wasn't long ago that consumers were introduced to AI on glasses and in recent quarters brands have also begun to include displays, enabling new use cases," said Jitesh Ubrani, research manager for IDC's Worldwide Mobile Device Trackers. "However, consumer awareness and product availability of AI glasses with display remains limited. This will change as Meta, Google, and others launch products in the next 18 months." IDC forecasts worldwide shipments of augmented reality/virtual reality headsets and display-less smart glasses will increase by 39.2% in 2025 to 14.3 million units, with Meta driving much of the growth thanks to demand for the Ray-Bans it makes with EssilorLuxottica (ESLX.PA). 
Reporting by Aditya Soni and Echo Wang in Menlo Park, California; Editing by Sayantani Ghosh and Matthew Lewis
[8]
Mark Zuckerberg promises new smart glasses will unlock 'superintelligence'
Mark Zuckerberg has unveiled Meta's first smart glasses with a built-in screen, arguing wearable devices that can replace smartphones are vital to his all-in bet on "superintelligence". Flaunting a pair on stage at the Meta Connect flagship conference on Wednesday, the chief executive said the "Meta Ray-Ban Display" would overlay text messages, video calls or responses from Meta's artificial intelligence assistant on one of the lenses. The new product brings Meta a step closer to Zuckerberg's broader vision of making glasses he hopes will one day replace Apple's iPhones and Google's Android handsets as a ubiquitous computing device. "Glasses are the only form factor where you can let it see what you see, hear what you hear, talk to you throughout the day and very soon, generate whatever [user interface] you need right in your vision, in real time," Zuckerberg said. The costly hardware bet follows his 2021 push to build an avatar-filled 'metaverse', during which he rebranded the company from Facebook to Meta. But that effort has been hampered by technical difficulties and a lack of consumer appetite, giving way to Zuckerberg's latest bid, to become an "AI leader". The glasses unveiled on Wednesday, on sale for $799 from the end of the month, are controlled by a wristband that detects small hand movements. Zuckerberg described this as "the world's first mainstream neural interface". However, hiccups during the live demonstration undermined Zuckerberg's promise that wearables are the "ideal form factor" for "personal superintelligence", AI that surpasses the intelligence of humans. Meta's chief was unable to pick up a call on the glasses from chief technology officer Andrew Bosworth, with Zuckerberg joking that he had practised this "a hundred times" before with no issue and blaming bad WiFi. The glasses also feature the inbuilt speakers and cameras already offered on its popular Ray-Ban smart glasses, sold in partnership with eyewear group EssilorLuxottica. 
The launch comes shortly after Meta restructured its AI team for the fourth time in six months, rebranding it as the "Meta Superintelligence Lab", after lagging rivals in the space. As part of the latest shake-up, Zuckerberg personally approached dozens of top AI researchers over the summer from rivals such as OpenAI and Google, offering sign-on bonuses of as much as $100mn for them to join an elite AI-focused lab, TBD Lab, within his wider AI outfit. On Wednesday, Meta also launched an updated version of its existing Ray-Ban smart glasses with improved battery life, speakers and video cameras. However, in their live demo, the glasses failed to respond correctly to questions from their wearer. Meta also announced a new tie-up with Oakley for AI glasses "for high-intensity sports" that integrate with fitness apps when paired with a Garmin watch. Meta's new smart glasses are being manufactured by Chinese contract manufacturer Goertek, which has been increasing its hold over the smart glasses industry -- and the company's supply chain -- through a dealmaking spree, according to a Financial Times report on Tuesday. The reliance on a Chinese manufacturer comes despite Zuckerberg's increasing anti-Beijing rhetoric, which has become more pronounced during the second Trump administration.
[9]
I tried Meta Ray-Ban Display glasses, and they offer 2 breakthroughs to take us beyond smartphones
I got to wear the new Meta Ray-Ban Display glasses at Meta Connect 2025 on Wednesday. And while they are a long way from replacing your smartphone, they are good enough to make it clear that smart glasses are a lot better when they have a heads-up display. From what I experienced on Wednesday, we're definitely taking a step toward a world where we spend less time with our heads buried in our smartphones. In the Meta Connect keynote on Wednesday evening at the company's headquarters, Meta CEO Mark Zuckerberg said we've lost a little something as a society with the proliferation of smartphones. Meta's hope is that smart glasses can help restore some of what we've lost by allowing people to stay in the moment more often. Meta is attempting to do that with its next-generation glasses, which pair a color display with a smart wristband that reads subtle gestures from your writing hand, enabling several things that weren't possible in a pair of smart glasses until now. Zuckerberg was clearly more giddy about the launch of the Meta Ray-Ban Display glasses than anything else announced at Meta Connect 2025. "We've been working on glasses for over 10 years at Meta, and this is one of those special moments," Zuckerberg said. "I'm really proud of this, and I'm really proud of our team for achieving this." The glasses will retail for $799, which includes both the glasses and the neural wristband. They come in two colors, black and brown, and all pairs have Transitions lenses. However, because of the embedded heads-up display, these glasses cannot accommodate prescription lenses. They have 18 hours of battery life and are water-resistant. 
They arrive in stores on September 30, and you'll be able to go and get a demo -- presumably in the same EssilorLuxottica stores, such as Sunglass Hut and LensCrafters, where Meta Ray-Ban glasses have been such a fixture for demos over the past two years. These Meta Ray-Ban Display glasses have a full-color screen in your right eye; the display sits slightly off to the side so that it doesn't distract you, and it outputs up to 5,000 nits of brightness so that you can see it whether you're inside or outside. They also have a full operating system to take advantage of the display by allowing you to capture photos and videos, listen to music, invoke live captions during a conversation, get visual feedback for questions you ask AI, engage in WhatsApp messaging hands-free, and use AI to capture notes on important conversations. The primary way you interact with the new smart glasses is by using a neural wristband that is as much of a breakthrough as the smart glasses themselves. Within 5-10 minutes of using the wristband, I was pinching with my thumb and middle finger to go back, swiping across screens with my thumb, pinching with my forefinger and thumb to select things, and pinching with my forefinger and thumb and twisting my wrist to turn a setting up or down (such as volume). Zuckerberg called the wristband "the world's first mainstream neural interface." The wristband itself is made of a sturdy cloth material and uses a combination of a metal cinch and magnets to fit snugly around your wrist. I found it to be pretty comfortable, and within a few minutes, I didn't think about it being on my wrist. The display itself is pretty clear and easy to read, although at times I found myself closing my left eye in order to see the screen more clearly through my right eye. 
But I especially loved being able to frame photos and videos with the screen before shooting or to adjust the framing during a video recording. You can even use the pinch-and-wrist-twist gesture to zoom up to 3x. You can then preview the photos or videos on the glasses and send them to someone or post them on social media without having to use your phone. One of the smartest things I saw the glasses do in my time trying them was the live captioning feature, where I identified the person standing right in front of me as the one I wanted it to focus on, and it transcribed only their words while ignoring the conversations of other people in the same room. You could easily imagine being in a noisy place and having the device pay attention only to the person you were talking to, transcribing their words on the screen while tuning out the words of others around you. This would be super helpful when having an important conversation with someone in a loud room so that you could catch everything they were saying. This is related to a new feature called Conversation Focus, which is also on the audio-only smart glasses. It amplifies the audio of someone you're speaking with directly. Similarly, the Meta Ray-Ban Display glasses seemed to be better at acting only on my voice when using voice commands and not being distracted by the voices of people around me who were also talking. That is a much smarter feature than the standard Meta Ray-Bans and Meta Oakley glasses are currently capable of. Lastly, one of the features that Meta talked about on stage that I didn't get to try was Live AI. The idea with this feature is that the glasses can see what you see and hear what you hear. 
You can start a Live AI session when someone is about to give you something important that you need to remember, like instructions, explanations, sharing an important idea, or a message to give to someone else. Then, it can take notes, make a summary, record steps to remember, and generally make sure you get it right without missing important details. I'm looking forward to trying this and the other aspects of the Meta Ray-Ban Display glasses in the weeks and months ahead. You can look forward to more hands-on perspectives, reviews, and breakdowns from ZDNET.
[10]
Hands-on with the Meta Ray-Ban Display glasses
Mark Zuckerberg, chief executive officer of Meta Platforms Inc., wears a pair of Meta Ray-Ban Display AI glasses during the Meta Connect event in Menlo Park, California, US, on Wednesday, Sept. 17, 2025. When it comes to the new $799 Meta Ray-Ban Display glasses, it's the device's accompanying fuzzy, gray wristband that truly dazzles. I was able to try out Meta's next-generation smart glasses that the social media company announced Wednesday at its annual Connect event. These are the first glasses that Meta sells to consumers with a built-in display, marking an important step for the company as it works toward CEO Mark Zuckerberg's vision of having headsets and glasses overtake smartphones as people's preferred form of computing. The display on the new glasses, though, is still quite simplistic. Last year at Connect, Meta unveiled its Orion glasses, which are a prototype capable of overlaying complex 3D visuals onto the physical world. Those glasses were thick, required a computing puck and were built for demo purposes only. The Meta Ray-Ban Display, however, is going on sale to the public, starting in the U.S. on Sept. 30. Though the new glasses include just a small digital display in their right lens, that screen enables unique visual functions, like reading messages, seeing photo previews and reading live captions while having a conversation with someone. Controlling the device requires putting on its EMG sensor wristband that detects the electrical signals generated by a person's body so they can control the glasses via hand gestures. Putting it on was just like strapping on a watch, except for the small, electric jolt I felt when it activated. It wasn't as much of a shock as you feel taking clothes out of the dryer, but it was noticeable. Donning the new glasses was less shocking, until I had them on and saw the little display emerge, just below my right cheek. 
The display is like a miniaturized smartphone screen but translucent so as to not obscure real-world objects. Despite being a high-resolution display, the icons weren't always clear when contrasted with my real-world field of view, causing the letters to appear a bit murky. These visuals aren't meant to wrap around your head in crystal-clear fidelity, but are there for you to perform simple actions, like activating the glasses' camera and glancing at the songs on Spotify. It's more utility than entertainment.
[11]
Meta Ray-Ban Display hands-on: Discreet and intuitive
I've been testing smart glasses for almost a decade. And in that time, one of the questions I've been asked the most is "oh, but can you see anything in them?" For years, I had to explain that no, glasses like that don't really exist yet. That's no longer the case. And while I've seen a bunch of glasses over the last year that have some kind of display, the Meta Ray-Ban Display glasses feel the closest to fulfilling what so many people envision when they hear the words "smart glasses." To be clear, they don't offer the kind of immersive AR that's possible with Meta's Orion prototype. In fact, Meta considers "display AI glasses" to be a totally separate category from AR. The display is only on one lens -- the right -- and its 20-degree field of view is much smaller than the 70 degrees on Orion. That may sound like a big compromise, but it doesn't feel like one. The single display feels much more practical for a pair of glasses you'll want to wear every day. It's meant to be something you can glance at when you need it, not an always-on overlay. The smaller size also means that the display is much sharper, at 42 pixels per degree. This was especially noticeable when I walked outside with the glasses on; images on the display looked even sharper than in indoor light, thanks to automatic brightness features. I also appreciated that you can't see any light from the display when you're looking at someone wearing the glasses. In fact, the display is only barely noticeable at all when you look at them up close. Having a smaller display also means that the glasses are cheaper, at $799, and that they don't look like the chunky AR glasses we've seen so many times. At 69 grams, they are a bit heavier and thicker than the second-gen Meta Ray-Bans, but not much. As someone who has tried on way too many pairs of thick black smart glasses, I'm glad Meta is offering these in a color besides black. 
All Wayfarer-style frames look wide on my face, but the lighter "sand" color feels a lot more flattering. The Meta Neural Band wristband that comes with the display glasses functions pretty much the same as the band I used on the Orion prototype. It uses sensors to detect the subtle muscle movements of your hand and wrist and can translate them into actions within the glasses' interface. It's hard to describe, but the gestures for navigating the glasses' interfaces work surprisingly well. I can see how it could take a little time to get used to the various gestures for navigating between apps, bringing up Meta AI, adjusting the volume and other actions, but they are all fairly intuitive. For example, you use your thumb to swipe along the top of your index finger, sort of like a D-pad, to move up and down and side to side. And you can raise and lower the speaker volume by holding your thumb and index finger together and rotating your wrist right or left like it's a volume knob. It's no secret that Meta's ultimate goal for its smart glasses is to replace, or almost replace, your phone. That's not possible yet, but having an actual display means you can look at your phone a whole lot less. The display can surface incoming texts, navigation with map previews (for walking directions), and info from your calendar. I was also able to take a video call from the glasses -- unlike Mark Zuckerberg's attempted live demo during his keynote -- and it was way better than I expected. I could not only clearly see the person I was talking to and their surroundings, but I could also turn on my glasses' camera and see a smaller version of the video from my side. I also got a chance to try the Conversation Focus feature, which gives you live captions of the person you're speaking with, even in a loud environment where they may be hard to hear. There was something very surreal about getting real-time subtitles to a conversation with a person standing directly in front of me. 
As someone who tries really hard to not look at screens when I'm speaking to people, it almost felt a little wrong. But I can also see how this would be incredibly helpful to people who have trouble hearing or processing conversations. It would also be great for translations, something Meta AI already does very well. I also appreciated that the wristband allows you to invoke Meta AI with a gesture so you don't always have to say "Hey Meta." It's a small change, but I've always felt weird about talking to Meta AI in public. The display also addresses another one of my longtime gripes with the Ray-Ban Meta and Oakley glasses: framing a photo is really difficult. But with a display, you can see a preview of your shot, as well as the photo after the fact, so you no longer have to just snap a bunch and hope for the best. I've only had about 30 minutes with the glasses, so I don't really know how having a display could fit into my daily routine. But even after a short time with them, they really do feel like the beginning of the kind of smart glasses a lot of people have been waiting for.
[12]
Meta unveils Ray-Ban Meta Gen 2 and Oakley Meta Vanguard smart glasses
What just happened? At Meta Connect 2025, Mark Zuckerberg unveiled three new smart glasses: the second-generation Ray-Ban Meta, the Oakley Meta Vanguard, and the Meta Ray-Ban Display. Other launches include an EMG (electromyography) wristband called the Meta Neural Band and a new entertainment hub for the Quest headsets called Horizon TV. The Ray-Ban Meta Gen 2 is the successor to the original Ray-Ban Meta, which Zuckerberg described as the world's most popular smart glasses. It features a 12MP camera supporting 3K Ultra 60fps HDR video recording. The device will also support hyperlapse and slow-motion video capture after an update this fall. The new glasses pack 32GB of built-in storage and offer up to eight hours of usage on a single charge, with an accompanying case providing an additional 48 hours of battery life. Notable features include adaptive volume, a two-way touchpad, a capture button, and voice control. Ray-Ban offers the Gen 2 glasses in its familiar Wayfarer, Skyler, and Headliner frame styles, with multiple color options. Users can also select tinted, transition, or clear lenses. Meta and Ray-Ban accommodate prescriptions ranging from -6.00 to +4.00. The device is available for purchase from Meta.com and Ray-Ban.com, starting at $379. Next up, the Oakley Meta Vanguard is not a direct successor to the original Oakley Meta glasses, but is instead a specialized device designed for athletes. It features the famous Oakley wraparound frame with reflective, swappable lenses in multiple color options. The Vanguard offers a 12MP camera with 3K video capture, a 122-degree field of view, and adjustable video stabilization. Meta claims up to nine hours of battery life, or up to six hours of continuous music playback. The accompanying charging case provides an additional 36 hours of charge on the go. 
Meta says top-tier athletes helped design the Vanguard, and the company rigorously tested it to ensure it holds up to real-world challenges. For example, it has a wind-optimized microphone so athletes can use it while running or jogging. It also offers Garmin and Strava integration, allowing users to ask Meta AI about their heart rate and other data. Athletes training with Strava can also get the Vanguard to graphically overlay their performance metrics onto videos and photos captured with the device and share them online using the Meta AI app. Additionally, users who set up App Connection with Garmin Connect, Apple Health, or Health Connect by Android will get activity summaries directly in the Meta AI app after each workout. The Vanguard is now available for pre-order in several countries, including the US, at Meta.com and Oakley.com. Lastly, Meta announced the Meta Ray-Ban Display, a new pair of AI glasses. They ship with the Meta Neural Band, an EMG wristband that lets users control features with hand movements. Pricing for the bundle starts at $799.
[13]
Meta Unveils Smart Glasses With Apps and an Artificial Intelligence Assistant
Four years ago, when Meta released smart glasses that could take photos and videos, the product became a surprise hit. Since then, millions have snapped them up. Now the company is raising its bets on eyewear. On Wednesday, at its annual developer conference in Menlo Park, Calif., Meta unveiled three new types of smart glasses. One of them, called Meta Ray-Ban Display, has a tiny screen in its lens that can display apps, share media to Instagram and play music from built-in speakers. The apps on the glasses will be controlled by a wristband. It also has a voice-based artificial intelligence assistant that can talk through a speaker and see through a camera. Another pair was an upgraded version of its existing Ray-Bans, while the third was meant for sportswear and is being made by Oakley. Meta Ray-Ban Display will cost $799, about double the price of older versions, and will be released on Sept. 30. At the conference, some of Meta's smart glasses appeared unfinished. Onstage for the unveiling on Wednesday evening, Mark Zuckerberg, the company's chief executive, asked the glasses to provide a barbecue sauce recipe and to call a colleague -- but the devices failed to do so. "They tell us not to do live demos," Mr. Zuckerberg said, while the audience laughed along. He added that "the glasses will be able to see what you see, hear what you hear, and then go off and think about it." Meta has billed its smart glasses and virtual reality headsets as a way to experience the "metaverse," a virtual world that Mr. Zuckerberg has called the future of the internet. But this year, the glasses were aimed more at everyday uses -- reading texts while on a walk, for instance, or a way to get directions to the grocery store. The smart glasses are an extension of Meta's work on artificial intelligence. The company has been pumping billions of dollars into building new data centers and revamping its A.I. 
division, which is focused on creating "superintelligence," a hypothetical form of A.I. that could be more powerful than the human brain. On Wednesday, Mr. Zuckerberg called glasses "the ideal form of superintelligence." Unlike the company's heavier virtual reality headsets, Meta's smart glasses resemble regular eyeglasses. The cameras, microphones and speakers are barely visible, which gives them a broader appeal than the bulkier devices, said Melissa Otto, the head of research at S&P Global Visible Alpha. "The metaverse was geared very much toward gamers," Ms. Otto said. "But think about how many people wear sunglasses."
[14]
Facebook owner Meta unveils new range of AI-powered smart glasses
The event comes as the Facebook, Instagram and WhatsApp owner faces ongoing scrutiny over the impact of its products, particularly on children. Mark Zuckerberg called the technology a "huge scientific breakthrough" before an audience of hundreds gathered on the company's Silicon Valley campus. The Meta Ray-Ban Display comes with a full-colour high-resolution screen in one lens where users can conduct video calls and see messages. It also features a 12-megapixel camera. Mr Zuckerberg hopes Meta's line of smart accessories will be a key platform for integrating its artificial intelligence tool, Meta AI, into people's lives. Analysts say smart glasses are likely to be more successful than the firm's multi-billion dollar Metaverse project - virtual worlds to connect users across digital environments. "Unlike VR headsets, glasses are an everyday, non-cumbersome form factor," said Forrester VP and research director Mike Proulx. But, he added, "the onus is on Meta to convince the vast majority of people who don't own AI glasses that the benefits outweigh the cost." The company said it does not discuss sales information, but it is understood to have sold around two million pairs of smart glasses since it entered the market in 2023. The Display model will be available this month and sell for $799 (£586), hundreds of dollars more than Meta's current smart glasses. Leo Gebbie of CCS Insight said he is sceptical that it will gain as much traction as Meta's other smart glasses. "The Ray-Bans have done well because they're easy to use, inconspicuous and relatively affordable," Mr Gebbie said. Meta is currently in the middle of a massive spending spree as it bolsters its AI operations. Mr Zuckerberg said in July that the company would spend hundreds of billions of dollars on building sprawling AI data centres in the US. One of the sites is expected to cover an area that is nearly the size of Manhattan. 
That AI infrastructure investment is complemented by huge spending on hiring top talent away from rival companies. Meta has said it would develop what it called "superintelligence," AI technology that can out-think human beings.
[15]
We Need to Talk About Smart Glasses
With any new device category comes a whole host of novel and sometimes exhaustingly complex questions. Smartphones, for example, no matter how mundane they seem right now, are still nagging us with existential quandaries. When should we use them? How should we use them? What in God's name happens to us when we use them, which, last I checked, is literally all the time? These are important questions, and most of us, even if we're not spending all day ruminating on them, tackle the complexity in our own way, setting (or resetting) social norms for ourselves and other people as we trudge along. The only thing is, in my experience, we tend to ask these questions mostly in retrospect, which is to say after the cat (or phone, or smartwatch, or earth-shattering portal into the online world) is out of the proverbial bag. It's easy to look back and say, "That was the time we should have thought about this," and when I put Meta's new smart glasses with a screen on, I knew that the time, for smart glasses in particular, was now -- like, right f**king now. In case you missed it, Meta finally unveiled the Meta Ray-Ban Display, which are its first smart glasses with an in-lens display. I flew out to Meta headquarters for its annual Connect conference to try them, and the second I put them on, it was clear: these are going to be big. It probably seems silly from the outside to make a declaration like that. We have screens everywhere all the time -- in our hands, on our wrists, and sometimes, regrettably, in our toasters. Why would smart glasses be any different? On one hand, I get that skepticism, but sometimes function isn't the issue; it's form. And when it comes to smart glasses, there is no other form like it. Meta's Ray-Ban Display aren't just another wearable. The screen inside them opens up an entirely new universe of capabilities. 
With these smart glasses and Meta's wild new "Neural Band," a wristband that reads the electrical signals in your arm and translates them to inputs, you're able to do a lot of the stuff you normally do on your phone. You can receive and write messages, watch Reels on Instagram, take voice calls and video calls, record video and take pictures, and get turn-by-turn navigation. You can even transcribe conversations that are happening in real time. You're doing this on your face in a way that you've never done it before -- discreetly and, from my experience, fairly fluidly. If there were any boundaries between you and a device, Meta's Ray-Ban Display are closing them to a gap that only an iPhone Air could slide through. It's incredibly exciting in one way, because I can see Meta's smart glasses being both useful and fun. The ability to swipe through a UI in front of my face by sliding my thumb around like some kind of computer cursor made of meat is wild and, at times, actually thrilling. While not everything works seamlessly yet, the door to smart glasses supremacy feels like it's been swung wide open. You are going to want a pair of these smart glasses whether you know it or not. These are going to be popular, and as a result, potentially problematic. We may have a solid grasp on where and when we're supposed to use phones, but what happens when that "phone" in question becomes perfectly discreet, and the ability to use it becomes almost unnoticeable to those around us? When I use a smartphone, you can see me pick it up -- you know there's a device in my hand. When I use Meta's Ray-Ban Display, however, there's almost no indication. Yes, there's a privacy light that tells outside people that a picture or video is being taken, but there's also less than 2% light leakage through the lens, meaning you can't tell when the screen inside the glasses is on. I certainly couldn't tell when I watched others use them. 
It's as ambient as any ambient computing I've witnessed so far. I talked to Anshel Sag, a principal analyst at Moor Insights & Strategy who covers the wearable market, and he says the privacy framework around technology like this is still in flux. "We are still very much in the infancy of the smart glasses, AI wearable, and AR privacy and etiquette era," he said. "I think that the reality is that having a wearable with a camera on your face is going to change things, and there are going to be places where these things are banned explicitly." Some of those environments, Sag said, are private areas like bathrooms or locker rooms, but it could extend beyond just places where you might catch a glimpse of someone naked. Driving, for example, is a major question. Meta's Ray-Ban Display have navigation built in, and while the company tells me that the feature is designed for walking right now, it's not actually preventing anyone from using its smart glasses in the car. Instead, it will provide a warning before you do so by detecting what speed you're moving at. Other companies like Amazon seem not to have even thought that navigating on smart glasses while driving could be a safety hazard at all. Early reports indicate that Amazon is plowing forward, making smart glasses that are specifically designed for its delivery drivers to use in a car. While regulators like the NHTSA have issued warnings about people using VR headsets while driving (yes, people were actually doing that), it hasn't, according to my research or knowledge, addressed the impact of smart glasses, which are much more likely -- especially if they become widespread -- to enter the equation while driving. I reached out to the NHTSA for comment, but have not yet received a response. Privacy concerns shouldn't just stem from the form factor, either. 
You also have to think about the company that's making the thing you're wearing on your face all the time and whether it has shown itself to be a good steward of your data and privacy. In Meta's case? Well, without going into an entirely separate diatribe, I think it could do a lot better. And other companies that are also in hot pursuit of screen-clad glasses, like Google? Well, they haven't been much better. And makers of smart glasses shouldn't be surprised if, when these things wind up on people's faces, they get some shit for it. Google Glass, which came out in 2013, may seem like a different age, and in a lot of ways it is (people's expectations for privacy are almost nonexistent now), but we also haven't had to confront the idea of pervasive camera-clad wearables in a long time, so who's to say things have really changed? Sag says, while he expects some backlash, it may not be like the Glasshole days of yore. "I think there will be some backlash, but I don't think it's gonna be as bad as Google Glass," he says. "Google Glass had such an invasive appearance. You know, it didn't really look normal, so it really caught people's attention more. And I think that's really what has made these glasses more successful, is that they're just inherently less intrusive in terms of appearance." I may not be an industry analyst, but I agree with Sag. I'm not sure there really will be a category-ending backlash like we saw back in the Google days, and a part of me doesn't want there to be. As I mentioned, I got a chance to use Meta's Ray-Ban Displays, and the idea all but knocked my socks off. These are the smart glasses that anyone interested in the form factor has been waiting for. What I really want is to be able to live in a world where we can all use them respectfully and responsibly, and one where the companies that are making them give us the same responsibility and respect back. 
But in my experience, the only way to get toward a more respectful, harmonious world is to try everything else first, and in this case, the first step might be your next pair of Ray-Bans.
[16]
Meta's new $799 smart glasses come with a color AR display... and a wristband
The band is designed to accurately pick up minute gestures for a range of actions The new Meta Ray-Ban Display smart glasses look like a regular ol' pair of shades, but they show you - and only you - a floating screen with useful content. That's already pretty neat, but I'm honestly more excited about the companion wristband they ship with. Meta's already been at this smart glasses thing for a bit, so it knows how to pack a camera, mics, speakers, and AI smarts into a pair of spectacles for interacting with an assistant using voice commands. With the addition of a screen, it can now visually reveal contextual information like your phone would - except you don't have to fish a device out of your pocket. The 600 x 600-pixel color display can show you DMs from apps like WhatsApp and Instagram, let you compose photos and review your shots, caption and translate conversations with people around you, see your contacts on a video call, guide you to your destination with maps, and bring up the weather, your calendar, and even your music or podcast playlist. There's a six-mic array and open-ear speakers, Bluetooth to pair with your preferred wireless audio gear, and a 12-MP camera with 3x zoom on board. The Neural Band is particularly interesting because it opens up a whole lot of ways to use these glasses. Using electromyography (EMG) tech that Meta's been perfecting for years, it can accurately detect a bunch of gestures you make with your fingers via muscle signals at the wrist to navigate through the on-screen interface. That means you can respond to incoming messages, intuitively control settings within apps like the zoom on your camera, and scroll through Instagram Reels with inconspicuous hand gestures like pinching and swiping. Since the band's sensing tech is highly sensitive, it can perceive minute gestures - so they don't require much effort, can be performed even without raising your arm, and won't be plainly obvious to people around you. 
What's more, it's designed to work with people with mobility challenges who have trouble with large movements. The band was developed with nearly 200,000 research participants to detect gestures with high fidelity, so you don't have to train it to respond specifically to you before using it. Meta also notes that it will soon recognize handwriting too. The glasses themselves feature transition lenses, which means they adapt to changing light conditions around you, going from clear to dark outdoors, and back to clear indoors. The company says you can expect up to 6 hours of continuous mixed use from the glasses on a full charge, and the included foldable carrying case can provide up to 30 total hours of battery life. The band will run for up to 18 hours on a charge, and is IPX7 rated for water resistance. That all sounds pretty nifty, if you're into that promised Google Glass life - or simply don't care to whip out a phone or smartwatch when you need a screen's worth of info. As much as I'm excited to try out this tech, I worry about getting too used to just being plugged into all of today's apps and feeds, and having a harder time disconnecting from them when I mean to. Meta will begin selling these for US$799 on September 30 across the US, and is offering in-person demos and fittings to try out three sizes before you pick one up. The glasses will come to Canada, the UK, France, and Italy early next year.
[17]
Meta launches $499 Oakley smart glasses
MENLO PARK, California, Sept 17 (Reuters) - Meta on Wednesday launched $499 Oakley-branded smart glasses for athletes that come with a centered action camera, louder speakers and better water resistance, expanding its wearables beyond the Ray-Ban line that has been an early AI-era hit. Dubbed Oakley Meta Vanguard, the glasses integrate with Meta's AI app and fitness platforms such as Garmin and Strava to deliver real-time training stats and post-workout summaries. They will come with nine hours of battery life and roll out first in countries such as the U.S. and Canada starting Oct. 21. The device was unveiled at Meta's (META.O) annual Connect conference, where the company is also expected to launch its first consumer-ready smart glasses with a built-in display. Those glasses will likely be named Celeste and be paired with Meta's first wristband for hand-gesture controls, but their expected $800 price may dissuade some buyers, analysts said. CNBC has reported the glasses could feature Prada branding. CEO Mark Zuckerberg walked on stage appearing to wear a pair, but put them away immediately. Celeste is expected to include a small digital display in the right lens for basic tasks such as notifications. It will also offer features available on its existing Ray-Ban and Oakley smart glasses such as an AI assistant, cameras, hands-free control and livestreaming to its social media platforms. They mark Meta's latest bid to stay relevant in the AI race, where it trails rivals such as OpenAI and Google in rolling out advanced models. Zuckerberg kicked off a billion-dollar talent war earlier this year to poach engineers from rivals and has promised to spend tens of billions of dollars on AI chips. The launch at Meta's annual Connect conference, held at its Menlo Park, California headquarters, comes amid scrutiny over Meta's handling of child safety on its social media platforms. 
Reuters reported in August that Meta chatbots engaged children in provocative conversations about sex and race, while whistleblowers said earlier this month researchers were told not to study harms of virtual reality to children. For Zuckerberg, whose massive bet on the metaverse has so far generated tens of billions of dollars in losses, smart glasses are the ideal device for superintelligence - a concept where AI surpasses human intelligence in every possible way. While experts and executives disagree on whether and when superintelligence can be achieved, analysts said Meta's new glasses are a step toward the planned 2027 launch of its "Orion" prototype, unveiled last year and described by Zuckerberg as "the time machine to the future." "It wasn't long ago that consumers were introduced to AI on glasses and in recent quarters brands have also begun to include displays, enabling new use cases," said Jitesh Ubrani, research manager for IDC's Worldwide Mobile Device Trackers. "However, consumer awareness and product availability of AI glasses with display remains limited. This will change as Meta, Google, and others launch products in the next 18 months." IDC forecasts worldwide shipments of augmented reality/virtual reality headsets and display-less smart glasses will grow 39.2% in 2025 to 14.3 million units, with Meta driving much of the growth thanks to demand for the Ray-Bans it makes with EssilorLuxottica. Reporting by Aditya Soni and Echo Wang in Menlo Park, California; Editing by Sayantani Ghosh and Matthew Lewis
[18]
Meta's smart glasses were not ready for prime time but could succeed at real life
Meta Connect 2025 started out on Wednesday night with Mark Zuckerberg donning Meta's latest pair of Ray-Ban smart glasses and then switched to "Mark's POV" as we saw his view from the lens display. You could feel the confidence in his step as he walked to deliver the keynote, fist-bumping Diplo along the way. The event ended less than an hour later with Zuckerberg's shoulders sunk after the latest Meta Ray-Bans failed every live demo. From Meta's Menlo Park campus, Zuckerberg revealed three pairs of Meta smart glasses that were already accidentally unveiled on its YouTube channel earlier this week. The irony of AI proving comedy is not dead First up was the latest generation Meta Ray-Ban, still following the iconic Wayfarer design. Zuckerberg said they now have double the battery life and 3k video recording. They're $379 and on sale right now. One innovation is an accessibility-friendly feature called Conversation Focus that will come to all existing Ray-Ban Metas with a software update. Conversation Focus amplifies the voice of whoever the wearer is speaking to in a noisy space. Zuckerberg played a video of street style photographer Johnny Cirillo of Watching New York demonstrating how it works. The preternaturally cool Cirillo strolled up to someone on a busy New York City street and to tune in to their conversation he said, "Hey, Meta, start Conversation Focus" and the words of his companion grew louder as the street sounds faded into the background. There was no such smoothness when things shifted to a live demo of a new Live AI feature. Zuckerberg said Live AI is designed to provide AI assistance in nearly every aspect of the lives of those who wear Meta's Ray-Bans. He did hedge and say that the feature is "not there yet" in that it cannot run all the time, but he said it currently can be used for an hour or two straight. 
To show off its current capabilities, Zuckerberg chatted with a chef on a screen and asked him to use it to make a Korean steak sauce with the ingredients he had set out before them. "Hey, Meta, start Live AI," the chef said and then asked it to help create the condiment. Live AI drifted off into a description of what is generally in the recipe and the nervous-looking chef bounced from foot to foot, and asked, "What do I do first?" After a very long pause, Live AI said, "You've already combined the base ingredients." The chef tried his request again and Live AI, like a culinary HAL 9000 repeated, "You've already combined the base ingredients." The chef quickly tried blaming the Wi-Fi, a cue Zuckerberg picked up on and repeated over and over again as he grew visibly anxious. Sports saves the day Zuckerberg moved on to more stable territory with a pair of glasses that did not rely on a live demo, the new Oakley Meta Vanguard. The shield-style glasses are designed for performance. Zuckerberg said that someone can run two marathons on them and not be out of battery. The camera is centered for shot alignment, there's a 122-degree field of view, and video capture is 3k with slow-motion and hyperlapse features that will roll out to all of Meta's smart glasses. There's also wind noise reduction for calls so that surfing shouldn't stand in the way of handling meetings. New partnerships with Garmin and Strava let users capture and share with their respective communities. Zuckerberg said the Oakleys are the most water-resistant glasses Meta makes with an IP67 rating. He said he has taken them out surfing. The Oakley Meta Vanguard sells for $499. It can be pre-ordered now and ships on Oct. 21. It's all in the wrist Last up was Meta Ray-Ban Display, the glasses Zuckerberg strutted to the stage in. 
They have the look of a chunkier Wayfarer and feature new technologies, including a high-resolution display on the right lens and a whole new way to interact with glasses, a wristband called the Meta Neural Band. He said the set is $799 and will be available only in stores starting Sept. 30. Zuckerberg looked proud again as he hyped the glasses display that's large enough to watch a video or a few messages and disappears when not in use. What's on screen is sharp, with 42 pixels per degree, and bright, at 5,000 nits. But Zuckerberg really wanted to show off what he was wearing on his wrist, the Neural Band that he said is "a huge scientific breakthrough" that has the ability to work with what's on the display through barely perceptible movements. He said it has 18 hours of battery life and is water-resistant. And then came the moment Zuckerberg undoubtedly looked forward to for perhaps years but now seemed to dread. "We've got the slides," he said and paused. "And the live demo." The audience cheered the words "live demo" like the prospect of lions in the Colosseum and in a shaky voice, Zuckerberg said, "Let's do it live." The demo kicked off with a notification from Meta Chief Technology Officer Andrew Bosworth. "Boz is messaging me," Zuckerberg said. He then typed back a message with a few subtle taps on a surface and told the audience that he is now up to 30 words a minute. In texts they agree on a video call. A loud ring echoed through the room and Zuckerberg repeatedly tried to answer it. Finally, he said to the audience, "That's too bad. I don't know what happened." He tried again and again. "This is uh. It happens," he said. "Let's go for a fourth." The ringing continued. And continued. "Five times," he said. "I don't know what to tell you guys." Boz then emerged from backstage where they decided to show off how the glasses could essentially put subtitles on in-person conversations. 
After a glitchy start, the feature kicked in and transcribed everything Zuckerberg was saying onto the display of Boz's glasses. Boz said the glasses could also handle live language translation in text, but there was no demonstration. Zuckerberg then took a few photos of Boz to show how the display feature lets people preview shots before they take them and go through them after. To end things, a pre-taped video of friends bumping into each other played to show how Live AI on Ray-Ban Displays could be used agentically but naturally to follow up on everything they discussed. With Ray-Ban Display, you will never be able to say "let's have coffee" to that person you ran into and have things just end there. All kidding aside While the demos failed, reviews of Meta's smart glasses are out there and they are largely positive. The Verge's senior wearable tech reviewer, Victoria Song, titled her review of the Meta Ray-Ban Display, "I regret to inform you that Meta's smart glasses are the best I've ever tried." There is tremendous value for many in the accessibility features that Meta offers with its standard Ray-Ban model and the Display. Conversation Focus and transcription are a huge help for those with hearing impairment or those who have difficulty focusing. And Live AI could presumably assist those with visual impairment in navigating, interpreting, and interacting with the world around them. Meta's smart glasses are currently the most popular of their kind and these latest offerings look like they'll cement that status.
[19]
Meta launches Ray-Ban Display glasses for $799, but you can't buy them online
Meta is launching its first smart glasses with a built-in display, with the new "Meta Ray-Ban Display" glasses launching later this month for $799. Smart glasses have been growing in functionality and popularity in recent years, especially built on the back of Meta's Ray-Ban partnership. The Ray-Ban Meta smart glasses have been a smash hit, spawning a sibling earlier this year in the Oakley Meta smart glasses, and now a sequel. But, those glasses only have a camera, speaker, and a touchpad. Now, Meta is launching its first pair of glasses with a display. That's where the Meta Ray-Ban Display glasses come into the picture. With a familiar, but thicker design, the Meta Ray-Ban Display glasses can show an interface off to the right side of your view that lets you interact with Meta AI, see a camera viewfinder, control music, and both send and receive messages. The "hi-res" 600 x 600 display is full-color. You can expect 6 hours of "mixed use" battery life, and up to 30 hours with the included charging case. To control it, you'll use Meta's "Neural Band" which can recognize your hand movements - specifically the muscle signals, not just the movements themselves - to control the interface. Meta will even let you "type" with it. The Neural Band has 18 hours of battery life. Meta explains: Meta Ray-Ban Display glasses are designed to help you look up and stay present. With a quick glance at the in-lens display, you can check messages, preview photos, see translations, get help from Meta AI, and more -- all without needing to pull out your phone. It's technology that keeps you tuned in to the world around you, not distracted from it. This breakthrough category of AI glasses comes with a full-color, high-resolution display that's there when you want it -- and gone when you don't. The display is placed off to the side, so it doesn't obstruct your view. 
And it isn't on all the time -- it's designed for short interactions that you're always in control of. This isn't about strapping a phone to your face. It's about helping you quickly accomplish some of your everyday tasks without breaking your flow. Meta says Ray-Ban Display glasses will go on sale starting on September 30 - less than two weeks from now. They'll cost $799 for the duo of the glasses and the Neural Band. But there's a catch. While you will be able to go buy them, you can't do so online. Meta will be selling these exclusively in-store starting in the US. You'll be able to find Ray-Ban Display at "limited" stores including Best Buy, LensCrafters, Sunglass Hut, and of course Ray-Ban stores. Select Verizon stores will also start selling the glasses "soon." Along with the ability to purchase a pair, you'll also be able to get a hands-on demo at these locations. Other countries including Canada, France, the UK, and Italy will follow in "early 2026." Meta says that this in-store restriction is to "make sure customers get the glasses and band that's perfect for them," but that buying options will expand "over time." Colors include Black and Sand with transition lenses.
[20]
I went hands-on with the Meta Ray-Ban Display -- 5 features that will actually make them worth $800
Earlier this week, I attended Meta Connect and had a chance to go hands-on with the Meta Ray-Ban Display, the newest set of smart glasses from Meta. The defining feature is an in-lens display, which works with a neural wristband that translates small gestures into controls for navigating the interface. I only had a few minutes with the Ray-Ban Display, but came away largely impressed. The screen is bright and crisp, and while there's a learning curve, the hand gestures felt very intuitive. However, there are a few things I'd want to try with them first to see if they're really worth the $800 investment. Here are the five things I'm most interested in trying with the Meta Ray-Ban Display. One of my least favorite things about navigating somewhere unfamiliar with my phone is that I'm constantly looking at its screen to see if I'm going in the right direction. While you can use one of the best wireless earbuds or even Meta's smart glasses to pass along guidance from your phone, I'm looking forward to having it in the glasses themselves. According to Meta, the Ray-Ban Display will launch with pedestrian turn-by-turn walking directions in select cities, and will expand this feature over time. My hope is that, in addition to voice guidance, the glasses will also show arrows or some other visual indicator of which way to turn. This was one of the demo fails at Meta Connect, so I can't wait to see how it actually works. If someone calls you via WhatsApp or Messenger, you'll be able to see their face in the Ray-Ban Display's screen, and they'll be able to see what you're looking at via the glasses' camera. (Of course, this will mean that they won't be able to see you, unless you're looking at a mirror.) I can see this feature being handy if you want to, say, livestream your kid's recital to a relative who can't attend, or get help from your mom if you're trying to prepare a meal. Otherwise, your callers might be miffed that they can't see your face. 
I think one of the best uses for AI on mobile devices is live translation. Having a universal translator at the ready is really helpful when you're traveling to a new country. The Meta Ray-Ban smart glasses have had live translation in them for a while (they are getting more languages via an update), but the Ray-Ban Display takes it a step further, by displaying the translated text in front of you. In addition, they can also be used to help those who have trouble hearing, by showing live captions of the person talking. (It also works for ad-hoc subtitles in movies and TV shows). I tried the live captions in my demo, but really want to see how the other functions work in the real world. In terms of functionality, the Ray-Ban Display's Neural Band is as important as the screen itself. Otherwise, navigating the display would be a really tedious affair. The Neural Band wraps around your wrist, interprets muscle movements you make with your fingers, and translates them into commands to move around the display. In my demo, the things I tried worked pretty well: for example, if I was in the music app, I could pinch my fingers and rotate my wrist to adjust the music volume. I was surprised at how quickly it responded. However, I think the real test will be with typing. When using the Ray-Ban Display's chat apps, you'll be able to tap your fingers on pretty much any surface to type out a message. Zuckerberg claimed he could get up to 30 words per minute, so while you're not going to use these glasses to write the next Great American Novel, it could turn the Displays into a very effective communications platform. I've tried my share of virtual keyboards over the years -- none of which have been great -- so I really want to see how well it performs on the glasses. And, as someone who likes playing the piano, I'm hoping a developer can come up with another sort of virtual keyboard, so I could practice my Beethoven and Brubeck. 
During his presentation, Zuckerberg stressed that one of the pillars for all of Meta's smart glasses is that they have to be comfortable to wear for long periods of time. More so than Meta's other smart glasses, the Ray-Ban Display are meant to be a wearable computer that you use all day long -- or at least until their 6-hour battery needs recharging. To that end, the Displays all come with transition lenses, and you can even get them with prescription lenses. But it also comes down to weight and fit. At 69 grams, they're significantly heavier than the Ray-Ban Meta gen 2 (51 grams), and they have a chunkier frame. By comparison, my favorite non-smart sunglasses, the Smith Lowdown, weigh a scant 45 grams. So, I really want to see if they do feel comfortable for me to wear as long as I would my regular glasses. The Meta Ray-Ban Display go on sale on September 30, and could be on their way to being the best smart glasses around. However, a lot rides on how well all of their new features perform -- not just the ones I highlighted here, but especially Meta AI. I suspect that using the AI on the glasses won't be the primary reason people pick them up; things like voice calling, music, photography and other tasks we primarily use our phones for will be the more popular apps on the Ray-Ban Display. Of course, it will all come down to the comfort of the glasses themselves; no one's going to use them if they can't stand wearing them. We hope to try out the Ray-Ban Display more fully, so rest assured we'll try all these features out -- and more.
[21]
Meta Launches AI Glasses
Meta this week unveiled $800 smart glasses that include an in-lens display. The new Meta Ray-Ban Display AI glasses allow you to check messages, view photos, and interact with Meta AI without the need to use a smartphone. There is a full-color, high-resolution display included in the glasses, along with cameras, microphones, and speakers. Meta says that the included monocular display has a custom light engine and custom module that provides sharp contrast and high brightness. There are 42 pixels in each degree of the field of view. Meta placed the display off to the side to prevent it from obstructing the view through the glasses, and the display is also not designed to be on constantly. It is meant for short interactions. The AI glasses are meant to be used with the Meta Neural Band, a wristband that interprets signals created by muscle activity to navigate the features of the glasses. With the band, you can control the glasses with subtle hand movements, similar to how Apple Vision Pro control works. Meta is offering the glasses in black and sand, and they are designed to look like Ray-Ban Wayfarers, but with a thicker temple arm. There are two available sizes, a standard size and a large size, and the band comes in three sizes. All versions of the glasses include Transition lenses, allowing them to be used both indoors and out. The AI glasses have a six-hour battery life, but that can be extended to up to 30 hours with an included charging case. The Neural Band has an 18-hour battery life. Meta says that Meta AI can show wearers answers and step-by-step how tos, with the glasses also able to handle text messaging and video calling. There are camera viewfinder and zoom features for taking photos, along with phone-free walking directions and options for listening to music. Live Captions are available for translating speech in other languages in real-time. The Meta Ray-Ban Display is priced starting at $800, with the Meta Neural Band included. 
The glasses will be available starting on September 30. Meta is offering in-person demos for those who want to purchase the AI glasses.
[22]
Meta's Ray-Ban Smart Glasses Now Have a Screen and a Magic Wristband
Finally, Meta gave its smart glasses the one thing people want the most. Meta's Ray-Bans are back with a new generation, and this time they're finally giving people the one thing they really want -- a screen. At Meta's annual Connect developer conference, the company officially took the wraps off its Meta Ray-Ban Display, which are, as the name suggests, its first pair of AI-infused smart glasses to come with a full-color in-lens display. The smart glasses, which still bear the same Ray-Ban branding, will cost $799 and are available for preorder today. As you might expect, they can do quite a few things that their predecessor can't, including message notifications, turn-by-turn navigation, and telling you when queries to Meta AI are processing. There are several app integrations, including WhatsApp and Instagram, allowing you to watch reels and make video calls natively in the glasses. One major upgrade on the message notifications front is that the Meta Ray-Ban Display will not be limited to only WhatsApp, meaning it will be able to show notifications on both iOS and Android devices. That's not the only major shift in this generation. Meta says its first-ever display has a 600 x 600 resolution and 20-degree field of view. The display is monocular, which means it's only in one lens -- at the bottom right-ish corner -- and has a refresh rate of 90Hz. Brightness goes up to 5,000 nits and as low as 30 nits, which makes them usable outdoors in full light. One of the coolest parts of the display is that Meta claims that there's less than 2% light leakage, which means that you can't see when someone has their display activated. Speaking of light leakage, all of Meta's Ray-Ban Display smart glasses will come with transition lenses. On one hand, that feels like a weird choice, but it also makes sense since this is a gadget you're going to want to use indoors as well as out, and for $800, you should be able to use them for as long as you like without having to take them off. 
"As long as you like," in this case, will be no more than 6 hours, according to Meta, which is a lot longer than I was expecting. That solid battery life is thanks in part to what Meta is calling "ultra-narrow steelcan batteries." I wish I knew exactly what that meant, but for now, I can only look forward to getting more of a deep dive in the future. Glasses are only half the appeal of Meta's Ray-Ban Display, though. The other half is its sEMG wristband that you use to control the UI in the glasses. The Meta Neural Band, as Meta is calling it, is arguably the most innovative part of its new Ray-Ban package, since no other product like it exists on a commercial scale. Outside of being a first, it also offers a potential solution to a problem that no other maker of smart glasses has quite solvedâ€"that problem being, how the hell do you actually use smart glasses? While most smart glasses (Meta’s first-gen Ray-Bans included) have a voice assistant for shouting commands like “take a picture†and a fairly simple touch-sensitive bar for physical inputs (i.e., pause/play), neither is ideal in every situation. The fact is, adding a screen complicates smart glassesâ€"the more you can do, the more you’ll need to convey to your glasses, and in order to do that, you need an input system as nuanced as the eyewear itself. Not only that, but if you want to use your smart glasses discreetly (or in a normal fashion at all, really), shouting into a crowded subway car is less than ideal. With the Neural Band, however, you can navigate the UI discreetly by pinching, swiping, and tapping through various menus in the smart glasses. My favorite gesture is a pinch to zoom for taking photos and videos. It's Vision Pro-esque, but all done without cameras. In case you're wondering, yes, the neural band is included in that $800 cost. I got a chance to use Meta’s Ray-Bans and its new Neural Band, and you can read my full impressions here. 
Like previous iterations of Meta's Ray-Ban glasses, this year's edition will also come equipped with cameras and speakers. Camera-wise, Meta is going with a 12-megapixel ultra-wide sensor that can also capture video at 30fps in a 1,440 x 1,920 resolution. There's also a 3x digital zoom. The camera is used for the computer vision in the glasses, aka Meta AI, as well. Despite being a fan of Meta's Ray-Bans (they're the only device I ever want to take calls with), Meta AI has been a weak spot for me. While the voice assistant works well most of the time for basic stuff like taking pictures and videos, playing Spotify, and asking what your battery life is, the heavier AI lifting is hit-or-miss at best. Whether Meta's new smart glasses fix that remains to be seen, since I haven't had a chance to use them, but I'm hoping for an upgrade here. But even if the AI is still finicky and the cameras and audio are about the same, these smart glasses still have a freaking screen. That's a big step forward, even if functionality is limited for now. When people ask me about my first-gen Ray-Bans, the first thing they want to know is whether they have a display in them, and they're inevitably very disappointed when I have to let them down. Now, I'll actually have something to show them, and if Meta's wristband works, I'll even have something to show them that only Meta can provide.
[23]
Despite awkward demos, Meta Ray-Ban Display early testers say it's the real deal
You may have heard that Meta announced some high-tech smart glasses this week. You also may have seen a viral demo video featuring Mark Zuckerberg where the product didn't work at all. Like, at all. The good news is that's apparently not always the case. The two major differentiators between these glasses and the other Meta Ray-Ban specs are that there's now a built-in display, and it comes with a fancy, futuristic wristband that lets you use hand gestures to navigate around apps on said display. Some members of the tech press got hands-on sessions with the new Meta Ray-Ban Display glasses, which retail for a whopping $799. Spoiler alert: Apparently, these smart AR glasses are the real deal. Let's start with Mike Prospero of Tom's Guide. Prospero compared the Ray-Ban Display glasses to other competitors with built-in displays, and one of the bigger things that stood out to him was that the display required no tweaking or tinkering to make it look right. "I've used smart glasses and goggles with built-in displays before, and they've all been so-so: You have to position the glasses just right, or make minute adjustments, or else you can't see anything," Prospero wrote. "Not so with the Ray-Ban Display: The screen was front and center from the moment I put the glasses on my face, no fussin' or mussin' needed." Ray-Ban Display comes with a 600x600, 90Hz display in the right lens with a peak brightness of 5,000 nits, which should make it usable in most conditions. Michael L. Hicks of Android Central said the brightness was adequate, claiming he never had issues with readability or anything like that. Perhaps most importantly, Hicks also said other people can't see what you're doing from the outside looking in. "With Meta Ray-Ban Display, I got up close to Meta's engineer wearing a second pair, looked from various angles, and couldn't see any light leakage," Hicks wrote.
"That ensures privacy: you can glance at messages or videos without it being obvious (or rude) to people nearby." It sounds like these were very brief and cursory demo sessions lasting about 15 minutes, so there are lots of features and other aspects of the Ray-Ban Display that will have to wait until the device is out for further analysis. That said, the one thing everyone surely wants to know about is the Neural Band, a wrist-worn control mechanism for the glasses. It uses hand gestures for navigational control, which could be iffy if not done properly. According to Gizmodo's James Pero, the controls feel "like a bit of magic" when they work, but it sounds like it's not perfect just yet. "Personally, I still had some variability on inputs -- you may have to try to input something once or twice before it registers -- but I would say that it works well most of the time (at least much better than you'd expect for a literal first-of-its-kind device)," Pero wrote. "I suspect the experience will only get more fluid over time, though, and even better once you really train yourself to navigate the UI properly." The last thing from these hands-on pieces that's worth mentioning here is live voice transcription for in-person conversations. According to Pero and others, this works well and could eventually be used for live translation across languages in the future. However, Pero had mixed results when it came to the glasses transcribing one person's speech while others were speaking nearby. "While the transcription focused mostly on the person I was looking at, it still picked up stray words here and there," Pero wrote. "This feels like a bit of an inevitability in loud scenarios, but who knows? Maybe beamforming and AI can fill in the gaps." Meta's Ray-Ban Display glasses go on sale on Sept. 30 for $799.
[24]
Meta Unveils First Ray-Ban Smart Glasses with Built-in AR Display
Meta unveiled "Meta Ray-Ban Display," its first pair of AI smart glasses with a built-in augmented reality (AR) display, at the company's annual developer event. Meta Connect 2025 -- the company's developer conference -- took place at its headquarters in Menlo Park, California, on Wednesday. The event featured a keynote from Meta CEO Mark Zuckerberg, where he announced several new smart glasses products. He also introduced a neural wristband designed to work with the new Meta Ray-Ban Display glasses, enabling users to complete tasks such as sending messages with small hand gestures. "Glasses are the only form factor where you can let AI see what you see, hear what you hear, and eventually generate what you want to generate, such as images or video," Zuckerberg said during the keynote. CEO Mark Zuckerberg announced the new product, called Meta Ray-Ban Display, onstage -- it will launch on September 30 for $799. The glasses are equipped with a 12-megapixel camera and a full-colour high-resolution display built into one lens, allowing users to view calls and messages. The new glasses have a small digital display screen in the right lens for basic tasks such as notifications, messages, and Meta AI prompts. The display also functions as a viewfinder when taking photos. The Meta Ray-Ban Display glasses include built-in audio, up to six hours of battery life, and come bundled with the new neural wristband, which translates hand gestures into commands such as replying to texts or answering calls. Zuckerberg also announced the Oakley Meta Vanguard smart glasses, designed for runners, cyclists, and other athletes. These glasses will be released on October 21 for $499. Unlike earlier Meta models with two corner-mounted cameras, the Vanguard has a single wide front lens. It can record video in up to 3K resolution and features a 12-megapixel camera with a 122-degree wide-angle lens.
The Oakley Meta Vanguard also includes a programmable button that can trigger custom AI commands through the Meta AI app. All buttons are positioned underneath the frame to ensure the glasses can be worn comfortably with helmets. The device offers up to nine hours of battery life, or six hours of continuous music playback, and can be charged to 50% in about 20 minutes using the included charging case. While these products will be available in the coming weeks, it will still be several years until Meta releases its first pair of AR glasses, "Orion." Zuckerberg has called the prototype "the most advanced glasses the world has ever seen," and it is anticipated as the culmination of the Meta CEO's longstanding vision for the "metaverse."
[25]
Meta's new $800 smart glasses with display likely to take center stage at Connect event
MENLO PARK, California, Sept 17 (Reuters) - Meta (META.O) is expected to launch on Wednesday its first consumer-ready smart glasses with a built-in display, seeking to extend the momentum of its Ray-Ban line, an early consumer hit of the artificial intelligence era. The new glasses will likely be named "Celeste" and paired with Meta's first wristband for hand-gesture controls, though their expected $800 price tag may dissuade some buyers, analysts said. CNBC has reported the glasses could feature Prada branding. The event began at 5 p.m. PT (8 p.m. ET). The glasses are expected to include a small digital display in the right lens for basic tasks such as notifications, along with features available on its existing Ray-Ban and Oakley smart glasses such as an AI assistant, cameras, hands-free control and livestreaming to its social media platforms including Instagram. While Meta has been at the forefront of developing smart glasses, it trails rivals such as OpenAI and Alphabet's (GOOGL.O) Google in rolling out advanced AI models. CEO Mark Zuckerberg has kicked off a billion-dollar talent war to poach engineers from rivals and promised to spend tens of billions of dollars on AI chips. The launch at Meta's annual Connect conference, held at its Menlo Park, California, headquarters, comes amid scrutiny over Meta's handling of child safety on its social media platforms. Reuters reported in August that Meta chatbots engaged children in provocative conversations about sex and race, while whistleblowers said earlier this month that researchers were told not to study the harmful effects of virtual reality on children.
For Zuckerberg, whose massive bet on the metaverse has so far resulted in losses of tens of billions of dollars, smart glasses are the ideal device for superintelligence - a concept where AI surpasses human intelligence in every possible way - because they serve as a personal, always-on interface that can see, hear, and interact with the world through the user's perspective. Analysts have said Meta's new glasses are a step toward the planned 2027 launch of its "Orion" augmented reality smart glasses, whose prototype was unveiled last year. Zuckerberg described the product as "the time machine to the future." "It wasn't long ago that consumers were introduced to AI on glasses and in recent quarters brands have also begun to include displays, enabling new use cases," said Jitesh Ubrani, research manager for IDC's Worldwide Mobile Device Trackers. "However, consumer awareness and product availability of AI glasses with display remains limited. This will change as Meta, Google, and others launch products in the next 18 months." Google said in May it had partnered with Warby Parker (WRBY.N) to develop an AI-powered eyewear line, with the first products expected after 2025. As part of the deal, the tech giant committed to an equity investment of up to $150 million in the eyewear brand. IDC forecasts worldwide shipments of augmented reality and virtual reality headsets, and display-less smart glasses, will increase by 39.2% in 2025 to 14.3 million units, with Meta driving much of the growth thanks to demand for the Ray-Bans it makes with EssilorLuxottica (ESLX.PA). Reporting by Aditya Soni and Echo Wang in Menlo Park, California; Editing by Sayantani Ghosh and Matthew Lewis
[26]
Meta Ray-Ban Display glasses sell personal superintelligence. I'll skip
A great idea on the surface, until you really see the machine it's feeding. Earlier this week, Meta launched a product that tech pundits and evangelists are calling the future of personal computing. Mark Zuckerberg also proudly claimed that "glasses are the ideal form for personal superintelligence." All this chatter was about the new Meta Ray-Ban Display, a pair of smart glasses with a built-in screen on the right lens. It's like the Google Glass, once again, but in an era where smart glasses are not really a social taboo and the tech stack is far more mature. These glasses let you take video calls, see translations, answer voice calls, view step-by-step map navigation cues, and handle messaging. And then there's the Meta AI integration. Say you see a bunch of flowers and don't know which one is an exotic purple lily. Just summon Meta AI; it will look at the world in front of you through the onboard camera and highlight the right flower. The AI can make sense of text, pictures, and videos fed through the onboard camera. In short, it's multi-modal AI. A good start, at best All of it sounds like a phone that sits on your face. And while at it, the smart glasses don't look dorky. It's the quintessential Ray-Ban style, just a bit thicker. They cost $799, exactly the same price you pay for the excellent iPhone 17. So, should you ditch the phone and get them? Contrary to what you might see in the "future is here" marketing videos, the answer is still no. A lot of the core communication features that unfold on the built-in display of these glasses are fundamentally tethered to the phone. You can't set up a WhatsApp account from scratch on the Meta smart glasses. Or Instagram and Messenger, for that matter. They can't take a voice call unless there's a phone with a SIM and carrier network set up on the device. The phone is still very much a bridge you can't forego.
At the same time, these glasses are inching pretty close to being a standalone personal communication device of their own. They've come closer to that vaunted goal than any other product out there, AI-powered or otherwise. But what I can't quite ignore is the chatter about "personal superintelligence," and its tangible risks. The hype of personal superintelligence I was pretty surprised when Mark Zuckerberg broached the topic of personal superintelligence on stage. It isn't the first time for the company, however. Meta's press material claims that the goal is in sight, and that "glasses will bring personal superintelligence into our lives." Back in July, Meta published a brief titled "super intelligence," and it brings up the idea of personal devices such as smart glasses. "I think personal devices like glasses that can see what we see, hear what we hear, and interact with us throughout the day are going to become our main computing devices," Zuckerberg said in a video about a month ago. At the same time, the company has also repeatedly violated its commitment to user privacy and been slapped with massive fines worth billions of dollars over the years. It's pretty surreal that in the same month Meta published its personal superintelligence notice, the company also settled an $8 billion privacy case in a Delaware court. That vision is pretty scary. Remember, Meta is the same company that has historically hoarded the biggest cache of our personal data through social media sites such as Instagram and Facebook. The company's Pixel network further tracks our activities across the web. Feeding your life to a bad machine Now, imagine a live feed of your life captured by the Ray-Ban Display glasses' onboard camera, your social media interactions on the glasses, and your conversations with Meta AI, all falling into the company's lap. History says it's a bad idea. In the age of AI, things are going to get worse.
Check out this line from a Meta announcement in April this year, which should tell you what's at stake with the concept of "personal superintelligence" driven by Meta's AI glasses: "Today, we're announcing our plans to train AI at Meta using public content -- like public posts and comments -- shared by adults on our products in the EU. People's interactions with Meta AI - like questions and queries - will also be used to train and improve our models." Except for the contents of your private messages, Meta can use for AI training what is "publicly available online and licensed information," "posts or photos and their captions" shared on social media, and even personal information such as phone numbers that Meta has obtained elsewhere from the internet or its licensed data providers. In fact, even if you (or your information) appear in content shared by another person, Meta will use that data as well. "Even if you don't use our Products or have an account, we may still process information about you to develop and improve AI at Meta," says the company. In a nutshell, Meta has already hoarded your personal data. And whatever comes out in the future is also kosher for AI training. The camera- and display-equipped glasses just act as another channel for feeding text, audio, and visual data to the "AI training engine" at Meta. The big AI problem That's not where the problems end, unfortunately. Meta AI, in itself, should give you plenty of reason to stay on the fence. A recent safety study highlighted that Meta's AI chatbot -- which is available across Instagram, WhatsApp, and Facebook -- instructed "teen accounts on suicide, self-harm and eating disorders." As per internal documents seen by Reuters, policies governing Meta AI allowed the chatbot to "engage a child in conversations that are romantic or sensual," create fake medical information, and tell users that Black people are "dumber than white people."
Last year, soon after Meta announced that it was putting Meta AI across its social platforms, experts warned that the chatbot could worsen existing problems such as extremist content, hate speech, and harmful misinformation. Serving Meta AI on something like the Ray-Ban Display glasses, especially without full-fledged teen safety and parental controls, is like a ticking time bomb of "personal superintelligence problems." How do parents even regulate or monitor the usage of Meta AI on smart glasses? And let's not forget the inherent problems of generative AI chatbots, such as hallucinations, biases, manipulation, social withdrawal, and privacy scares, to name a few. Overall, Meta's ambitions with AI smart glasses may be benign, but the product is sitting on a pile of dangerous rubble instead of rock-solid foundations. I am not sure I will ever feel easy making the Ray-Ban Display a part of my daily routine despite its practical benefits. I will be even more wary if I see my younger siblings, friends, or family members wearing them, especially given Meta's historically lackadaisical approach to privacy and digital harms.
[27]
Meta Ray-Ban Display hands-on: This is the future
The Meta Ray-Ban Display could mark a real inflection point for smart glasses. These new specs not only have a built-in display, but also use clever gestures via a wristband to control the device. Combined, they greatly expand the capabilities of what an AI-powered pair of smart glasses can do, and will be the product by which its competitors are measured. I went hands-on with the Meta Ray-Ban Display smart glasses at Meta Connect, and even with just a brief demo, I think these are one of the most innovative products I've seen in a while. The Meta Ray-Ban Display will go on sale September 30 for $799. You'll be able to get them in black or sand; all models will feature transition lenses, and you can also get them fitted with prescription lenses. The glasses will come in two sizes, standard and large. According to Meta's site, you'll also need to schedule a time to get your wrist measured for the Neural Band, to ensure a snug fit. The Ray-Ban Displays look like chunkier versions of the Ray-Ban Meta gen 2 glasses; the frames and arms themselves are a little thicker, and they're a bit heavier. I definitely noticed the difference after wearing the gen 2s around most of the day. As with Meta's other Ray-Bans, the Display's camera is in the right corner of the frame, with an action button and touch controls along the right arm. Since the Displays are designed to be worn all day, all models come with transition lenses, though only two frame colors (black and sand) will be available at launch. The Displays do come with a more compact charging case, however. When not in use, it folds down nearly flat, which will make it far easier to carry in your pocket.
I've used smart glasses and goggles with built-in displays before, and they've all been so-so: You have to keep the glasses on just so, or make minute adjustments, or else you can't see anything. Not so with the Ray-Ban Display: The screen was front and center from the moment I put the glasses on my face, no fussin' or mussin' needed. The screen is on the right side, and sits just below your field of vision, but not so much that you have to look all that far down to see it. At the same time, it's not so big as to be overly distracting, and Meta designed it so that it will disappear after a few moments if you're not actively using it. With a resolution of 600 x 600 pixels, it's quite sharp, and you can increase its brightness to 5,000 nits; I wasn't able to test it in broad daylight, so that'll have to wait until our full review. Central to the Ray-Ban Display is the Neural Band, a small wrist strap that monitors your wrist movements and translates them into inputs for the display. The band is made of cloth, and looks kind of like a Whoop strap. It has a slight bulge along the top, but it felt pretty comfortable to wear. There's a little bit of a learning curve to getting the gesture controls right, but after about five minutes, I was making the correct movements about 80 percent of the time. To navigate the display left and right, you slide your thumb along the top of your index finger, and move it up and down to move the controls vertically. Tapping your index finger to your thumb activates some controls, while tapping your thumb and middle finger hides or reveals the display. You can also pinch and rotate your fingers to control volume or zoom the camera. Once I got the hang of it, it seemed almost magical. The Ray-Ban Display is outfitted with a 12MP camera that can take photos up to 3024 x 4032 pixels, and 1440p videos at 30 fps; it's sort of in-between the Meta Ray-Ban gen 2 and the Oakley Vanguard.
One of the neatest uses for the display is as a preview window for the camera, which made it really easy for me to frame photos. Even cooler is that you can rotate your fingers to zoom in and out (the camera has a 3x zoom). It worked flawlessly. These wouldn't be smart glasses without AI, right? Just like Meta's other smart glasses, you can point the Displays at something and ask what it is, but instead of getting just a voice overview, the glasses will also show some more information on-screen. For example, I looked at a poster of the Golden Gate Bridge and asked Meta AI what I was looking at. It not only correctly identified San Francisco's most famous landmark, but also showed some more information about the bridge on the screen, with two suggested prompts below. Having trouble hearing someone in a crowded room? One of the new features of Meta's smart glasses is the ability to boost the voice of the person you're talking to, but the Ray-Ban Display adds an extra level with voice captions. When I was wearing the glasses, they correctly transcribed what a person in front of me was saying, with about a split-second delay. They were also able to ignore someone else talking in the background -- until they got a bit too loud. I'm looking forward to testing the live translation feature of the glasses. I used the Meta Ray-Ban Display for around 15 minutes, but there was so much more I wanted to try with them. I'll have to wait a few weeks for that, though. The combination of the display and gesture controls really expands what you can do with smart glasses; these feel like the great-great-grandchild of Google Glass, but so much more. The real test with these glasses will be whether I can stand wearing them all day; will they be comfortable and useful enough to become an accessory that's not just convenient, but vital? That depends as much on their design as their interface and AI, but I'm impressed so far.
[28]
Everything to know about Meta's latest AI smart glasses
Superintelligence in Meta's AI smart glasses isn't quite here yet. But the tech giant says it's getting there. Meta has unveiled its latest Ray-Ban artificial intelligence-powered smart glasses, which have all the usual upgrades but, most interestingly, add a tiny display to watch videos and see texts, as well as a wristband that controls the glasses. Here are all the latest releases from the company unveiled at its Meta Connect 2025 event overnight. An alternative to the smartphone? The AI smart glasses now have a tiny display and a neural wristband that picks up on subtle hand gestures, allowing users to control the device with their hand movements. Meta CEO Mark Zuckerberg said that users can control the glasses through the wristband with "barely perceptible movements". The wristband is called the Meta Neural Band. "Glasses are the only form factor where you can let AI see what you see, hear what you hear" and eventually generate what you want to generate, such as images or video, he added, speaking at the tech giant's Menlo Park, California, headquarters. The so-called Meta Ray-Ban Display glasses will be available Sept. 30 and cost $799 in the United States. The new glasses allow users to watch videos through the display and even see and respond to text messages, Zuckerberg said. But the display doesn't block a person's view and disappears when it's not being used. The new smart glasses are a bridge between the company's audio-only Ray-Ban Meta smart glasses and the experimental Orion augmented reality glasses that the company revealed at last year's Connect event. Orion can overlay 3D visuals over a person's real-world field of view with the help of a wireless computing puck, but the glasses are expensive to make and not yet available to consumers.
Orion and 'superintelligence' Meta teased a prototype for Orion, which Zuckerberg called "the most advanced glasses the world has ever seen," last year -- but these holographic augmented reality glasses are still years away from reaching the market. Like other tech companies, Meta has been making massive investments in AI development and hiring top talent at eye-popping compensation levels. In July, Zuckerberg posted a note detailing his views on "personal superintelligence" that he believes will "help humanity accelerate our pace of progress". While he said that developing superintelligence is now "in sight," he did not detail how this will be achieved or exactly what "superintelligence" means. The abstract idea of "superintelligence" is what rival companies call artificial general intelligence (AGI). Zuckerberg has said he believes AI glasses are going to be "the main way we integrate superintelligence". Upgrades to the old Meta also updated its original, display-less Ray-Ban glasses to have a better battery life. Meta says the battery lasts eight hours with typical use, nearly twice as long as the previous model. There will also be a feature called "conversation focus," which will amplify the voice of the person the user is speaking to and help drown out background noise. This will also be available on the older version of the glasses as a software update, Zuckerberg said. There are also new languages added to the live translation feature, including German and Portuguese. The new model costs $379, and the previous model now costs $299. The Oakley Meta smart glasses Meta also unveiled a new set of AI-powered glasses for athletes, called the Oakley Meta Vanguard, which the company says is specifically for "high-intensity sports" and can be integrated with Garmin devices to give users feedback about their workouts, such as their heart rate and stats. For instance, a runner could ask "Hey Meta, what's my heart rate?" and get a voice response through the glasses. 
It also auto-captures video clips when the user hits key milestones or ramps up their heart rate, speed, or elevation. The glasses will cost $499 and go on sale Oct. 21. While the company has not disclosed sales figures of the glasses, it said they have been more popular than expected. "For more than a decade, Zuckerberg's long-term vision with Oculus and the Metaverse has been that glasses and headsets will blur the lines between physical and digital worlds," Forrester analyst Thomas Husson said. "After many false starts, the momentum to move beyond an early adopter niche is now".
[29]
Meta expands AI glasses line in a bet on the future
Menlo Park (United States) (AFP) - Meta showed off new smart glasses on Wednesday as it continued to bank on a lifestyle shift toward blending reality and virtual space, despite those efforts inflicting heavy financial losses. Announcements included the debut of the Meta Ray-Ban Display smart glasses, which have built-in screens that allow wearers to see messages, photos and more as though looking at a smartphone screen. Billed as Meta's most advanced AI glasses, the Ray-Ban Display comes with a sensor-packed bracelet called a neural band that lets people control the eyewear with subtle finger movements, and is priced at $799. "Our goal is to build great-looking glasses that deliver personal superintelligence and a feeling of presence using realistic holograms," Meta chief executive Mark Zuckerberg said as he showed off the new AI glasses at the tech firm's annual developers conference. "These ideas combined are what we call the metaverse." Facebook-parent Meta's chief has predicted that AI-infused smart glasses will be the "next major computing platform," eventually providing access to "superintelligence" and replacing the smartphone. The tech titan began investing heavily in virtual reality and the metaverse about four years ago, with Zuckerberg changing the company's name from Facebook to Meta in late 2021 to reflect the strategy change. But Reality Labs -- Meta's virtual and augmented reality unit -- has consistently posted big losses. The unit lost $4.5 billion in the second quarter of this year on revenue of just $370 million, highlighting ongoing challenges in the metaverse business. "There's no realistic chance that smart glasses sales make this division profitable in the short term," CCS Insight principal analyst Leo Gebbie said of Reality Labs while at the Meta event. "Instead, this is about playing the long-term game to break free from smartphones, where Meta has been throttled by rivals Apple and Google, and to control its own destiny in wearables."
Smart glasses have seemed on the horizon for more than a decade, ever since Google's Glass headset and camera was released in 2013, although it has since been discontinued. Meta has encountered more success with its frames developed alongside Ray-Ban, offering features including a built-in camera, music playback and voice interactions with the company's AI. The global smart glasses market was estimated at nearly $2 billion last year and is projected to reach $8.26 billion annually by the end of the decade, according to analytics firm Grand View Research.
[30]
Mark Zuckerberg unveils Meta's 'AI glasses,' fails demos
Meta CEO Mark Zuckerberg on Wednesday unveiled what he called the company's "first AI glasses with high resolution" -- the Meta Ray-Ban Display, coming September 30 for $799. But the unveiling didn't quite go as Zuckerberg hoped. Zuckerberg's Meta Connect 2025 keynote, held at Meta's California headquarters at the unusually late hour of 5 p.m. Pacific (8 p.m. Eastern), was expected to reveal a groundbreaking pair of smart glasses, codenamed Hypernova. What we got: an upgrade to the preexisting Ray-Ban Meta frames; a new sports-focused set of Oakleys, the Meta Vanguard; and the new model, confusingly called Meta Ray-Bans. "This is one of those special moments where we get to show you something we've poured our lives into," Zuckerberg told a packed house and a livestream with 4,000 viewers. The Meta Ray-Bans had a bright, crisp display rated at an impressive 5,000 nits, he said. Then Zuckerberg revealed not just the Meta Ray-Bans he walked in with (and quickly stashed), but also a companion device called the Meta Neural Band, a light fabric wristband that picks up on small movements in the wrist. This allows you to enter words on the smart glasses display by pretending to handwrite. "I'm up to about 30 words a minute on this," Zuckerberg said. And then the CEO stood helpless as a repeated WhatsApp video call from Meta CTO Andrew "Boz" Bosworth appeared on his glasses. Zuckerberg's Neural Band interface was apparently unable to pick up the call; Boz had to join him live on stage. Zuckerberg's demo game had started strong; the keynote opened with a live view through his Meta Ray-Bans, showing Zuckerberg as he fired up a hype song (the Neural Band also allows for volume control) and replied to incoming texts with a muscle-arm emoji. But then a live demo of the new Ray-Ban Metas (available now for $379) ran aground on the glasses' "Live AI" feature, which was supposed to be instructing one presenter on how to make a sauce with all of the ingredients in front of him.
"Now that you've made your base ..." the glasses began several times, ignoring the presenter's repeated request for instructions on how to make that base: "What do I do first?" Zuckerberg later blamed that demo failure on the WiFi, but he was unable to explain why his Meta Ray-Bans could not pick up Boz's call. Finally, a non-live non-demo video purported to show the Meta Ray-Bans being used to design a surfboard and order parts. Zuckerberg explained this was how the glasses would work with agentic AI, brushing past any concerns about whether agentic AI is a thing that works at all -- in live demos or otherwise.
[31]
Meta announces new smart glasses with a bold claim. I've tried them - and am unconvinced
Meta has announced smart glasses with a screen in the right lens, meaning you can read WhatsApp messages, look at a map or translate a conversation - all from the comfort of your face. The company describes them as the world's most advanced AI glasses, and it's the first time it has put a display in its smart Ray-Bans. Mark Zuckerberg believes such hi-tech specs are the future of portable computing, telling the unveiling event they're "the only form factor where you can let AI see what you see, hear what you hear". The display is controlled using a neural band that wraps around the user's wrist and monitors their hand movements. A twist of the fingers will turn the glasses' volume up or down or zoom in on the camera; two taps of the thumb to the forefinger will close the display, and soon users will be able to write texts by drawing letters in the air. "The amount of signals the band can detect is incredible - it has the fidelity to measure movement even before it's visually perceptible," said a Meta spokesperson. The company says their glasses are "designed to help you look up and stay present", a "technology that keeps you tuned in to the world around you, not distracted from it". But I tried the tech on at an event with Meta earlier this month and found the opposite. I was so distracted by the display that during an interview with Ankit Brahmbhatt, director of product management at Meta, I realised I was watching a game I'd accidentally left on in my lens. I confessed and asked Mr Brahmbhatt if the glasses would actually help enable better face-to-face conversations, or if people would simply look more engaged because they weren't looking down at their phone. "I don't think we're saying we have all the answers yet, right? Just like when we were first introduced to smartphones or any other new paradigm, there's a lot of things that evolve over time," said Mr Brahmbhatt. "We feel very much that this is already going to make you much more heads up and in the moment. 
With AI glasses, you actually have the sense of being able to engage." But having a screen in your vision can be very distracting - research has repeatedly shown our brains simply aren't designed to cope with two activities at a time. In one famous study published in the Applied Cognitive Psychology journal, people walking down a road while doing a task on their phone didn't even notice a unicycling clown riding in front of them. It's a phenomenon called inattentional blindness. "People are cognitively distracted all of the time [by devices]. It's just that in particular circumstances, such as driving, the risks are so much higher," said Professor Gemma Briggs, professor of applied cognitive psychology at the Open University. Although the display on Meta's glasses will turn off automatically when it detects you're driving, there's nothing to stop users from switching it back on. This could lead to dangerously distracted drivers, according to Professor Briggs. "My research has demonstrated that it really doesn't make any difference whether you're touching, holding, manipulating your phone or whether it's hands-free, you're still far more likely to be distracted. "That means you're four times more likely to be involved in a collision, you are significantly less likely to notice hazards that occur, even if they occur straight ahead of you and any hazards that you do notice, you will take a lot longer to react to," said Professor Briggs. Meta insists these glasses will make sure their wearers stay in the moment and engaged. For some, however, phones are already distracting enough - let alone when you have one strapped to your face.
[32]
Meta unveils Ray-Ban Meta Display smart glasses with augmented reality at Meta Connect 2025
Meta combines cameras, AI, and neural gesture controls across three models to deliver messaging, translation, navigation, and performance tracking directly from eyewear. Meta announced three new pairs of AI smart glasses at Meta Connect 2025 in Menlo Park, California, on Wednesday. The lineup includes the groundbreaking Meta Ray-Ban Display with a built-in augmented reality display, the Oakley Meta Vanguard sports glasses, and an updated Ray-Ban Meta AI glasses generation 2. These devices integrate cameras, audio features, and Meta AI functionalities to enhance user interactions during daily activities and sports. The Meta Ray-Ban Display represents the first smart glasses from a mainstream brand to feature a heads-up display since Google Glass launched years ago. The glasses cost $799 and contain a small digital display that can be controlled via hand gestures through a wristband powered by neural technology. This model adopts classic Wayfarer-like styling that conceals technological components while incorporating a camera, speakers, and microphone. The design ensures Ray-Ban Meta smart glasses blend into everyday wear without drawing attention to their smart capabilities. Inside the right lens, a small, bright, and crisp color display projects content that floats just below the wearer's eye line. This display handles text, images, and live video calls, activating only during use while remaining invisible from external view. Meta CEO Mark Zuckerberg presented the Meta Ray-Ban Display smart glasses during the Meta Connect event. He explained the unique advantages of glasses as a wearable form factor, stating, "Glasses are the only form factor where you can let AI see what you see, hear what you hear," and eventually generate content like images or video. The demonstration encountered technical issues, which Zuckerberg linked to the event's Wi-Fi connectivity, highlighting the reliance on stable network conditions for seamless performance. 
Interaction with the Meta Ray-Ban Display occurs through multiple input methods building on elements from prior Ray-Ban Meta models. A touch panel along the arms allows direct tapping and swiping gestures, while voice commands provide hands-free control. The accompanying water-resistant Neural Band bracelet functions like a screenless smartwatch, detecting electrical impulses from the forearm to interpret hand gestures. Supported gestures include pinches for selections, swipes for navigation, taps for confirmations, rotations for scrolling, and a virtual d-pad operated by the thumb. An upcoming software update will add finger-handwriting recognition, allowing users to input text by tracing letters in the air. The Meta Ray-Ban Display smart glasses pair via Bluetooth with Android or iPhone devices, ensuring broad smartphone compatibility. Integration with Meta applications like WhatsApp, Messenger, and Instagram facilitates messaging and video calling directly through the display. Practical utilities include real-time assistance features: the system generates live captions or translations for ongoing conversations, delivers turn-by-turn walking directions overlaid on the display, and provides intuitive music playback controls. The glasses integrate an in-lens screen that allows users to check notifications, view translations, access Meta AI, preview images, and more. Meta introduced the Oakley Meta Vanguard, display-free smart glasses tailored for sports enthusiasts. This model draws inspiration from Oakley's Radar and M-Frame designs, featuring wrap-around frames for enhanced peripheral vision during physical exertion. The Oakley smart glasses weigh 66 grams with swappable lenses for different lighting conditions, water resistance against sweat and rain, and nine hours of battery life per charge. Replaceable nose pads ensure secure fit during intense movements like running or cycling. 
Meta collaborated with Garmin to integrate Oakley Meta Vanguard with Garmin watches and bike computers. This partnership enables voice-activated requests for performance metrics including current speed, pace, heart rate, or distance covered. The camera incorporates automated recording features, triggering video clips upon hitting predefined milestones like completing kilometers or reaching specific speeds. The system compiles clips into highlight reels with overlay data for direct sharing to Strava. Pricing stands at £499 (€549/$499), with shipments beginning October 21. Meta also introduced the Ray-Ban Meta Gen 2 with an improved battery and upgraded camera. This second-generation version doubles battery life compared to the first generation, extending usage time for prolonged activities. The video camera receives a higher-resolution upgrade, improving clarity for captured footage. Ray-Ban Meta AI glasses Gen 2 feature advanced AI, enhanced capture capabilities, and more battery life. These glasses are available for purchase at £379 (€419/$379), offering an accessible entry point into Meta's smart eyewear portfolio without the augmented reality display. The three new Meta smart glasses models position the company as a leader in wearable AI technology, offering options from basic AI assistance to advanced augmented reality display capabilities across different price points and use cases.
[33]
Meta's Smart Glasses With Built-In Display Are Reaching for Superintelligence
Meta Platforms on Wednesday launched its first consumer-ready smart glasses with a built-in display, seeking to extend the momentum of its Ray-Ban line, one of the early consumer hits of the artificial intelligence era. CEO Mark Zuckerberg showed off the Meta Ray-Ban Display and a new wristband controller, receiving applause at Meta's Connect event despite some demo problems. Meta has tasted success with its smart glasses, and Zuckerberg described them as the perfect way for humans to reach for the AI promise of superintelligence. "Glasses are the ideal form factor for personal superintelligence, because they let you stay present in the moment while getting access to all of these AI capabilities that make you smarter, help you communicate better, improve your memory, improve your senses, and more," Zuckerberg said. The new Display glasses have a small digital display in the right lens for basic tasks such as notifications. They will start at $799 and be available on September 30 in stores. Included in the price is a wristband that translates hand gestures into commands such as responding to texts and calls. The launch at Meta's annual Connect conference for developers, held at its Menlo Park, California, headquarters, is its latest attempt to catch up in the high-stakes AI race. While the social media giant has been at the forefront of developing smart glasses, it trails rivals such as OpenAI and Alphabet's Google in rolling out advanced AI models. Zuckerberg has kicked off a Silicon Valley talent war to poach engineers from rivals and promised to spend tens of billions of dollars on cutting-edge AI chips. The new glasses come as Meta is facing scrutiny over its handling of child safety on its social media platforms. Reuters reported in August that Meta chatbots engaged children in provocative conversations about sex and race, while whistleblowers said this month that researchers were told not to study the harmful effects of virtual reality on children. 
Meta also unveiled on Wednesday a new pair of Oakley-branded glasses called Vanguard aimed at athletes and priced at $499. The device integrates with fitness platforms such as Garmin and Strava to deliver real-time training stats and post-workout summaries and offers nine hours of battery life. It will be available starting on October 21. It also updated its previous Ray-Ban line, which does not have a built-in display but now offers almost twice the battery life of the earlier generation and a better camera at $379, higher than the previous generation's $299 price. While analysts do not expect the Display glasses to post strong sales, they believe it could be a step toward the planned 2027 launch of Meta's Orion glasses. Meta unveiled a prototype of that last year and Zuckerberg described it as "the time machine to the future." Forrester analyst Mike Proulx said the Display debut reminded him of Apple's introduction of a watch as an alternative to the smartphone. "Glasses are an everyday, non-cumbersome form factor," he said. Meta will still have to convince people that the benefits are worth the cost, he said, but "there's a lot of runway to earn market share." All the devices have existing features such as Meta's AI assistant, cameras, hands-free control and livestreaming to the company's social media platforms including Facebook and Instagram. Zuckerberg's demos of the new Display glasses did not all go as planned, with a call to the glasses failing to go through, for instance. "I don't know what to tell you guys," Zuckerberg said. "I keep on messing this up." The crowd cheered in support. "It's great value for the tech you're getting," Jitesh Ubrani, research manager for IDC's Worldwide Mobile Device Trackers, said of the Display glasses. But the software will need to catch up. "Until we get there, it's not really a device that the average consumer might know about or care to purchase," Ubrani said. 
IDC forecasts worldwide shipments of augmented reality/virtual reality headsets and display-less smart glasses will increase by 39.2 percent in 2025 to 14.3 million units, with Meta driving much of the growth thanks to demand for the cheaper Ray-Bans it makes with Ray-Ban owner EssilorLuxottica. Reporting by Aditya Soni and Echo Wang in Menlo Park, California; Writing by Peter Henderson; Editing by Sayantani Ghosh, Matthew Lewis and Tom Hogue.
[34]
Meta unveils Ray-Ban smart glasses with integrated display: A glimpse into the future
Meta has unveiled the new Ray-Ban Display. The glasses were unveiled at last night's Meta Connect. Meta has officially stepped into a new era of wearable technology with the launch of the new Ray-Ban Smart Glasses featuring an integrated display. Unveiled at the recent Meta Connect conference, this groundbreaking product marks Meta's first consumer-ready smart glasses to incorporate a visual display, aiming to seamlessly blend digital information with everyday life. These innovative smart glasses are packed with features designed to enhance daily interactions. At its core is a small, high-resolution, full-colour display projected onto the inside of the right lens, offering a subtle yet effective way to view information. Control is ingeniously handled by the new Meta Neural Band, a wristband utilising electromyography (EMG) to interpret "barely perceptible movements" and gestures, providing a truly hands-free experience. Wearers can leverage Meta AI with Visuals to receive answers, step-by-step instructions, real-time translations, and live captions. Communication is streamlined, allowing users to check messages from popular apps like WhatsApp and Instagram, and even make or receive live video calls, sharing their perspective with others. For the on-the-go individual, the glasses provide turn-by-turn walking directions, while a built-in camera functions as a viewfinder for previewing and capturing photos and videos. The Meta Ray-Ban Smart Glasses with display are priced at $799, which includes both the glasses and the essential Meta Neural Band. Anticipation builds as they are set to go on sale in the US starting September 30, 2025. Initially, these cutting-edge glasses will be exclusively available in a select number of physical retail locations, including Best Buy, LensCrafters, Sunglass Hut, and Ray-Ban stores. A broader global rollout, encompassing Canada, France, the UK, and Italy, is planned for early 2026. 
Unlike the previous Meta Ray-Ban smart glasses, which primarily focused on capturing photos and videos and basic audio functions, the new model introduces a critical visual element with its integrated in-lens display. This display capability transforms the glasses from a sophisticated camera and audio device into a true augmented reality experience, allowing for visual AI interactions, navigation, and visual communication directly within the wearer's field of view. With the launch of these display-enabled smart glasses, Meta is clearly pushing the boundaries of wearable technology, offering a compelling vision of a future where digital information is intuitively integrated into our physical world.
[35]
Meta Ray-Ban Display AI Glasses : The Future of Smart Eyewear?
What if your next pair of glasses didn't just help you see better, but helped you live better? Meta's latest innovation, the Meta Ray-Ban Display AI glasses, is turning this vision into reality. Imagine a world where your eyewear doubles as a personal assistant, translating conversations in real time, amplifying voices in noisy spaces, and even displaying high-resolution visuals directly in your field of view. With features like neural interface controls and AI-powered live interactions, these glasses are more than a tech gadget; they're a glimpse into the future of how we connect, communicate, and navigate the world. Bold, stylish, and packed with innovative technology, Meta's new AI glasses may just redefine what it means to wear smart. In this coverage, AI Grid explores the new features that make the Ray-Ban Meta glasses and their sportier sibling, the Oakley Meta Vanguard, stand out in the ever-evolving landscape of wearable technology. From immersive AI experiences to fitness tracking and seamless integration with your daily life, these devices promise to deliver functionality without compromising style. But are they truly as transformative as they claim to be? Whether you're a tech enthusiast, a fitness fanatic, or someone curious about the future of AI, these glasses offer something to spark your imagination. Let's unpack what makes Meta's latest release more than just eyewear: a bold step toward a smarter, more connected world. The Ray-Ban Meta glasses combine advanced technology with the timeless design of traditional eyewear. Central to their functionality is a high-resolution, full-color display embedded directly into the lens. This display, slightly offset in one eye, offers a sharp 42 pixels per degree resolution and up to 5,000 nits of brightness, ensuring exceptional clarity in both indoor and outdoor settings. When not in use, the display becomes invisible, maintaining the classic aesthetic of the glasses. 
Interaction is made effortless through a neural interface wristband, which translates subtle muscle movements into commands. This innovative feature allows you to navigate menus, activate functions, or control the device with minimal effort. Additionally, the glasses feature Conversation Focus technology, which amplifies specific voices in noisy environments, making it easier to engage in conversations. Real-time subtitles and translation capabilities further enhance communication by breaking down language barriers. One of the most notable features is the ability to host live AI sessions. These glasses can process and respond to visual and auditory inputs in real time, functioning as a personal assistant. Whether you require contextual information, navigation, or on-the-go assistance, the Ray-Ban Meta glasses are designed to adapt to your needs seamlessly. For individuals with active lifestyles, the Oakley Meta Vanguard glasses are specifically designed to meet the demands of sports and fitness enthusiasts. Unlike the Ray-Ban Meta glasses, these do not include an integrated display. Instead, they focus on durability and functionality, making them ideal for physical activities. The glasses are equipped with an action camera, microphone, and speakers, allowing you to capture high-quality videos and audio during workouts or outdoor adventures. Advanced AI capabilities allow the glasses to track workout metrics such as speed, heart rate, and pace, seamlessly integrating with popular fitness platforms like Garmin and Strava. The auto-capture feature ensures that key moments are recorded without requiring manual input, while video stabilization guarantees smooth footage, even during intense activities. With an IP67 water resistance rating, the Oakley Meta Vanguard glasses are built to withstand sweat and water exposure, ensuring reliability in various conditions. 
Their optimized battery life supports extended use, making them a dependable companion for rigorous activities. Meta has also introduced upgrades to its existing Ray-Ban glasses, focusing on enhancing battery life and camera performance. The new models now support 3K video recording, delivering sharper resolution and smoother visuals. These improvements make the glasses a versatile tool for capturing high-quality content, whether you're documenting daily moments or creating professional-grade videos. The enhanced features ensure that these glasses remain a practical and stylish choice for users seeking advanced functionality. Meta's vision for wearable technology extends beyond convenience, aiming to integrate AI into everyday life in meaningful ways. By positioning glasses as a platform for personal superintelligence, Meta is creating tools that provide real-time AI assistance and interaction. These devices are not merely accessories; they represent a step toward a more connected and intelligent future. The new Meta Ray-Ban Display AI glasses show Meta's commitment to innovation. From neural interfaces and high-resolution displays to fitness tracking and video stabilization, these features highlight the potential of wearable AI technology to enhance how you live, work, and play. As Meta continues to push the boundaries of what is possible, these glasses stand as a significant milestone in the evolution of wearable technology.
[36]
Meta Selling "First Serious" Smart Glasses for $799 | PYMNTS.com
The Meta Ray-Ban Display, introduced by CEO Mark Zuckerberg at an event Wednesday (Sept. 17) and set to sell for $799, features a screen that can display text messages, video calls, photos and the results of queries to Meta's AI service. "And it isn't on all the time -- it's designed for short interactions that you're always in control of," Meta's announcement said. "This isn't about strapping a phone to your face. It's about helping you quickly accomplish some of your everyday tasks without breaking your flow." Meta says each pair comes with the Meta Neural Band, an EMG wristband that translates the signals created by the user's muscles into commands for the glasses. Speaking to Bloomberg News before launch, Meta Chief Technology Officer Andrew Bosworth described the glasses as "the first serious product" of their kind. The report notes that the launch is part of Meta's push to develop its own line of consumer technology products as it competes with Google and Apple. "This feels like the kind of thing where you can start to keep your phone in your pocket more and more throughout the day," Bosworth said. He added that while the phone isn't going away, glasses provide a more convenient way to interact with its most popular features. "Glasses are the ideal form factor for personal super intelligence because they let you stay present in the moment while getting access to all of these AI capabilities to make you smarter, help you communicate better, improve your memory, improve your senses," Zuckerberg said at the event. The glasses, set to go on sale Sept. 30, will be sold by Ray-Ban, LensCrafters, Best Buy and a limited number of Verizon stores. 
PYMNTS explored the role AI plays in efforts to develop smart glasses earlier this year in an interview with David Jiang, a former Google executive and CEO of smart glass company Viture. AI makes smart glasses "more intuitive and practical for everyday use, whether you're streaming, gaming or navigating the world around you," Jiang told PYMNTS. Research by PYMNTS Intelligence has shown people often use connected devices to multitask, something smart glasses allow them to do. And this versatility is key. Jiang said when he was at Google, he saw firsthand why Google Glass didn't take off. "People didn't have a reason to wear it every day." He believes it's different today. "5G is here. Cloud gaming is mainstream. Streaming services are everywhere. Consumers want a better way to experience content without being tethered to a small screen."
[37]
Meta Connect 2025: Ray-Ban Gen 2, Ray-Ban Display with Neural Band, and Oakley Meta Vanguard AI glasses announced
Meta has expanded its AI glasses lineup at Connect 2025, introducing the Ray-Ban Meta (Gen 2), the Meta Ray-Ban Display, and the Oakley Meta Vanguard, bringing improved cameras, displays, battery life, and innovative control with AI and EMG technology. The second-generation Ray-Ban AI glasses, created in partnership with EssilorLuxottica, build on the first generation while incorporating key upgrades. The Meta Ray-Ban Display introduces a new AI glasses category with an in-lens display and wrist-based EMG control, while the Oakley Meta Vanguard, designed for sports and high-intensity activities, integrates AI, fitness tracking, and audio features. Together, these products combine video capture, AI features, fitness tracking, and intuitive control to provide hands-free interaction for daily tasks, communication, and sports performance. Meta CEO Mark Zuckerberg presented the new lineup at Connect 2025.
[38]
Meta launches smart glasses with built-in display
Meta Platforms on Wednesday launched its first consumer-ready smart glasses with a built-in display, seeking to extend the momentum of its Ray-Ban line, one of the early consumer hits of the artificial intelligence era. CEO Mark Zuckerberg showed off the Meta Ray-Ban Display and a new wristband controller, receiving applause at Meta's Connect event despite some demo problems. Meta has tasted success with its smart glasses, and Zuckerberg described them as the perfect way for humans to reach for the AI promise of "superintelligence." "Glasses are the ideal form factor for personal superintelligence, because they let you stay present in the moment while getting access to all of these AI capabilities that make you smarter, help you communicate better, improve your memory, improve your senses, and more," Zuckerberg said. The new Display glasses have a small digital display in the right lens for basic tasks such as notifications. They will start at $799 and be available on September 30 in stores. Included in the price is a wristband that translates hand gestures into commands such as responding to texts and calls. The launch at Meta's annual Connect conference for developers, held at its Menlo Park, California, headquarters, is its latest attempt to catch up in the high-stakes AI race. While the social media giant has been at the forefront of developing smart glasses, it trails rivals such as OpenAI and Alphabet's Google in rolling out advanced AI models. Zuckerberg has kicked off a Silicon Valley talent war to poach engineers from rivals and promised to spend tens of billions of dollars on cutting-edge AI chips. The new glasses come as Meta is facing scrutiny over its handling of child safety on its social media platforms. 
Reuters reported in August that Meta chatbots engaged children in provocative conversations about sensitive topics, while whistleblowers said this month that researchers were told not to study the harmful effects of virtual reality on children. Meta also unveiled on Wednesday a new pair of Oakley-branded glasses called Vanguard aimed at athletes and priced at $499. The device integrates with fitness platforms such as Garmin and Strava to deliver real-time training stats and post-workout summaries and offers nine hours of battery life. It will be available starting on October 21. It also updated its previous Ray-Ban line, which does not have a built-in display but now offers almost twice the battery life of the earlier generation and a better camera at $379, higher than the previous generation's $299 price. While analysts do not expect the Display glasses to post strong sales, they believe it could be a step toward the planned 2027 launch of Meta's "Orion" glasses. Meta unveiled a prototype of that last year and Zuckerberg described it as "the time machine to the future." Forrester analyst Mike Proulx said the Display debut reminded him of Apple's introduction of a watch as an alternative to the smartphone. "Glasses are an everyday, non-cumbersome form factor," he said. Meta will still have to convince people that the benefits are worth the cost, he said, but "there's a lot of runway to earn market share." All the devices have existing features such as Meta's AI assistant, cameras, hands-free control and livestreaming to the company's social media platforms including Facebook and Instagram. Zuckerberg's demos of the new Display glasses did not all go as planned, with a call to the glasses failing to go through, for instance. "I don't know what to tell you guys," Zuckerberg said. "I keep on messing this up." The crowd cheered in support. 
"It's great value for the tech you're getting," Jitesh Ubrani, research manager for IDC's Worldwide Mobile Device Trackers, said of the Display glasses. But the software will need to catch up. "Until we get there, it's not really a device that the average consumer might know about or care to purchase," Ubrani said.
[39]
Meta Ray-Ban Display Glasses Launched: Check Price, Specs, AI Features & More
Meta has launched its first smart glasses with a screen integrated into the frame. It marks a major advancement towards making AI-driven eyewear a common consumer item. The newly launched Meta Ray-Ban Display costs $799 and will be available starting September 30. The glasses enable users to read text messages, make video calls, view turn-by-turn maps, control music, capture photos, and even utilize a digital camera viewfinder. A new neural wristband works as the primary control interface, reading hand movements such as pinches, swipes, and twists. With a further update, people can call up Meta's AI with a double tap, control volume in mid-air, or even 'type' words in space. Another highlight is live captioning with real-time translations, positioning the glasses as a tool for accessibility and global communication. Video calling shows both the contact and the wearer's perspective simultaneously.
[40]
Meta unveils AI-powered smart glasses with display and neural wristband | BreakingNews
Meta's newest artificial-intelligence-powered smart glasses include a tiny display and can be controlled by a neural wristband that lets you manage them with "barely perceptible movements", chief executive Mark Zuckerberg has said. Mr Zuckerberg continues to promote the glasses as the next step in human-computer interactions -- beyond keyboards, touch screens or a mouse. "Glasses are the only form factor where you can let AI see what you see, hear what you hear," and eventually generate what you want to generate, such as images or video, Mr Zuckerberg said, speaking at the tech giant's California headquarters. The glasses, called Meta Ray-Ban Display, will be available on September 30 and cost 799 US dollars (£586). Mike Proulx, research director at Forrester, said Meta's latest reveal is "reminiscent of when the Apple Watch first debuted as an alternative to the smartphone". "But what these glasses do is bring more utility to consumers in a single device. Unlike VR headsets, glasses are an everyday, non-cumbersome form factor," the analyst said. "However, the onus is on Meta to convince the vast majority of people who don't own AI glasses that the benefits outweigh the cost. The good news? There's a lot of runway to earn market share." Meta also updated its original, display-less Ray-Ban glasses to have a better battery life, which Meta says lasts eight hours with typical use, nearly twice as long as the previous model. An upcoming feature, called "conversation focus", will amplify the voice of the person the user is speaking to and help drown out background noise. This will be available on the older version of the glasses too, as a software update, Mr Zuckerberg said. Meta has also added German and Portuguese to the gadget's live translation capabilities. The new model costs 379 dollars (£278), and the previous model now costs 299 dollars (£219). 
The company also unveiled a new set of AI-powered glasses for athletes, called the Oakley Meta Vanguard, which Meta says is designed specifically for "high-intensity sports" and can be integrated with Garmin devices to give users feedback about their workouts, such as heart rate and other statistics. For instance, a runner could ask "Hey Meta, what's my heart rate?" and get a voice response through the glasses. The glasses also auto-capture video clips when the user hits key milestones or ramps up their heart rate, speed or elevation. They will cost 499 dollars (£366) and go on sale on October 21. While the company has not disclosed sales figures for the glasses, it said they have been more popular than expected. "For more than a decade, Zuckerberg's long-term vision with Oculus and the Metaverse has been that glasses and headsets will blur the lines between physical and digital worlds," Forrester analyst Thomas Husson said. "After many false starts, the momentum to move beyond an early adopter niche is now." Meta teased a prototype for Orion, which Mr Zuckerberg called "the most advanced glasses the world has ever seen", last year -- but these holographic augmented-reality glasses are still years away from reaching the market. Like other tech companies, Meta has been making massive investments in AI development and hiring top talent at eye-popping compensation levels. In July, Mr Zuckerberg posted a note detailing his views on "personal superintelligence", which he believes will "help humanity accelerate our pace of progress". While he said that developing superintelligence was now "in sight", he did not detail how this would be achieved or exactly what "superintelligence" means. The abstract idea of "superintelligence" is what rival companies call artificial general intelligence, or AGI.
Mr Zuckerberg has said he believes AI glasses are going to be "the main way we integrate superintelligence".
[41]
EssilorLuxottica presents Meta Ray-Ban Display
At the Meta Connect event, EssilorLuxottica and Meta Platforms announced the expansion of their range of wearables, notably with the Meta Ray-Ban Display glasses, presented as the most futuristic product to come out of their collaboration. Meta Ray-Ban Display combines AI glasses with a color display integrated into the right lens for the first time, discreetly displaying incoming messages, photo previews, video calls, and AI-generated visual cues. Equipped with Transitions lenses, they can be customized with vision correction and come with the Meta neural wristband, which uses electromyography to control the glasses with subtle hand and finger movements. These glasses will be available in select stores in the US starting September 30, starting at $799. Sales will then be extended to Canada, France, Italy, and the UK in early 2026.
[42]
EssilorLuxottica unveils new AI-powered glasses
At the Meta Connect event, EssilorLuxottica and Meta Platforms announced the expansion of their wearables portfolio with new products, including Oakley Meta Vanguard AI glasses, which "usher in the era of Athletic Intelligence." Oakley Meta Vanguard combines the brand's iconic PRIZM lenses with a powerful 12-megapixel ultra-wide camera (122° field of view) and high-performance ear-free speakers with advanced noise reduction. They are also unveiling a new generation of Ray-Ban Meta glasses, which offer longer battery life (up to eight hours) and a 12-megapixel ultra-wide camera capable of shooting high-quality 3K ultra-HD video. Pre-orders for Oakley Meta Vanguard are open today on Oakley.com and Meta.com for $499 (€549). Ray-Ban Meta (Gen 2) glasses are available today on Ray-Ban.com and Meta.com starting at $379 (€419).
[43]
Meta launches $499 Oakley smart glasses
MENLO PARK, California (Reuters) - Meta on Wednesday launched $499 Oakley-branded smart glasses for athletes that come with a centered action camera, louder speakers and better water resistance, expanding its wearables beyond the Ray-Ban line that has been an early AI-era hit. Dubbed Oakley Meta Vanguard, the glasses integrate with Meta's AI app and fitness platforms such as Garmin and Strava to deliver real-time training stats and post-workout summaries. They will come with nine hours of battery life and roll out first in countries such as the U.S. and Canada starting Oct. 21. The device was unveiled at Meta's annual Connect conference, where the company is also expected to launch its first consumer-ready smart glasses with a built-in display. Those glasses will likely be named Celeste and be paired with Meta's first wristband for hand-gesture controls, but their expected $800 price may dissuade some buyers, analysts said. CNBC has reported the glasses could feature Prada branding. CEO Mark Zuckerberg walked on stage appearing to wear a pair, but put them away immediately. Celeste is expected to include a small digital display in the right lens for basic tasks such as notifications. It will also offer features available on its existing Ray-Ban and Oakley smart glasses such as an AI assistant, cameras, hands-free control and livestreaming to its social media platforms. They mark Meta's latest bid to stay relevant in the AI race, where it trails rivals such as OpenAI and Google in rolling out advanced models. Zuckerberg kicked off a billion-dollar talent war earlier this year to poach engineers from rivals and has promised to spend tens of billions of dollars on AI chips. The launch at Meta's annual Connect conference, held at its Menlo Park, California headquarters, comes amid scrutiny over Meta's handling of child safety on its social media platforms. 
Reuters reported in August that Meta chatbots engaged children in provocative conversations about sex and race, while whistleblowers said earlier this month researchers were told not to study harms of virtual reality to children. For Zuckerberg, whose massive bet on the metaverse has so far generated tens of billions of dollars in losses, smart glasses are the ideal device for superintelligence -- a concept where AI surpasses human intelligence in every possible way. While experts and executives disagree on whether and when superintelligence can be achieved, analysts said Meta's new glasses are a step toward the planned 2027 launch of its "Orion" prototype, unveiled last year and described by Zuckerberg as "the time machine to the future." "It wasn't long ago that consumers were introduced to AI on glasses and in recent quarters brands have also begun to include displays, enabling new use cases," said Jitesh Ubrani, research manager for IDC's Worldwide Mobile Device Trackers. "However, consumer awareness and product availability of AI glasses with display remains limited. This will change as Meta, Google, and others launch products in the next 18 months." IDC forecasts worldwide shipments of augmented reality/virtual reality headsets and display-less smart glasses will grow 39.2% in 2025 to 14.3 million units, with Meta driving much of the growth thanks to demand for the Ray-Bans it makes with EssilorLuxottica. (Reporting by Aditya Soni and Echo Wang in Menlo Park, California; Editing by Sayantani Ghosh and Matthew Lewis)
[44]
Meta unveils AI glasses with in-lens display and wristband controller: Features, price and more
Each pair comes with a Meta Neural Band, which lets you control the glasses using subtle hand gestures. At its Connect 2025 event, Meta showcased what it believes is the "next evolution in AI glasses": Meta Ray-Ban Display and the Meta Neural Band. Meta Ray-Ban Display glasses feature an in-lens display and are designed to help you stay connected, informed and productive without constantly reaching for your phone. According to Meta, Ray-Ban Display is the "first product that takes microphones, speakers, cameras, and a full-color display backed with compute and AI -- and puts it all together in a single device that's stylish and comfortable." The in-lens display lets you check messages, view directions, see translations, and interact with Meta AI without pulling out your phone. The display appears only when needed and is placed off to the side so it doesn't block your view. Meta says the goal isn't to replace your phone but to help you quickly complete everyday tasks without breaking your flow. The Meta Ray-Ban Display with Neural Band comes with a starting price of $799 and will hit shelves on September 30 at select US retailers. Both the glasses and the accompanying Meta Neural Band come in Black and Sand colour options.
Meta introduces Ray-Ban Display smart glasses with built-in screen and neural wristband, marking a significant step in wearable AI technology. The new product aims to reduce smartphone dependency and enhance user interaction with AI.
At Meta Connect 2025, CEO Mark Zuckerberg unveiled the company's latest foray into wearable technology: the Meta Ray-Ban Display smart glasses. This new product represents a significant leap towards Zuckerberg's vision of glasses becoming "the next computing platform device" [3]. The Ray-Ban Display glasses, along with two other new smart glasses models, showcase Meta's commitment to leading the charge in AI wearables [1].
The Meta Ray-Ban Display glasses stand out with their built-in high-resolution screen in the right lens, offering a 20-degree field of view and 42 pixels per degree [3]. Despite the advanced technology, the glasses maintain a relatively sleek design, weighing just 69 grams [2]. One of the most innovative aspects of the glasses is the Meta Neural Band, a wristband controller that uses surface electromyography (sEMG) to detect subtle hand gestures [1]. This allows users to interact with the glasses' interface through intuitive movements, even with their hand at their side or in a pocket [3].
The Ray-Ban Display glasses come equipped with a range of AI-powered features [2][5].
While Meta is optimistic about the future of smart glasses, current adoption rates remain relatively low. Forrester's Consumer Technology Insights Survey 2025 indicates that only 17% of US online adults have used smart glasses [4]. However, Meta's existing Ray-Ban smart glasses have reportedly sold in the single-digit millions, with sales increasing significantly year-over-year [3].
The Ray-Ban Display glasses face several challenges, including skepticism towards AI technology, privacy concerns, and the perceived high cost of $799 [4]. Meta will need to demonstrate clear utility and value to overcome these hurdles and drive widespread adoption.
As Meta pushes forward with its smart glasses technology, competition in the market is heating up. Google and Snap are expected to release their own AI glasses in 2026, with Apple rumored to enter the market as early as 2027 [3]. This increasing competition may drive further innovation and potentially lead to more affordable options for consumers.
Zuckerberg envisions a future where the majority of the 1 to 2 billion people who wear glasses daily will transition to AI-enabled smart glasses within the next five to seven years [3]. While this may seem ambitious, the Ray-Ban Display glasses represent a significant step towards realizing this vision and potentially ushering in a post-smartphone era of computing.