3 Sources
[1]
I've worn the Meta Ray-Bans for half a year, and these 5 features still amaze me
I invested in a pair of Meta Ray-Bans several months ago, and I haven't regretted it since. As a tech reporter, I attend events and conferences with other gadget aficionados. A telltale sign that a product is good is when reporters and attendees rave about it and even spend their own money to buy it. And no matter where I went, I couldn't escape the glowing Meta Ray-Ban reviews.

Also: Meta's new $399 Oakley smart glasses beat the Ray-Bans in almost every way

These impressions weren't just anecdotal; sales told the same story. EssilorLuxottica, the eyewear maker, reported in February that two million pairs of the Meta Ray-Bans have been sold since their October 2023 launch. After hearing so many rave reviews, I got a pair of my own. As an eyeglass wearer and an openly photo-obsessed Gen-Zer, these smart glasses seemed like a perfect fit. After wearing them for five months, including at events such as CES, I have some favorite features. Surprisingly, my top picks are not what I initially thought they would be.

1. The audio

When considering whether I wanted the glasses, all the apparent benefits involved the dual cameras. I never expected my favorite feature to be the audio. The smart glasses are equipped with five microphones and two custom-built open-ear speakers that are surprisingly performant. I often forget that I am wearing smart glasses until I start scrolling on social media and hear audio. The audio is honestly as good as using earbuds with ambient mode or open-ear headphones, such as the Shokz OpenFit. There is some sound leakage, but as someone who frequently uses open-ear headphones, I don't have any shame.

Also: I tested Meta's transparent Ray-Ban smart glasses, and they're a near-perfect accessory for me

Another major advantage is how good the microphones are. My biggest form of communication is audio message exchanges, and I was nervous that the glasses would make them inconvenient or difficult to send. Instead, the glasses pick up my voice naturally, with no issues. In fact, they're better than some earbuds I have tested and worn.

2. AI-enabled translation

As an artificial intelligence (AI) reporter, I have tested several AI translation devices, and my email inbox is flooded with pitches from companies promoting them. These products are typically costly and only fulfill one task. For that reason, I was pretty excited to see that the Meta Ray-Bans include an AI-powered live translation feature for free.

Also: Your Meta Ray-Bans just got a major feature upgrade for free - and it feels surreal

Although AI translation is not something you would reach for every day, in situations where you need help, it's nice to know that you have an assistant close at hand. Plus, as someone who grew up in a Spanish-speaking household and is fluent, I tried the tech with my mom in real time, and was impressed at how accurately it translated, how quickly it got started, and how easy it was to set up.

3. Easy access to Meta AI (bonus: it's multimodal)

Now, I know most days you may not use an AI voice assistant on your phone, but what makes the tech great on these smart glasses is accessibility. The best part about the Meta Ray-Bans is that you often forget you are wearing them. So you can go about your day, washing dishes, folding clothes, sweeping the floor, and commuting without your phone, while still having access to important features, such as push notifications, music, and assistance.
Also: The best AI chatbots of 2025: ChatGPT, Copilot, and notable alternatives

For example, if you are grilling outside and your phone is in the house, you can quickly ask, "How long should I cook a burger on the grill before flipping?" The AI-enabled assistant is multimodal, using the feed from the cameras at the top of the lenses as context. In that scenario, you could say something like, "How long before I flip this?" and the assistant would use the visual context to provide an answer. Because the cameras are positioned where they are, the glasses can see what you see, which makes them even more helpful. Whenever I call on this feature, it never gets old.

4. Filming

This may seem like an obvious perk, but if you create lots of visual content, the Meta Ray-Bans are a cool addition to your arsenal. Because of the positioning of the cameras in the glasses, the footage you get is unique; it is literally from your point of view. For example, if you are on an amusement park ride, walking through a venue, or cooking a meal, people see exactly what you are experiencing. This point-of-view experience is similar to a GoPro, and, despite the inferior image quality, if you are like me and not an adrenaline junkie, this option may be more practical.

Also: Apple's Meta Ray-Bans killer is only one of four major launches in 2027 - here's the list

Another benefit is the capture button's positioning. It sits prominently and conveniently at the top of the right temple, so there is no fumbling around when you see something you want to shoot. This makes capturing even quicker than taking your phone out and tapping the record button.

5. Form factor

Last but not least, I couldn't talk about the Meta Ray-Bans without talking about the form factor. In the photo above, I have them side by side with the pair I wear every day, and you can see that they look extremely similar, despite packing many components, such as cameras, microphones, speakers, and a battery. Most people can't even tell that the Ray-Bans are smart glasses, unless they are fellow techies.

One pain point

The battery dies a bit too fast for my liking. In my experience, I can typically make them last around six hours. However, that often means the glasses die halfway through my day, which isn't ideal. That's why the newly announced Oakley Meta glasses are so appealing; they're rated to last upwards of eight hours. There is also no way to charge the Ray-Ban glasses while they are on your face, though I'm sure an additional component for that would only add to their existing heft. My workaround is to wear the Meta Ray-Bans when I know I need their features, rather than every day.
[2]
Without a HUD, Smart Glasses Don't Seem Smart to Me
Meta has announced yet another range of its smart glasses to follow the Ray-Ban models we've seen before, and while I think the technology is cool, the lack of any sort of HUD or visual feedback system makes them less than appealing in my opinion. In fact, given what these glasses actually do, I wonder if they needed to be glasses at all, and whether some version of the same functionality could be offered in a different form factor.

Smart Glasses Are Coming Fast (and Affordable)

We've come a long way from the early days of smart glasses, such as the pioneering Google Glass. Not only is society more ready than before to have people walk around with recording devices on their faces, the actual usability is much better. Most importantly, the Ray-Ban (and now Oakley) collabs that Meta has been doing with its glasses are a stroke of genius, since now your smart glasses don't make you look like a dork. Well, no more than wearing preppy brands of shades already does.

When my colleague Tyler Hayes reviewed the Ray-Ban Meta smart glasses back in 2023, he was suitably impressed. This was a product ready for everyday use, at a price that wasn't far off what people pay for fancy branded shades anyway. So it's a win-win situation, right?

Now Meta has announced the Oakley Meta HSTN series, which goes up for preorder on July 11, 2025, and while these are an improvement (they're IPX4-rated) and aimed at outdoorsy athletic types, the basic feature set is still the same.

Video Recording and Audio Is About It

If you get right down to it, these glasses, apart from being glasses, are really just a pair of headphones and some cameras with mics. Sure, there might be a physical control pad too, but in essence you're melding sunglasses with mics and earphones. That's not nothing, of course. It's an ergonomic and useful set of tools in a form factor that's convenient if you're going to wear glasses one way or another. You can't help but give credit where it's due when it comes to the improvements in bulk, battery life, and overall usability. It's just that I don't feel this specific set of features is quite worth dishing out hundreds of dollars for yet.

They Have AI

The main reason these smart glasses are now actually smart comes down to the relatively recent revolution in artificial intelligence. Tethered to a device like a smartphone, which is in turn tethered to the cloud, you can speak to an AI using the mics, listen to its feedback through the speakers, and let it see what you see via the onboard cameras. The latest AI assistants are now so good that they can do things like translate text in real time, translate the spoken word, identify objects, and otherwise help you navigate the world.

The thing is, you can already do all of this with your existing smartphone and its suite of sensors. The only real difference is that the glasses allow it to happen hands-free and continuously, whereas with your phone you need to consciously take it out and point the camera at something.
That's definitely a valid use case, but I think for most people it doesn't turn smart glasses into a killer, must-have device the way smartphones have become.

Without AR or at Least a HUD, What's the Point?

For me, what made devices like Google Glass interesting in the first place was the addition of a display, and that's what's lacking in these new smart glasses. The Google Glass had a prism through which you could see things like map directions and incoming messages. With the power of modern AI technology, smart glasses with some sort of display could finally realize the dream of giving us all Terminator vision. It would be the true form of augmented reality.

The challenge, of course, is getting that display technology into a standard-looking pair of glasses like these. Again, the main reason these Meta smart glasses are avoiding ridicule and gaining traction is that they look like normal glasses. The Google Glass, in comparison, looked like you were cosplaying a Borg from Star Trek.

I get that we're still some way off from having tiny displays that could be added to these smart glasses without blowing up the price, but until this last piece of the puzzle makes it into the next generation of wearables, I'm just not that excited.
[3]
Do Ray-Ban Meta smart glasses work as an accessibility device?
The Ray-Ban Meta smart glasses were first unveiled in 2023, the result of a collaboration between sunglasses company Ray-Ban and tech giant Meta, owner of Facebook and Instagram. Appealing to the fashion-conscious tech nerd, the voice-operated wearable not only allows users to take photos and make calls hands-free, but can also use AI to describe a user's surroundings.

Though the Ray-Ban Meta was not designed as an accessibility device, its features may cause some to wonder whether it could moonlight as one for people with low or limited vision. As such, Mashable spent a few days testing whether the gadget could be reappropriated for this purpose. Unfortunately, while it is a novel device, relying on the Ray-Ban Meta smart glasses to help you navigate the world would be foolhardy at best.

The Ray-Ban Meta glasses boast a relatively compact form factor which looks very much like Ray-Ban's standard eyewear designs, with customers able to choose between Wayfarer, Skyler, and Headliner styles. The glasses utilise Meta's large language model Meta AI to answer users' queries, with a five-microphone system that can pick up voice commands while suppressing background noise. They also have small open-ear speakers designed to minimise audio leakage, and include a built-in 12 MP camera which can take photos and record video.

Despite its high-tech innards, the most noticeable visible difference between the Ray-Ban Meta and standard Ray-Bans is the missing metallic detail at the temples. Instead, the Ray-Ban Meta substitutes in a camera lens on the left and a notification light on the right (the latter activates when a photo or video is being taken, an effort to address concerns about privacy and covert surveillance). At around 49 grams depending on the frame selected, the Ray-Ban Meta's weight isn't outside what one may expect for a pair of sunglasses, though it's certainly on the heavier side. It is slightly bulkier than standard Ray-Bans, particularly at the arms (the right of which includes touchpad controls), but still streamlined enough that observers likely won't notice.

Ray-Ban Meta glasses are targeted at the average consumer, rather than catering specifically to people with disabilities. Even so, Meta does state that the glasses can help people with "reduced vision, hearing, or mobility by offering the ability to perform tasks hands free." Users can also have their Ray-Ban Meta glasses fitted with prescription lenses, with the option to upload a valid prescription with a power ranging between -6.00 and +4.00 when ordering a pair.

The Ray-Ban Meta glasses are primarily operated by voice commands to Meta AI, requiring the Meta AI app to be installed on your phone and connected to the device. The glasses can also be connected to Messenger, WhatsApp, or a user's phone via said app, which is needed to enable users to send messages and make calls using voice commands. This may help users conduct such tasks without having to look at their phone screen; however, it's worth noting that both iPhone and Android devices can already be operated directly via voice commands without Meta AI. Ray-Ban Meta users can also issue commands such as asking Meta AI what they're looking at.
The glasses will then take a photo and use Meta AI to analyse it, with AI-generated audio describing the scene to the user. Such images and conversation logs are saved to a user's History log in the Meta AI app, and can be shared to the public Discovery feed.

Aside from this voice command functionality, the Ray-Ban Meta feature most specific to people with disabilities is its partnership with Be My Eyes. This free service connects users with low or limited vision to volunteers who will look through the Ray-Ban Meta's camera and describe the person's surroundings. According to Be My Eyes, it is "the first and only accessibility tech for blind or low vision users available on Meta AI Glasses."

The Be My Eyes app does work without the Ray-Ban Meta, with users simply pointing their phone cameras at whatever they want described. As such, people who primarily want to take advantage of this free service could just download the app to their phone rather than shelling out a few hundred dollars for the Ray-Ban Meta. Using Be My Eyes with the Ray-Ban Meta requires users to download the app and connect it to the Meta AI app anyway. However, the Ray-Ban Meta glasses do enable users to use Be My Eyes hands-free. They may help frame shots as well, as users merely have to direct their gaze toward whatever it is they want described.

Whether it's worth picking up the Ray-Ban Meta to assist in accessibility may depend on how often a user utilises Be My Eyes. Even so, Meta states that the glasses have just four hours of battery life with moderate usage, which means that wearing them all day in order to repeatedly use Be My Eyes may not be realistic. Be My Eyes is also only available in the U.S., UK, Canada, Australia, and Ireland, and only supports the English language.

Unfortunately, aside from its Be My Eyes functionality, the Ray-Ban Meta glasses seem largely unsuitable as an accessibility aid. While Mashable found them an interesting novelty at least (though the Meta AI app's Discovery feed felt like the quiet death of humanity), relying on these glasses to help you navigate the world is an impractical proposition.

As previously mentioned, the Ray-Ban Meta glasses can describe a user's surroundings when asked. However, such responses are relatively vague and don't appear useful for orienting yourself unless you're so lost that you can't tell whether you're in a car park or a playground. For example, Meta AI responded to one query by telling me that I was "looking at the interior of a train, specifically the seating area." While this was true, it wasn't terribly useful information, and it missed the display I was facing, which indicated where the train was going.

When asked to read a sign bearing a single word, Meta AI was able to do so. As such, it may be useful for helping someone determine the appropriate bin to throw their waste in, for example. However, asking it to read an article open on a computer screen produced unsatisfactory results. Looking at the first paragraph of my colleague Belen Edwards's article "The 10 best TV shows of 2025 (so far), and where to stream them," I requested my Ray-Ban Meta glasses read it to me. The result was a bizarre mix of text out of order, with some lines skipped altogether. When I scrolled down and asked it to continue reading, it would simply recite the text it had already read. Asking again on another day produced even less accurate results. Instead of reading the text, Meta AI offered a vague description of what it seemed to think the article was about.
Repeated requests produced different results each time, with Meta AI sometimes even telling me "the text reads" before offering an inaccurate approximation of the text. Further tests on a later date showed improved accuracy, with Meta AI reciting much of the article visible on screen. Even so, it still took liberties with the text, inventing a headline and referencing fake shows "The Pilt" and "The Iceberg." After I scrolled down and asked Meta AI to continue reading, it stated that it "can only provide general information, and [is] not able to read articles in real-time."

Being able to simply look at any screen and have the Ray-Ban Meta glasses smoothly read it out would theoretically be a boon to many users with low or limited vision. Unfortunately, people who need such assistance would be better off relying on dedicated screen readers for now.

Mashable's testing found the Ray-Ban Meta glasses' AI assistant also struggled with matters that weren't literally in front of it. When we tried asking for the day's news headlines, Meta AI confidently offered a humorously incoherent response: "Here are the top three news. First, latest news and stories from around the world are available. Second, latest U.S. news updates are available. Third, latest news headlines are available." Repeating the question produced the same answer.

Asking for a specific publication might get you actual news items; however, they may not be from the outlet you requested. While Mashable didn't report on Jonathan Joss' death, Meta AI offered this news as the top headline on the site at the time of testing. It then offered Mashable's coverage areas of "tech, culture, and everything in between" as a second ostensible headline, before again informing us that "the latest news headlines are available."

Requests for the New York Times' headlines fared better, producing news items that the publication had reported on. However, the given headlines seemed to have been paraphrased, and the information supplied was outdated at best. For example, Meta AI stated that "Israel appears ready to attack Iran," when the first story on the New York Times' website was "Israel Says It Attacked Headquarters of Powerful Iranian Military Unit." Further, while Meta AI stated that 242 people had been killed in a plane crash in India, the death toll had already climbed to 270 in the days prior to our inquiry.

I also tried asking Meta AI for a vanilla cake recipe. In response, it provided a list of ingredients and measurements which seemed to be in roughly the right proportions. However, it only partially fulfilled my request, as no instructions were provided. Once again, Meta AI demonstrated approximate knowledge of many things, but was still unable to offer useful, usable information.

Meta AI also struggled with more personally immediate matters. While it did suggest a nearby restaurant when asked for a "good place to go and get dinner in Sydney," Meta AI stated that it would be open until 10 p.m. that day. In actuality, the restaurant had been closed for months, which was reflected on both its Google Maps listing and Instagram page. Despite there being hundreds of operating restaurants in the city, Meta AI somehow managed to select one that had shut down.

The chatbot also fell short when asked to assist with travel plans. Requesting help getting around seems like an obvious and expected use of an AI assistant.
Despite this, Meta AI was unable to assist when asked when the next train between two stations would arrive, stating, "I can't help with that yet, but I'm learning every day!" When asked for assistance with transport plans more generally, it told me to visit my local transport website to check the timetable. It couldn't advise if or when such features might be added either, telling me to check the Ray-Ban Meta Help Center. Asking whether a train station was wheelchair accessible was hit and miss, with Meta AI bizarrely responding to my first request by offering the address of a KFC. Fortunately, subsequent inquiries produced more relevant answers; however, considering the quality of previous responses, users will probably feel uneasy about taking Meta AI's word for it.

The Ray-Ban Meta smart glasses aren't primarily marketed as an accessibility device. Actual medical devices designed to assist people with low or limited vision typically retail for a significantly higher price. For example, an OrCam MyEye 3 Pro will drain your bank account to the tune of $4,490, which is over 10 times the price of the most expensive Ray-Ban Meta glasses. In light of this, it's unsurprising that the Ray-Ban Meta glasses underwhelm as an accessibility device for people with low or limited vision.

While the Ray-Ban Meta glasses may assist users by enabling them to conduct tasks such as messaging, playing music, and taking photographs hands-free, they struggle when asked to interpret text in front of them and underperform when asked to provide information more generally. Like all generative AI, Meta AI simply can't replace going directly to reliable sources yourself. If you just want to take a few hands-free photos and calls, the Meta Ray-Bans may have you covered. However, this gadget wasn't designed to be an accessibility device, and it certainly should not be relied upon as such.
An in-depth look at Meta's Ray-Ban smart glasses, exploring their features, user experiences, and potential as an accessibility device, while also discussing their limitations and future prospects.
Meta's collaboration with Ray-Ban has produced a series of smart glasses that have garnered significant attention in the tech world. These wearables, which blend the classic Ray-Ban design with cutting-edge technology, have sold an impressive two million pairs since their October 2023 launch [1].
The Meta Ray-Ban smart glasses boast several noteworthy features that have impressed users:
Audio Quality: The glasses are equipped with five microphones and two custom-built open-ear speakers, delivering surprisingly good audio performance comparable to earbuds with ambient mode [1].
AI-Enabled Translation: The glasses offer free, AI-powered live translation, a feature that has proven accurate and easy to set up [1].
Meta AI Integration: Users can access Meta's AI assistant hands-free, which is particularly useful when your phone isn't readily available [1].
Point-of-View Filming: The glasses' camera positioning allows for unique, first-person perspective footage [1].
Sleek Design: The smart glasses maintain a form factor very similar to standard Ray-Ban models, making them nearly indistinguishable from regular glasses to most observers [1].
Despite these impressive features, the Meta Ray-Ban smart glasses have some notable limitations:
Lack of HUD: The absence of a heads-up display or any visual feedback system limits the glasses' potential as a true augmented reality device [2].
Battery Life: The battery tends to die too quickly; one reviewer got around six hours of use in practice, while Meta itself rates the glasses at four hours with moderate usage [1][3].
Limited Functionality: Some critics argue that the glasses' features could be offered in a different form factor, questioning the necessity of the glasses format [2].
While not primarily designed as an accessibility tool, the Meta Ray-Ban smart glasses have shown some potential in this area:
AI Description: Users can ask the glasses to describe their surroundings, which could be helpful for those with visual impairments [3].
Be My Eyes Integration: The glasses can connect to the Be My Eyes app, allowing volunteers to see through the glasses' camera and describe the surroundings to users with low vision [3].
However, the effectiveness of these features for accessibility purposes is limited. The AI descriptions are often vague and not particularly useful for navigation, and the Be My Eyes integration, while innovative, is restricted by the glasses' short battery life and limited geographical availability [3].
As smart glasses technology continues to evolve, there is potential for significant improvements. The integration of a heads-up display or a more advanced visual feedback system could transform these devices into true augmented reality tools, enhancing their utility for both general users and those seeking accessibility solutions [2].
In conclusion, while the Meta Ray-Ban smart glasses represent an impressive blend of fashion and technology, they still have room for improvement to fully realize their potential as a revolutionary wearable device.