Curated by THEOUTPOST
On Tue, 17 Dec, 12:03 AM UTC
20 Sources
[1]
Ray-Ban Meta Smart Glasses get real-time visual AI and translation
Meta is rolling out two long-awaited features to its popular Ray-Ban Smart Glasses: real-time visual AI and translation. While they're just being rolled out for testing right now, the plan is that, eventually, anyone who owns Ray-Ban Meta Smart Glasses will get a live assistant that can see, hear, and translate Spanish, French, and Italian. It's part of the v11 update that covers the upgrades Meta described at its Connect 2024 event, which also include Shazam integration for music recognition. This all happens via the camera, speakers, and microphones built into the Ray-Ban Meta glasses, so you don't need to hold up your phone. In my review of the Ray-Ban Meta Smart Glasses, I was pleasantly surprised by the excellent photo quality. The ease of taking photos and recording videos hands-free is great. Initially, the AI assistant had to be summoned with a wake word, slowing down communication. To ask about something I was seeing with the Ray-Ban Meta Smart Glasses, I had to precede questions with "look and," which always felt awkward. The new live AI adds an intelligent assistant that's always available to streamline interactions so answers arrive quicker. Meta's live AI feature uses video to get a continuous visual feed. This isn't a privacy concern because the live AI and live translation features must be enabled in settings and activated to start a continuous listening and watching session. The live features can also be disabled in settings or with a voice command. The latest features are coming first to Early Access users for testing. Within weeks or months, depending on feedback, Meta will begin rolling out these v11 updates to more Ray-Ban Meta Smart Glasses owners. The Ray-Ban Meta Smart Glasses offer great value at $299 and keep getting better with each over-the-air update. Check out our guide to the best smart glasses if you're interested but want to check out the competition. For now, the Ray-Ban Smart Glasses remain the first product of their type to see real mainstream interest.
[2]
Meta Adds Real Time AI Video to Its Smart Glasses
Ray-Ban's Meta smart glasses have rolled out several new upgrades, including real-time AI video capability, live language translation, and Shazam. The new features are included in the v11 software update for Ray-Ban's Meta smart glasses, which began rolling out on Monday. The company first teased live AI and real-time translation during its annual Meta Connect conference in September. Live AI adds video to Ray-Ban's Meta smart glasses and lets wearers ask the device questions about what's in front of them. Live AI allows the wearer to naturally converse with Meta's AI assistant while it continuously views their surroundings. For example, if a wearer is gardening, they would theoretically be able to ask Meta's AI what plant they're looking at. Alternatively, if the wearer is grocery shopping, they could ask Live AI for meal prep ideas based on the food they're looking at. Meta says users can use the live AI feature for roughly 30 minutes at a time on a full charge. The company says that live AI will eventually give useful suggestions to the wearer even before they ask. Meanwhile, the live translation update means that Ray-Ban's Meta smart glasses will be able to translate speech in real-time between English and either Spanish, French, or Italian. "When you're talking to someone speaking one of those three languages, you'll hear what they say in English through the glasses' open-ear speakers or viewed as transcripts on your phone, and vice versa," Meta writes in a press release. The wearer must download language pairs in advance and specify their own language as well as the language spoken by their conversation partner. Meta also added Shazam, an app that lets users identify songs, to the smart glasses. When a user hears a track, they can say, "Hey Meta, what is this song?" and Shazam will try to find the tune that's playing. The live AI and live translation features are currently limited to members of Meta's Early Access Program in the U.S. and Canada. Meanwhile, Shazam support on Ray-Ban's Meta smart glasses is available for all users in the U.S. and Canada. In a news release, Meta warns that the new features, especially live AI and live translation, might not always get things right. "We're continuing to learn what works best and improving the experience for everyone," the company writes.
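To make that session model concrete, here is a minimal Python sketch of how a continuous, wake-word-free assistant session could be structured, based only on the behavior described above: the session keeps conversation history so follow-up questions work, and it enforces the roughly 30-minute per-charge budget Meta cites. The class, the function names, and the model call are hypothetical illustrations, not Meta's actual software.

```python
import time

SESSION_BUDGET_SECONDS = 30 * 60  # Meta cites roughly 30 minutes of live AI per charge

class LiveAISession:
    """Hypothetical sketch of a continuous, wake-word-free assistant session."""

    def __init__(self, answer_fn):
        # answer_fn(question, history, frame) -> str stands in for the
        # multimodal model call, which Meta has not publicly documented.
        self.answer_fn = answer_fn
        self.history = []  # prior turns, so follow-ups can reference them
        self.started_at = time.monotonic()

    def remaining_seconds(self):
        return SESSION_BUDGET_SECONDS - (time.monotonic() - self.started_at)

    def ask(self, question, current_frame):
        if self.remaining_seconds() <= 0:
            raise RuntimeError("Live AI budget exhausted; recharge the glasses")
        answer = self.answer_fn(question, self.history, current_frame)
        self.history.append((question, answer))  # context carries across turns
        return answer

# Usage: once the session is running, no wake word is needed between questions.
session = LiveAISession(lambda q, h, f: f"[answer to {q!r}, with {len(h)} prior turns in context]")
print(session.ask("What plant is this?", current_frame="<camera frame>"))
print(session.ask("How often should I water it?", current_frame="<camera frame>"))
```

The second call works as a follow-up precisely because the session object, not each individual prompt, owns the conversation state; that is the design difference the articles describe between Live AI and the older one-shot "look and ask" interaction.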
[3]
Meta's Ray-Ban smart glasses are about to get one of the most impressive AI features yet
These affordable smart glasses just became an even better value. Smart glasses have come a long way since the failed launch of Google Glass just over 10 years ago. Right now, the shining star among smart glasses is the Ray-Ban Meta Smart Glasses, a well-performing pair that nearly earned a perfect score from Laptop Mag. These top-notch smart glasses just got even better, thanks to Meta's v11 software update, rolled out on Monday. With this update, Ray-Ban Meta glasses gained a feature first announced at Connect 2024: live translation. The live translation feature will allow Ray-Ban Meta glasses to "translate speech in real time between English and either Spanish, French, or Italian." The wearer will be able to both hear the translated speech through the glasses' speakers and see it as a transcript on a connected phone. Right now, live translation only supports the languages mentioned above, but as with many newly introduced features, it will likely gain more supported languages as time goes on. If you want to test out the live translation feature for yourself, here's how to do it. In order to test out the live translation feature on your Ray-Ban Meta glasses, you'll first need to be enrolled as an Early Access Program member, which is currently reserved for US- and Canada-based users. Then, you'll need to update your glasses to the newest v11 firmware and make sure the Meta View app is fully updated as well. If you don't see an update for either your glasses or your app, it's possible the gradual rollout may not have reached you just yet. Give it a day or two, and check again. Once you've downloaded and installed all updates, head to your Meta View app, tap the settings gear icon in the bottom right, and select Live translation from the menu. Download the language pairing you need, choosing between any combination of English, Spanish, French, and Italian. For the purposes of this short how-to, we'll use the "They Speak: Spanish and You Speak: English" language pairing as our example. After the languages are installed on your glasses, there are two ways to start using live translation. You can open the Meta View app, find Live translation in settings, and tap Start. Or, you can simply say, "Hey Meta, start live translation." When you hear a chime, you'll know live translation has started. As you're chatting with someone speaking Spanish, you'll automatically hear a live English translation in your ear. You can also take a look at the visual transcript in Meta View to help make sure you're not missing anything important. To stop, you can either say "Hey Meta, stop," or tap Stop in the Meta View app. The most common scenario where this feature will be helpful is facilitating conversations while traveling, between you and local residents who speak Spanish, French, or Italian, or even between you and a close friend or family member with whom you have a bit of a language barrier. However, it could also improve the language-learning experience for English speakers wanting to learn Spanish, French, Italian, or any future languages Meta adds support for. The v11 firmware update also brings two other features to Ray-Ban Meta glasses to test: live AI and Shazam integration. Meta also teases additional updates slated for 2025 and "maybe some surprises."
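For readers who think in code, here is a small Python sketch of the translation flow just described, under the stated constraints: language pairs are downloaded ahead of time, a session is started explicitly (there is no automatic language detection), and each utterance produces translated audio plus a transcript entry. All names here are hypothetical illustrations of the described behavior, not Meta's actual API.

```python
SUPPORTED = {"English", "Spanish", "French", "Italian"}

class LiveTranslation:
    """Hypothetical sketch of the live translation session described above."""

    def __init__(self):
        self.downloaded_pairs = set()
        self.active_pair = None
        self.transcript = []  # mirrored to the companion phone app

    def download_pair(self, they_speak, you_speak):
        # Pairs must be fetched in advance; the feature runs against
        # pre-downloaded language models rather than auto-detecting speech.
        if not ({they_speak, you_speak} <= SUPPORTED and "English" in (they_speak, you_speak)):
            raise ValueError("Only English paired with Spanish, French, or Italian is supported")
        self.downloaded_pairs.add((they_speak, you_speak))

    def start(self, they_speak, you_speak):
        if (they_speak, you_speak) not in self.downloaded_pairs:
            raise RuntimeError("Download this language pair before starting a session")
        self.active_pair = (they_speak, you_speak)
        print("*chime*")  # the glasses chime when live translation begins

    def hear(self, utterance):
        they_speak, you_speak = self.active_pair
        translated = f"[{utterance!r} rendered from {they_speak} into {you_speak}]"
        self.transcript.append(translated)  # viewable in the app as text
        return translated  # played through the open-ear speakers

session = LiveTranslation()
session.download_pair("Spanish", "English")  # the "They Speak: Spanish, You Speak: English" pairing
session.start("Spanish", "English")
print(session.hear("¿Dónde está la estación?"))
```

The explicit download-then-start sequence mirrors why the feature, unlike Google Translate, can't simply guess the language: it only has the models you fetched in advance.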
[4]
Meta updates its smart glasses with real-time AI video
Meta's Ray-Ban Meta smart glasses are getting several new AI-powered upgrades, including the ability to have an ongoing conversation and translate between languages. Ray-Ban Meta owners in Meta's early access program for the U.S. and Canada can now download firmware v11, which adds "live AI." First unveiled this fall, live AI lets wearers continuously converse with Meta's AI assistant, Meta AI, and reference things they discussed earlier in the conversation. Without having to say the "Hey Meta" wakeword, wearers can interrupt Meta AI to ask follow-up questions or change the topic. Live AI also works with real-time video. Wearers can ask questions about what they're seeing in real time -- for example, what's around their neighborhood. Real-time AI video for Ray-Ban Meta was a significant focus of Meta's Connect dev conference early this fall. Positioned as an answer to OpenAI's Advanced Voice Mode with Vision and Google's Project Astra, the tech allows Meta's AI to answer questions about what's in view of the glasses' front-facing camera. With Monday's update, Meta becomes one of the first tech giants to market with real-time AI video on smart glasses. Google recently said it plans to sell AR glasses with similar capabilities, but the company hasn't committed to a concrete timeline. Meta claims that, in the future, live AI will even give "useful suggestions" before a wearer asks. What sort of suggestions? The company wouldn't say. Firmware v11 also introduces live translation, which enables Ray-Ban Meta wearers to translate real-time speech between English and Spanish, French, or Italian. When a wearer is talking to someone speaking one of those languages, they'll hear what the speaker says in English through the glasses' open-ear speakers and get a transcript on their paired phone. Ray-Ban Meta also has Shazam support as of firmware v11. Wearers can say "Hey Meta, Shazam this song" to have the glasses try to find the currently playing tune. Meta warns that the new features, in particular live AI and live translation, might not always get things right. "We're continuing to learn what works best and improving the experience for everyone," the company wrote in a blog post.
[5]
Your Meta Ray-Ban smart glasses just got a massive AI overhaul
The biggest Ray-Ban update yet is here, and it makes the smart glasses more useful than ever. Meta's Ray-Ban glasses are getting smarter thanks to an update that's introducing three features, including new AI functionality. In an announcement this week, Meta said the features are coming soon to its smart glasses. The first two are only available to Early Access program members (you can sign up here), but the third is available to all US and Canadian users. First up is live AI. Meta's glasses already have AI capabilities, but this latest update adds video support. During a live AI session, Meta explains that your glasses can see what you're seeing and continue a conversation more naturally than ever. If you're cooking, for example, you might ask, "How much salt did you say I needed for that stew recipe?" or "Do you see a substitute for butter, and how much would I need?" If you're in a new neighborhood, you might ask, "What's the story behind this landmark?" or "Was this landmark built at the same time as the last one?" Meta also offered the example of wearing the glasses in your garden and asking, "How often do these plants need to be watered?" or "Where's a good place to put this plant?" You can ask questions without using the "Hey Meta" introduction each time, reference something you discussed earlier, ask a follow-up question, or even change the topic. Meta says live AI will eventually be able to offer suggestions before you even ask. After teasing the feature earlier this year at Connect, Meta is rolling out live translation. Your glasses can translate speech in real time between English and Spanish, French, or Italian. If you're listening to someone speak in one of those languages, you'll hear an English translation through your glasses' speakers or see it as a transcript on your phone. According to an official support page, you'll need to download the languages you want to translate the first time you use this feature, and you'll need to select the language you want to translate. It doesn't detect automatically like Google Translate. Finally, your Meta Ray-Ban glasses can now use Shazam to identify a song that's playing nearby. Just ask, "Hey Meta, what is this song?" and you'll get your answer hands-free.
[7]
Meta's Smart Glasses Gain Live AI and Live Translation
Meta today added new features to its Ray-Ban smart glasses, including live translation and live AI. With live AI, the Meta smart glasses are able to see whatever the wearer sees thanks to the built-in camera, and can hold real-time conversations. According to Meta, the glasses are able to provide hands-free help with meal prep, gardening, exploring a new neighborhood, and more. Questions can be asked without the need to say the "Hey Meta" wake word, and the AI can maintain context across requests, so wearers can reference prior queries. Meta says that eventually, the AI will be able to "give useful suggestions before you even ask." Along with live AI, there's now a new live translation feature that can translate in real-time between English and either Spanish, French, or Italian. When someone is speaking in one of those three languages, the glasses will translate what they say into English through the speakers or on a connected smartphone, and vice versa. The Meta glasses are now able to use Shazam to identify songs, so if you ask "Hey Meta, what is this song?" Shazam can provide the song title. Shazam is an Apple-owned company now, and is heavily integrated into iOS. The live AI and live translation features are part of the Early Access Program open to any Meta glasses wearer. Sign-ups are available on Meta's website, though there are a limited number of slots for customers in the United States and Canada. Meta's smart glasses have seen a good amount of consumer interest, and rumors suggest that Apple might be getting into the smart glasses market with a similar device. Back in October, Bloomberg's Mark Gurman said that Apple is considering a pair of smart glasses that are comparable to the Meta Ray-Bans, offering Siri support, integrated cameras, and more.
[8]
Best Ray-Ban Meta smart glasses update yet adds Live AI tools in early access
Meta is rounding out the year with a major update to its Ray-Ban smart glasses, adding the two Live features it teased at Meta Connect 2024. It's also adding Shazam integration to help you find the names of tunes you hear while wearing your specs. The only downside of the awesome-sounding Live features is that they're in early access, so expect them to be less reliable than your typical AI tools. They'll also only be available to Early Access Program members in the US and Canada. You can enroll at Meta's official site. But if you are in the Early Access Program you can now try Live AI and Live Translation. Live AI is like a video version of Look and Ask. Instead of taking a quick snap, your glasses will continually record your view so you can converse with the AI about what you can see, or other topics. What's more, while in a Live AI session you won't need to say "Hey Meta" over and over again. Meta adds that "Eventually live AI will, at the right moment, give useful suggestions even before you ask." So be prepared for the AI to butt in with ideas without you prompting it directly. Live Translation is another real-time AI tool. This time it allows the AI to automatically translate between English and either Spanish, French, or Italian. When you're speaking to someone who is using one of those three languages you'll hear what they say in English through the glasses' open-ear speakers, or see it as a transcript on your phone, and they'll be able to hear or read a translation of what you're saying in their language. Thankfully, the update isn't all about early access features. If you're out at an end-of-year party and like the sound of a tune, you can also ask your glasses "Hey Meta, Shazam this song," and it will tell you what song is playing via the Shazam music recognition tool. Unfortunately, while this feature is available more widely, it is once again only available in the US and Canada, so folks in the UK and beyond won't have access to it yet.
[9]
Ray-Ban's Meta glasses can now chat with you about your surroundings
The Ray-Ban Meta smart glasses are an interesting gadget. They started out very low key, with the ability to listen to audio, as well as take photos and videos, but with most of the "smart" features missing. But Meta has slowly been adding more features to the mix, and now we've reached a point at which the Ray-Ban Metas truly deserve the "smart" moniker. On Monday, Meta announced several new features for Ray-Ban Meta glasses. One, currently enabled only for members of Meta's Early Access program, is live AI. It enables Meta's AI to see what you see (through the glasses' camera) and chat with you about your surroundings. For example, you could ask the AI what to add next to a cake you're baking, or how to improve your houseplant's health. Also available to Early Access program members is live translation, which is one of the coolest use cases for a gadget like this. If you're chatting with someone in a foreign language (English, Spanish, French, and Italian are supported), you can have the glasses translate what the other person is saying and feed the translation to the glasses' speakers, or display it as a transcript on your phone. Another new feature, available to all users in the U.S. and Canada, is Shazam integration. When there's music playing near you, you can say, "Hey Meta, what is this song?" and Shazam will find out the answer for you. It takes some of the fun out for music nerds like me, but it sounds like a pretty natural and elegant way to use Shazam. Meta says more software updates (including some surprises) are coming in 2025.
[10]
Meta is rolling out live AI and Shazam integration to its smart glasses
The Ray-Ban Meta Smart Glasses already worked well as a head-mounted camera and pair of open-ear headphones, but now Meta is updating them with access to live AI without the need for a wake word, live translation between several different languages, and access to Shazam for identifying music. Meta teased most of these features at Meta Connect in September. Live AI lets you start a "live session" with Meta AI that gives the assistant access to whatever you're seeing and lets you ask questions without having to say "Hey Meta." If you need your hands free to cook or fix something, Live AI is supposed to keep your smart glasses useful even if you need to concentrate on whatever you're doing. Live translation lets your smart glasses translate between English and either French, Italian, or Spanish. If live translation is enabled and someone speaks to you in one of the selected languages, you'll hear whatever they're saying in English through the smart glasses' speakers or as a typed transcript in the Meta View app. You'll have to download specific models to translate between each language, and live translation needs to be enabled before it'll actually act as an interpreter, but it does seem more natural than holding out your phone to translate something. With Shazam integration, your Meta smart glasses will also be able to identify whatever song you hear playing around you. A simple "Hey Meta, what is this song?" will get the smart glasses' microphones to figure out whatever you're listening to, just like using Shazam on your smartphone. All three updates baby-step the wearable towards Meta's end goal of a true pair of augmented reality glasses that can replace your smartphone, an idea its Orion prototype is a real-life preview of. Pairing AI with either VR or AR seems to be an idea multiple tech giants are circling, too. Google's newest XR platform, Android XR, is built around the idea that a generative AI like Gemini could be the glue that makes VR or AR compelling. We're still years away from any company being willing to actually alter your field of view with holographic images, but in the meantime smart glasses seem like a moderately useful stopgap. All Ray-Ban Meta Smart Glasses owners will be able to enjoy Shazam integration as part of Meta's v11 update. For live translation and live AI, you'll need to be a part of Meta's Early Access Program, which you can join right now.
[11]
Meta is putting Shazam and AI into its Ray-Ban smart glasses -- here's what's new
Early-access users in North America are getting a big upgrade. We were rather impressed with the second-generation Meta Ray-Ban smart glasses when we reviewed them, but thanks to Meta AI they've got steadily better over time. Even something as mundane as grocery shopping can be transformed with the AI implementation if you're wearing them. But the latest update is the most significant yet. Teased at Meta Connect back in September, three new updates have arrived at the same time, making the glasses all the more useful. However, two of the features are only available to those on the early access programme, and all three are currently for those in the U.S. and Canada only. As detailed on the Meta blog, the three upgrades arrive with the v11 software update. The one that's available outside of early access is Shazam integration. If you somehow hear a track you want to hear more of amidst all the Christmas muzak that's currently on repeat, you can simply say "Hey Meta, what is this song?" and your glasses will use the microphone to listen and come up with the answer for you to stream at your leisure. Again, this is only available in North America, so if you're elsewhere, you'll just have to rely on the Android and iOS apps. Then there are the two extras that require you to be a member of the Early Access programme. Live AI is the first of these, adding video to Meta AI on your glasses. When activated, Meta AI can now "see" what you're looking at, and converse naturally about what's going on before your very eyes. Meta thinks this will be hugely useful for activities when your hands are busy (think cooking or gardening), or just when out and about. You'll be able to ask Meta AI questions about what you're looking at, for example how you can make a meal out of a bunch of ingredients in front of you. There will be battery drain, though, with Meta suggesting you'll get around half an hour of live AI use on a full charge. Still, it's an exciting development, and one that Meta teases will improve over time: "Eventually live AI will, at the right moment, give useful suggestions even before you ask," the post reads. Finally, and to me the most exciting, is live translation, which promises to let you understand foreign languages without ever attempting to learn them. When enabled, if someone is talking to/at you in French, Italian or Spanish, you'll get a real-time translation provided through the open-ear speakers or as text on your phone. We've seen this kind of thing done before, of course. The first-generation Pixel Buds attempted this seven years ago, and Samsung's Galaxy AI does something similar with live phone calls. But this feels a bit more natural than both, given Meta's Ray-Bans are designed to keep the tech largely invisible. Again, these last two AI features require you to be part of the early-access programme. If you're in North America, you can sign up here, though it does describe the process as joining a waitlist, implying acceptance isn't guaranteed. Still, these will roll out to all users eventually, and Meta has hinted that more will be coming soon. "We'll be back with more software updates -- and maybe some surprises -- in 2025," the post concludes, cryptically.
[12]
Ray-Ban Meta Smart Glasses Can Now Hold a Real-Time Conversation
Live translation will allow Ray-Ban Meta to translate speech in real-time. Ray-Ban Meta smart glasses received two new artificial intelligence (AI) features on Monday. The first is Live AI, which adds real-time video processing capability to Meta AI and lets the chatbot continuously see the user's surroundings and answer questions about them. The second is live translation, which lets the AI translate speech in real-time between the supported languages. The latter was also demonstrated by CEO Mark Zuckerberg during Connect 2024. These are first being rolled out to members of Meta's Early Access Programme in Canada and the US. The tech giant says that the two new AI features coming to the Ray-Ban Meta smart glasses are part of the v11 software update that is now rolling out to eligible devices. Live AI will let Meta AI access the cameras in the smart glasses and continuously process the video feed in real time. This is similar to ChatGPT's Advanced Voice with Vision feature recently released by OpenAI. The company highlighted that during a session, the AI can continuously see what the user sees and converse about it more naturally. Users will be able to talk to Meta AI without saying the "Hey Meta" activation phrase. Additionally, users can ask follow-up questions as well as reference things discussed earlier in the session, according to the company. They can also change the topic and go back to previous topics fluidly. "Eventually Live AI will, at the right moment, give useful suggestions even before you ask," the post added. Live translation offers real-time speech translation between English and either Spanish, French, or Italian. So, if a user is talking to someone who speaks one of those three languages, Meta AI can translate their speech in real time and play the translated audio through the glasses' open-ear speakers. Users can also view the translation on their smartphone as a transcription. The tech giant cautions that these new features may not always get things right and that it will continue to take user feedback and improve the AI features. Currently, there is no word on when these features will be released for all users globally. Meta has yet to release any of these AI features in India.
[13]
Meta adds live translation, AI video to Ray-Ban smart glasses
(Reuters) - Meta Platforms said on Monday it has updated the Ray-Ban Meta smart glasses with AI video capability and real-time language translation functionality. The Facebook parent, which first announced the features during its annual Connect conference in September, said the update is available for members that are part of its "Early Access Program". The features are included in the v11 software update, which will begin rolling out on Monday. The latest update adds video to Meta's AI chatbot assistant, which allows the Ray-Ban smart glasses to process what the user is seeing and respond to questions in real-time. The smart glasses will now be able to translate speech in real time between English and Spanish, French or Italian. "When you're talking to someone speaking one of those three languages, you'll hear what they say in English through the glasses' open-ear speakers or viewed as transcripts on your phone, and vice versa," Meta said in a blog. Meta also added Shazam, an app that lets users identify songs, to the smart glasses, which will be available in the U.S. and Canada. In September, Meta said it is updating the Ray-Ban smart glasses with several new AI features, including tools for setting reminders and the ability to scan QR codes and phone numbers using voice commands. (Reporting by Harshita Mary Varghese in Bengaluru; Editing by Vijay Kishore)
[14]
Meta Details Sweet New AI Features for Ray-Ban Glasses
There is absolutely no doubt that there are a few people getting Ray-Ban's Meta smart glasses for the holidays. If you think you might be receiving a pair or are gifting one of our current favorite pieces of technology, you'll be intrigued to learn about the following new features. Meta announced this week that, via the Early Access Program, new features are becoming available. The two big ones are Live AI and Live Translation. With Live AI, Meta AI has access to a continuous stream of video from your glasses, allowing it to have more natural conversations with the user, as well as do a variety of tasks. You can now get "real-time help and inspiration with everyday activities like meal prep, gardening, or exploring a new neighborhood." For Live Translation, users can utilize AI to translate in real time a conversation taking place in another language. This is obviously helpful if you're attempting to communicate with someone speaking a language you don't know. Could it also be useful for listening in on someone else's conversations? Possibly, but we'll have to test that out. Shazam integration is also live, meaning that if you're out in the wild and hear a sweet tune but aren't sure about the title or artist, simply say, "Hey Meta, what is this song?" and it's taken care of. If your glasses aren't yet enrolled in the Early Access Program, click the link below. All you'll need is your serial number. If you want to check out an alternative to Meta and Ray-Ban, we actually wrote up a Gemini-powered pair of smart glasses just last week that are currently on Kickstarter. There are different options out there.
[16]
Meta's Ray-Bans Can Now Do Real-Time Live AI And Translation
Meta's camera-equipped Ray-Bans have had a lot of AI features onboard already, but they're getting two more big ones. Always-on continuous AI assistance is arriving on the glasses starting today for owners who have early access to Meta's features, as well as onboard translation. Both features were demoed by Mark Zuckerberg at Meta's Connect developer conference earlier this year. The features involve continuous audio and camera recording, as opposed to specific individual prompts, allowing the Ray-Bans to be used for an extended period of time with the AI features turned on. Saying "Meta, start Live AI" begins the always-on feature. The glasses' LED light stays on when always-on live AI is activated, and Meta keeps a recording of the conversation that can be referred to throughout the AI session. For translation, it should work automatically while talking, with the translation coming in via the glasses with a slight delay. The live AI assistance is similar to what Google just demonstrated on its own prototype glasses this month via Gemini and Android XR, arriving next year. The always-on AI takes a hit on battery life: expect 30 minutes of use before needing a recharge, according to Meta. But this type of always-on camera-assisted AI is exactly what more tech companies are going to be exploring in the next year. I'll follow up with impressions when I get a chance to test it for myself.
[17]
Ray-Ban Meta smart glasses will soon identify songs with Shazam
Meta this week announced new features coming to its smart glasses designed in partnership with Ray-Ban. Ray-Ban Meta glasses will soon gain new AI capabilities as well as integration with Apple's Shazam to identify songs. As announced by the company, software update v11 for the Ray-Ban Meta adds integration with Shazam, Apple's song identification app. Once available, users will be able to simply use their voice to say "Hey Meta, what is this song?" and the glasses will use Shazam to recognize the song and answer the question. "We all know the feeling: You're out on the town when an absolute banger starts playing -- but either it's new, obscure, or even an old favorite whose track name or artist just happens to escape you at that particular moment. Now, your glasses can do the heavy lifting for you," Meta said in a blog post. This comes after Meta added Apple Music integration to its Ray-Ban smart glasses earlier this year. With this integration, those who own the glasses can ask Meta's virtual assistant to play a song, playlist, album, station, or even artist - all hands-free. In addition to Shazam integration, Ray-Ban Meta glasses will also receive some new AI features with the update. One of them is Live AI, which will let users share what they're seeing with their glasses in real time so that Meta AI can help them with everyday activities. With Live AI, users can ask questions without having to say "Hey Meta" all the time. In addition, Meta is also bringing Live Translation to its smart glasses. When talking to someone in another language, you'll hear what they've said in your language through the glasses' speakers. The feature was teased live earlier this year by Meta CEO Mark Zuckerberg. According to Meta, the v11 software update will begin rolling out starting today to Ray-Ban Meta glasses users. However, the AI features will only be available in beta for those registered in Meta's Early Access Program. If you own a Ray-Ban Meta, here's how to upgrade your smart glasses: Make sure your glasses are nearby, paired to your phone, and recharged before installing an update.
[18]
Ray-Ban Meta Glasses Can Now Identify Songs On-the-Go With Shazam
A previous update brought Apple Music with voice-to-search support. Meta Platforms on Monday announced several new features for the Ray-Ban Meta Glasses. While Live AI, with its real-time video processing ability, and live translation between the supported languages were the standout artificial intelligence (AI) additions, the company also introduced integration with Shazam -- Apple's music identification app -- in select regions. This enables Ray-Ban Meta Glasses users to identify songs on the go via voice prompts. Meta detailed the new features arriving on the Ray-Ban Meta Glasses in a newsroom post. The Shazam integration is part of the v11 software update that is now rolling out to eligible devices. However, it is currently limited to Canada and the US. It offers hands-free music recognition via voice prompts. Users can ask, "Hey Meta, what is this song?" and the Ray-Ban Meta Glasses will identify the song using Shazam. This feature is said to come in handy when a great track is playing, such as in a store or a cafe, helping users learn the track or artist's name and not miss out. Notably, the company introduced support for Apple Music earlier this year, adding the ability to stream music via the Apple app without touching the phone. It leverages the wearable's voice-to-search functionality to play a song, playlist, album, station, or even artist. In addition to the Shazam integration, Meta also rolled out a Live AI feature. Similar to ChatGPT's Advanced Voice With Vision, it grants Meta AI access to the Ray-Ban Meta Glasses' cameras to monitor the video feed in real time. The chatbot can continuously see the user's surroundings and answer questions about them. Users can invoke Meta AI without the "Hey Meta" command and can even ask follow-up questions. Further, live translation has been added to the smart glasses. It offers real-time speech translation between English and either Spanish, French, or Italian. Users can hear the translated audio through the open-ear speakers and even get a transcription of it.
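As a rough illustration of how such a voice command might be routed, the Python sketch below shows a generic wake-word-plus-intent dispatch: strip the wake phrase, match a known command, and hand off to a recognizer. The function names and matching logic are hypothetical; neither Meta's assistant pipeline nor Shazam's integration API is public here.

```python
def handle_utterance(utterance, handlers):
    """Hypothetical wake-word and intent dispatch for glasses voice commands."""
    wake = "hey meta"
    text = utterance.lower().strip()
    if not text.startswith(wake):
        return None  # ignore speech not addressed to the assistant
    command = text[len(wake):].lstrip(" ,")
    for phrase, handler in handlers.items():
        if command.startswith(phrase):
            return handler(command)
    return "Sorry, I can't help with that yet."

def identify_song(_command):
    # Stand-in for sampling a few seconds of microphone audio and querying
    # a music-recognition service such as Shazam.
    return "Now playing: <matched track and artist>"

handlers = {
    "what is this song": identify_song,
    "shazam this song": identify_song,  # both phrasings appear in coverage
}
print(handle_utterance("Hey Meta, what is this song?", handlers))
```

The point of the sketch is simply that song identification is a one-shot command routed through the same wake-word front end, unlike Live AI, which holds a session open so the wake word can be dropped.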
[19]
Meta's Ray-Ban smart glasses adding 'Live AI' that works like Google's Project Astra
Meta has announced a new software update for its Ray-Ban smart glasses which adds "Live AI," a feature that can use a video feed to gather context for questions, similar to Google's Project Astra. A new update rolling out to Ray-Ban Meta smart glasses, v11, brings a few new options. That includes Shazam integration, which will allow users to ask the glasses "Hey Meta, what is this song?" and then have the result read aloud. This feature will be available in the US and Canada. Beyond that, Meta is also introducing new AI features, and they look enticing. The first of these new features is "Live AI," which allows Ray-Ban Meta glasses to capture video which is then used by the AI to offer "real-time, hands-free help" with the things you're actively doing. Meta says that, eventually, this data will be used to offer suggestions before you even have to ask. The first is live AI, which adds video to Meta AI on your glasses. During a live AI session, Meta AI can see what you see continuously and converse with you more naturally than ever before. Get real-time, hands-free help and inspiration with everyday activities like meal prep, gardening, or exploring a new neighborhood. You can ask questions without saying "Hey Meta," reference things you discussed earlier in the session, and interrupt anytime to ask follow-up questions or change topics. Eventually live AI will, at the right moment, give useful suggestions even before you ask. "Live translation," meanwhile, will be able to translate speech in real-time, with the other person's speech being output in English through the glasses (and also transcribed on your phone). This works for Spanish, French, and Italian. Meta will only be rolling these features out through a waitlist, and only in the US and Canada for now. Google is working on something just like this. At Google I/O 2024 in May, the company showed off "Project Astra," a new AI project that would be able to use a video feed to gather context and then answer questions based on what it saw. Google teased the functionality on glasses, but has yet to roll anything out. The announcement of Gemini 2.0 earlier this month saw Google detailing new updates to Astra: it will be able to converse in multiple languages, store up to 10 minutes of memory, improve latency, and more. It's unclear how Meta's "Live AI" will compare, but it's certainly exciting to see this functionality arriving so soon, especially as we won't see it fully realized from Google until sometime next year.
[20]
Meta's Ray-Ban Smart Glasses Can Now Be Your Chef, Linguist, And DJ -- What's Next, Therapist? - Meta Platforms (NASDAQ:META)
On Monday, Meta Platforms Inc. META unveiled much-anticipated updates to its Ray-Ban smart glasses. What Happened: Meta rolled out three new features for its Ray-Ban smart glasses: live AI, live translations, and Shazam. These updates are part of an effort to improve user experience by embedding advanced technology into daily-use eyewear. The live AI and translation features are currently exclusive to members of the Early Access Program. The live AI function allows users to engage with Meta's AI assistant, which can offer suggestions based on the user's environment, such as recipe ideas while shopping. The live translation feature provides real-time speech translation between English and Spanish, French, or Italian. Users can listen to translations through the glasses or view them on their phones. However, language pairs must be downloaded beforehand, and users need to specify the languages spoken by themselves and their conversation partners. Shazam support is available to all users in the U.S. and Canada. By prompting the Meta AI, users can identify songs they hear. To access these features, users must ensure their glasses are updated to the v11 software and the Meta View app is on v196. Why It Matters: Launched in October last year, the Ray-Ban Meta smart glasses let users stream live videos directly to their Facebook and Instagram followers. Equipped with Meta AI, these glasses can provide details about the objects within the wearer's view. In October 2024, it was reported that Meta's smart glasses had become the top-selling product in 60% of Ray-Ban stores across Europe, the Middle East, and Africa. However, Meta faces competition from several new entrants, including Solos, a smart glasses manufacturer, which reportedly plans to launch the AirGo Vision, powered by OpenAI's ChatGPT.
[21]
Ray-Ban Meta glasses' new features will make you feel like Tony Stark
Ray-Ban's partnership with Meta has always promised a fusion of fashion and technology, but with their latest update, the Ray-Ban Meta glasses are stepping into new territory with cutting-edge AI features. The glasses are no longer just about style -- they now come packed with an impressive range of tools designed to make daily life easier and more immersive. At the heart of the new functionality is Meta AI, which allows the glasses to understand and interact with the world around you in real-time. With live AI assistance, wearers can ask the glasses questions without needing to say the usual "Hey Meta" wake-up command. Whether you need help with directions, information about objects you're looking at, or personalized suggestions for your day, these glasses will provide immediate answers. But it doesn't stop there. One of the standout features is the live translation tool. Now, when you're conversing with someone who speaks English, Spanish, French, or Italian, the glasses will translate their speech into your preferred language in real-time, either through audio or a text transcription on your phone. This is a game-changer for travelers or anyone looking to bridge language gaps effortlessly. In another exciting addition, the glasses are now equipped with Shazam integration. With this feature, you can instantly identify songs playing around you, whether you're in a café, at a party, or even walking down the street. The glasses will pick up the song, and you'll know exactly what's playing, all without pulling out your phone. However, while these features offer a seamless and hands-free experience, they also come with some potential concerns. Meta's new upgrades are designed to make your interactions with the world smarter, but the constant data collection and real-time monitoring have raised questions about privacy. Just how much of your life are you willing to share with these AI-powered glasses? Currently available in the U.S. and Canada, the new features are part of an early access program for existing Ray-Ban Meta owners. But as Meta pushes the boundaries of what's possible with wearables, it raises an important question: Are we ready for smart glasses that know more about us than we do?
[22]
Meta rolls out live AI, live translations, and Shazam to its smart glasses
Both live AI and live translation were first teased at Meta Connect 2024 earlier this year. Live AI allows you to naturally converse with Meta's AI assistant while it continuously views your surroundings. For example, if you're perusing the produce section at a grocery store, you'll theoretically be able to ask Meta's AI to suggest some recipes based on the ingredients you're looking at. Meta says users will be able to use the live AI feature for roughly 30 minutes at a time on a full charge.
Meta rolls out significant AI-powered updates to Ray-Ban Meta Smart Glasses, including real-time visual AI, live translation, and Shazam integration, enhancing user experience and functionality.
Meta has unveiled a significant update (v11) for its Ray-Ban Meta Smart Glasses, introducing cutting-edge AI-powered features that promise to revolutionize the user experience. This update marks a major leap forward in wearable technology, bringing real-time visual AI, live translation, and music recognition capabilities to these stylish smart glasses [1][2].
The most notable addition is the real-time visual AI feature, which allows users to have continuous conversations with Meta's AI assistant about their surroundings. This groundbreaking functionality enables wearers to ask questions about what they're seeing without the need for wake words, creating a more natural and seamless interaction [3][4].
For instance, users can inquire about landmarks while exploring a new neighborhood or seek gardening advice while tending to plants. The AI can maintain context throughout the conversation, allowing for follow-up questions and topic changes without interruption [5].
Another impressive feature introduced in this update is live translation. The Ray-Ban Meta Smart Glasses can now translate speech in real-time between English and either Spanish, French, or Italian. Users can hear translations through the glasses' speakers or view transcripts on their connected smartphones, facilitating smoother communication across language barriers [2][3].
The v11 update also brings Shazam integration to the smart glasses. Users can now identify songs playing in their vicinity by simply asking, "Hey Meta, what is this song?" This hands-free music recognition feature adds another layer of convenience to the glasses' functionality [1][5].
Currently, the live AI and translation features are limited to members of Meta's Early Access Program in the US and Canada. However, the Shazam integration is available to all users in these regions. Meta acknowledges that these new features, especially live AI and translation, may not always be perfect and states that they are continuously working to improve the experience [2][4].
Meta reports that users can utilize the live AI feature for approximately 30 minutes on a full charge. The company hints at future improvements, suggesting that the AI may eventually offer proactive suggestions even before users ask questions [4][5].
This update significantly enhances the value proposition of the Ray-Ban Meta Smart Glasses, which are priced at $299. With one of the first mainstream smart glasses to offer real-time AI video capabilities, Meta is positioning itself at the forefront of this emerging technology sector [1][3].
With these advancements, Meta is not only improving the functionality of its smart glasses but also paving the way for more immersive and intelligent wearable devices in the future. As the technology continues to evolve, we can expect even more innovative features and applications in the coming years.