Curated by THEOUTPOST
On Thu, 24 Apr, 12:04 AM UTC
17 Sources
[1]
If you own Ray-Ban Meta glasses, you should double-check your privacy settings | TechCrunch
Meta has updated the privacy policy for its AI glasses, Ray-Ban Meta, giving the tech giant more power over what data it can store and use to train its AI models. The company emailed Ray-Ban Meta owners on Tuesday with a notice that AI features will now be enabled on the glasses by default, according to The Verge. This means Meta's AI will analyze photos and videos taken with the glasses while certain AI features are switched on. Meta will also store customers' voice recordings to improve its products, without an option to opt out. To be clear, Ray-Ban Meta glasses are not constantly recording and storing everything around the wearer. The device only stores speech that the user says after the "Hey Meta" wake word. Meta's privacy notice on voice services for wearables says that voice transcripts and recordings can be stored for "up to one year to help improve Meta's products." If a customer doesn't want Meta to train its AI on their voice, they will have to manually delete each recording from the Ray-Ban Meta companion app. The change in terms is along the lines of Amazon's recent policy change affecting Echo users. As of last month, Amazon will run all Echo commands through the cloud, removing the more privacy-friendly option to process voice data locally. Companies like Meta and Amazon are eager to hoard these heaps of voice recordings because they are useful training data for their generative AI products. With a wider range of audio recordings, Meta's AI can possibly do a better job at processing different accents, dialects, and patterns of speech. But improving its AI comes at the expense of user privacy. A user may not understand that if they use their Ray-Ban Meta glasses out of the box to photograph a loved one, that person's face may find its way into Meta's training data, for example. The AI models behind these products require obscene amounts of content, and it benefits companies to train their AI on the data that their users are already producing. 
Meta's hoarding of user data is not new. Already, Meta trains its Llama AI models on public posts that American users share on Facebook and Instagram.
[2]
'Hey Meta': New AI Features Come to Meta's Ray-Bans
As Google starts to revive its Google Glass concept, Meta is already a step ahead, with new artificial intelligence functions coming to its glasses this summer. The Ray-Ban smart glasses, made in partnership with Meta, are getting several powerful AI updates for US and Canadian users. Using the Meta View app on a connected smartphone, wearers will be able to say "Hey Meta, start live AI" to give Meta AI a live view of whatever they are seeing through their glasses. Similar to Google's Gemini demo, users will be able to ask Meta AI conversational questions about what it sees and how it might solve problems. Meta provided the example of Meta AI suggesting possible substitutes for butter based on what it sees when you look in the pantry. Even without live AI, you'll be able to ask specific questions about objects you're looking at. In addition to new seasonal looks, Ray-Ban's smart glasses will also be able to use the "Hey Meta, start live translation" command to automatically translate incoming languages, including English, French, Italian and Spanish. The glasses' speakers will translate as other people talk, and you can hold up your phone so the other party can see a translated transcript too. Along with these AI upgrades, the smart glasses will be able to post automatically on Instagram or send a message on Messenger with the right voice commands. New compatibility with music streaming services will also let you play songs through Apple Music, Amazon Music and Spotify on your glasses in lieu of earbuds. Meta reports that the rollout of these new features will happen this spring and summer, along with object recognition updates for EU users coming next week. Meta and Ray-Ban didn't immediately respond to a request for further comment.
[3]
With 'Hey Meta,' Ray-Ban Wearers Will Unlock All-New AI Abilities -- and Privacy Concerns
As Google starts to revive its Google Glass concept, Meta is already a step ahead, with new artificial intelligence functions coming to its glasses this summer. The Ray-Ban smart glasses, made in partnership with Meta, are getting several powerful AI updates for US and Canadian users. Using the Meta View app on a connected smartphone, wearers will be able to say "Hey Meta, start live AI" to give Meta AI a live view of whatever they are seeing through their glasses. Similar to Google's Gemini demo, users will be able to ask Meta AI conversational questions about what it sees and how it might solve problems. Meta provided the example of Meta AI suggesting possible substitutes for butter based on what it sees when you look in the pantry. Even without live AI, you'll be able to ask specific questions about objects you're looking at. In addition to new seasonal looks, Ray-Ban's smart glasses will also be able to use the "Hey Meta, start live translation" command to automatically translate incoming languages, including English, French, Italian and Spanish. The glasses' speakers will translate as other people talk, and you can hold up your phone so the other party can see a translated transcript too. When I reached out to Inna Tokarev Sela, CEO and founder of AI data company illumex, about privacy issues with smart glasses like these, she mentioned that in her own experience with Ray-Ban smart glasses, people usually reacted when they noticed the recording indicator light, which meant the glasses were watching. That can make some people uneasy, whether they are concerned about being filmed by a stranger or about what Meta may be doing with all that visual data it's collecting. "In the new models you can control the notification light, which could pose a privacy risk," Sela said, "But everyone films everyone all the time anyway at touristy landmarks, public events, etc. 
What I expect is that Meta will not divulge any information on anyone, unless they register and explicitly give their consent." This could lead to other consent headaches too, depending on whether users are recording for other purposes. "For example, users should be able to opt in and choose the type of information to expose when they're in someone's frame -- similar to LinkedIn, for example," noted Sela. "Of course, any recording resulting from the glasses should not be admissible to use in a court of law, as with any other kind of recording, without explicit permission." Along with the AI upgrades, Ray-Ban's smart glasses will be able to post automatically on Instagram or send a message on Messenger with the right voice commands. New compatibility with music streaming services will also allow you to play songs through Apple Music, Amazon Music and Spotify on your glasses in lieu of earbuds. Meta reports that the rollout of these new features will happen this spring and summer, along with object recognition updates for EU users arriving in late April and early May. Meta and Ray-Ban didn't immediately respond to a request for further comment.
[4]
Meta tightens privacy policy around Ray-Ban glasses to boost AI training
Meta is making a few notable adjustments to the privacy policy for its Ray-Ban Meta smart glasses. In an email sent out on April 29th to owners of the glasses, the company outlined two key changes. First, it's giving Meta AI a more frequent view of the world. "Meta AI with camera use is always enabled on your glasses unless you turn off 'Hey Meta,'" the email said, referring to the hands-free voice command functionality. So unless you turn that convenience-minded feature off, Meta will frequently be analyzing whatever's captured by the built-in camera. If you simply want to use the Ray-Ban Metas as a "normal" camera without any artificial intelligence thrown in, you'll have to disable "Hey Meta" and stick to the physical controls. Second, Meta is taking after Amazon by no longer allowing Ray-Ban Meta owners to opt out of having their voice recordings stored in the cloud. "The option to disable voice recordings storage is no longer available, but you can delete recordings anytime in settings," the company wrote. In its voice privacy notice, Meta states that "voice transcripts and stored audio recordings are otherwise stored for up to one year to help improve Meta's products." If the company detects that a voice interaction was accidental, those recordings are deleted after a shorter 90-day window. The motivation behind these changes is clear: Meta wants to continue providing its AI models with heaps of data on which to train and improve subsequent results. Some users began noticing these policy changes in March, but at least in the United States, Meta says they went into effect as of April 29th. Earlier this month, the company rolled out a live translation feature to the Ray-Ban Meta product. And just yesterday, Meta rolled out a standalone Meta AI app on smartphones to more directly compete with OpenAI's ChatGPT, Google Gemini, Anthropic's Claude, and other AI chatbots. 
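The retention windows described here (90 days for accidental "false wake" activations, up to one year otherwise) can be sketched as a tiny model. This is purely illustrative; the function and labels below are assumptions for clarity, not Meta's actual implementation.

```python
from datetime import datetime, timedelta

# Hypothetical model of the retention rules in Meta's voice privacy notice:
# accidental activations ("false wakes") are deleted within 90 days of
# detection; other recordings and transcripts may be kept up to one year.
RETENTION = {
    "false_wake": timedelta(days=90),
    "intentional": timedelta(days=365),
}

def is_expired(recorded_at: datetime, kind: str, now: datetime) -> bool:
    """Return True once a recording passes its maximum retention window."""
    return now - recorded_at > RETENTION[kind]

now = datetime(2025, 4, 29)
# A false wake from January 1 is past its 90-day window by late April...
assert is_expired(datetime(2025, 1, 1), "false_wake", now)
# ...but an intentional recording from the same day is still within a year.
assert not is_expired(datetime(2025, 1, 1), "intentional", now)
```

Under this policy, deleting a recording manually in settings is the only way to cut off access before its window elapses.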
The company is reportedly planning a higher-end pair of Ray-Ban Meta glasses for release later in 2025. The current glasses lineup starts at $299, but the more premium version could cost around $1,000. Meta is set to report its Q1 2025 earnings later on Wednesday, and the company is likely to address the tariff chaos that has roiled markets in recent months.
[5]
Meta expands AI access on Ray-Ban smart glasses in Europe
April 23 (Reuters) - Meta Platforms (META.O) said on Wednesday it is expanding access to its artificial intelligence assistant, Meta AI, on Ray-Ban smart glasses to seven additional European countries. People in Germany, Austria, Belgium, Denmark, Norway, Sweden and Finland will now be able to interact with Meta AI using voice prompts to get answers to general questions, the Facebook and Instagram parent said. Meta launched its AI technology in Europe in March, a rollout that was initially announced in June last year but was delayed following regulatory concerns about data protection and privacy. While Meta AI was launched in the U.S. in 2023, its release in Europe faced several hurdles due to the European Union's stringent privacy and transparency rules. Sprucing up its wearable technology with AI capabilities could help Meta attract new users at a time when the company is investing billions of dollars in bolstering its AI infrastructure. Meta said the expansion will also include a live translation feature, which is being broadly rolled out in its markets. Starting next week, in supported EU countries, it will release a feature that lets people ask Meta AI about the things they are looking at and get real-time responses. The company updated Ray-Ban Meta smart glasses with AI video capability and real-time language translation functionality in December 2024. Meta had first announced the features during its annual Connect conference in September last year. Reporting by Jaspreet Singh in Bengaluru.
[6]
Meta is bringing smart glasses live translation and AI to more people
Meta AI, the most interesting thing you can do with Ray-Ban Meta glasses, will soon be available to more people. The company's Live Translation feature is rolling out to all the product's markets, and Live AI (where you can hold a free-flowing conversation about what you're looking at) will soon be available in the US and Canada. In addition, glasses owners in the EU can finally use Meta AI with their high-tech specs. Live translation, previously available in early access, is now rolling out in every region where Ray-Ban Meta glasses are available. Handy for trips abroad or chats with locals who speak a different language, the AI-powered feature speaks a translation in your preferred language in real time. You can also view a translated transcript on your phone. Live translation is available in English, French, Italian and Spanish. And if you download your preferred language pack in advance, you can use it without a Wi-Fi connection or even mobile data from your paired phone. You can launch the feature by saying, "Hey Meta, start live translation." US and Canadian users can now use Meta's Live AI feature, which lets you ask questions about your surroundings without saying "Hey Meta" every time. (You can even interrupt it.) Another feature previously only available in beta, live AI lets you chat with your glasses in natural language about your environment, asking it to explain things like missing ingredients for a meal or the best wine to pair with it. You can say, "Hey Meta, start live AI" to begin. In addition, Meta AI is finally rolling out to all of the product's supported countries in the European Union. And starting next week, EU countries will get the visual search feature that can answer individually prompted questions about your surroundings, but (unlike Live AI) can't perform a free-flowing conversation with interruptions. The glasses' Instagram integration is also expanding. 
Meta says you can soon send and receive Instagram DMs, photos, audio calls and video calls on your Ray-Bans. They already supported calls and messages through WhatsApp and Messenger and your phone's messaging app, so the glasses now have a solid list of communications options. You can start by saying, "Hey Meta, send a message to [your recipient's name] on Instagram." Music app support is expanding beyond the US and Canada. The company is rolling out support for Spotify, Amazon Music, Apple Music and Shazam in the product's non-North American regions. Once the update is live, you can ask your glasses things like, "Hey Meta, what's the name of this song?" or "Hey Meta, when did this album come out?" Although no major hardware upgrades were announced today (the next revision with a screen is rumored to launch later this year), Meta and Ray-Ban are rolling out new styles for the second-gen glasses. These include new Skyler frame and lens color combinations, including the cat-eye-shaped Shiny Chalky Gray with Transitions Sapphire lenses and the "more timeless" Skyler Shiny Black with G15 Green lenses and Skyler Shiny Black with Clear lenses.
[7]
Meta Is Turning Its Ray-Bans Into a Surveillance Machine for AI
While you're wearing the Ray-Ban Meta smart glasses, what you see is what Meta AI sees, and your opt-out options are getting narrower and narrower. In a recent update to the privacy policy for the device, received by most owners in an email sent April 29, according to The Verge, Meta opened up its ability to collect more data for training its artificial intelligence models. Under the new policies, Meta explains that "Meta AI with camera use is always enabled on your glasses unless you turn off 'Hey Meta,'" the activation phrase used to communicate with the company's AI assistant. Wake words or phrases like this are common for AI devices, with the trade-off being that they are technically always on and waiting to be activated. Having the assistant always waiting to hear you give it a task removes some of the friction to functionality, but it also opens up the uncomfortable reality that these devices may be collecting information even when you aren't thinking about it. In this case, if you keep the "Hey Meta" feature active, Meta can use any images it captures through the lens of the built-in camera. Meta says the camera is not always recording, so this only applies to photos or videos the user captures with the device. Additionally, Meta's latest update removes the ability for users to keep their voice recordings from being stored on Meta's servers. Instead, users will have to manually delete every recording if they would like to cut off Meta's access before the recordings expire. "The option to disable voice recordings storage is no longer available, but you can delete recordings anytime in settings," the company's policy now reads. Per Meta's voice privacy notice, the company will store voice transcripts and audio recordings "for up to one year to help improve Meta's products." Accidental voice interactions are kept for 90 days. Gizmodo has reached out to Meta for comment on these changes, but we did not hear back at the time of publication. 
The motivation for all of this is pretty clear: more data to feed the AI machine. Meta just rolled out its live translation feature on the Ray-Ban smart glasses, which provides real-time translations between several supported languages, including French, Italian, Spanish, and English. It also just recently launched a standalone Meta AI app. It's clear the company is all-in on AI right now, and that means it needs all the data it can get to keep fine-tuning things, especially after it got caught allegedly fudging the numbers on benchmark tests. This is the inevitable direction that devices with mics and cameras installed will go. At some point, the companies that make the devices will decide that what they can capture is more valuable than any semblance of privacy. They'll flip the switch, and the glasses on your face or the speaker in your home will turn into a surveillance device.
[8]
Meta's Ray-Ban glasses feed AI while your privacy fades by default
Ray-Ban Meta Wayfarer glasses - Matte Jeans Transparent edition Big Tech's shift toward AI dominance is leaving privacy behind. Just like Amazon's Echo devices, Meta's Ray-Ban smart glasses are now switching to always-on data collection settings, unless users actively disable them. The company has removed key privacy opt-outs while quietly boosting how much its AI learns from your world. In an email sent to users on April 29, the company revealed that the glasses will now have Meta AI camera functionality turned on automatically, unless the wake word "Hey Meta" is disabled manually.
[9]
Ray-Ban Meta Smart Glasses Now Act Like a Live Language Interpreter
Ray-Ban Meta smart glasses now support live translation and visual currency conversion. These features, which leverage Meta AI, could be useful to international travelers or expats. Live translate was first announced at the Meta Connect conference in September 2024. The feature, which is more like a live interpreter than a translator, is pretty straightforward -- start a conversation with someone, and a spoken interpretation of their language will be pumped through your Meta Ray-Ban glasses' speakers. If they're speaking Spanish, you'll hear English, and so on. That said, conversations are a two-way street. And unless you're speaking with someone who happens to own Ray-Ban smart glasses, you need to pull out your phone to show them a written translation of your words. Plus, the live translate feature is currently limited to English, Spanish, French, and Italian, and it doesn't work automatically, so you need to trigger it with a voice command. The good news is that you don't need an internet connection to use live translate, so long as you download the appropriate language packs before kicking off a conversation. "Whether you're traveling to a new country and need to ask for directions to the train station or you're spending quality time with a family member and need to break the language barrier, you can hold seamless conversations across English, French, Italian, and Spanish -- no Wi-Fi or network connectivity required if you've downloaded the language pack in advance. When you're speaking to someone in one of those languages, you'll hear what they say in your preferred language through the glasses in real time, and they can view a translated transcript of the conversation on your phone. To get started, just say, 'Hey Meta, start live translation.'" Visual currency conversion, while not as groundbreaking as live translate, is also an exciting feature for international travelers. 
Let's say that you're visiting Italy and want to know how the local price of parmigiano reggiano compares to what you buy back home (I did this when visiting Europe because I love cheese). If you're wearing Ray-Ban Meta glasses, you can just stare at the price tag on the cheese, say a phrase like "Hey Meta, convert this price to USD," and the smart glasses will perform a quick currency conversion. But currency conversion is not a standalone feature. It's just one of the many things that you can do with Meta Live AI, a tool that's rolling out to North America in the coming weeks. Live AI can "see" through your eyes to answer unique questions, like what bottle of wine you should pair with your meal. It's honestly a bit of an awkward and contrived idea, at least in its current form (the above example, which I copied from Meta's press release, is genuinely ridiculous), but the AI powers some genuinely good stuff like currency conversion and will probably grow more useful when commercial smart glasses gain visual AR functionality, which Meta is actively working on. Along with all this travel-friendly stuff, Meta is adding the ability to direct message and make calls on Instagram from the Ray-Ban smart glasses. Similar functionality is already available for WhatsApp, Facebook Messenger, Google Messages, and iMessage. There are also new lens and frame color combinations for the Ray-Ban Meta Skyler frame style. The Ray-Ban Meta smart glasses start at $300 and come in a wide variety of colors and styles. You can order them with prescription lenses, and there are options for polarized or Transitions lenses. Note that some frame options are not available to those who, like me, have a prescription in excess of -3.5 or +3.5. 
Source: Meta via The Verge
[10]
Live translation now available to all in Ray-Ban Meta glasses - 9to5Mac
Ray-Ban Meta glasses now offer a live translation feature to all owners, after an earlier small-scale test. You can download language packs in advance, enabling translation even when you don't have a mobile data connection. The Meta AI feature, which lets you ask the glasses questions about what you're seeing, has also now launched in EU countries, and there are new frame and lens combos too ... The smart glasses have so far offered two main features. First, you can shoot POV photos and videos using either a button press or a voice command like "Hey Meta, shoot a video." Second, you can ask the built-in AI to tell you about whatever it is you're looking at, with a "Hey Meta, tell me what you see" command. You can see a bunch of examples of this in my review of the glasses, with more in a follow-up post. The feature can be incredibly impressive; in one example, the text was definitely too small in the frame for the glasses to have cheated. That feature was initially withheld from the EU due to privacy concerns, but is now available in all EU countries. Meta has been testing a live translation feature, and says it's now available to all users. Whether you're traveling to a new country and need to ask for directions to the train station or you're spending quality time with a family member and need to break the language barrier, you can hold seamless conversations across English, French, Italian, and Spanish -- no Wi-Fi or network connectivity required if you've downloaded the language pack in advance. When you're speaking to someone in one of those languages, you'll hear what they say in your preferred language through the glasses in real time, and they can view a translated transcript of the conversation on your phone. To get started, just say, "Hey Meta, start live translation." Other languages are likely to follow. I'll be putting this to the test shortly - watch this space. Meta has also announced new frame and lens combos. 
Starting today, we've got new and expanded Skyler frame and lens color combos. The new Skyler Shiny Chalky Gray with Transitions® Sapphire lenses is perfect for those who favor a cat-eye shape in a neutral tone with a pop of color and great for indoor and outdoor use. But if you prefer a more timeless aesthetic, you can also now grab Skyler Shiny Black with G15 Green lenses or Skyler Shiny Black with Clear lenses. The glasses are now available in a wide range of style and color combos. There's more to come, including Instagram integration and more conversational exchanges.
[11]
All Meta Ray-Ban Smart Glasses Users Get Live Translation, Live AI Coming Soon
Meta today announced that it is rolling out a new live translation feature to all Ray-Ban Meta glasses users, providing wide access to functionality that was previously only available to early access users in a beta capacity. Live translation supports English, French, Italian, and Spanish, allowing users to translate between those languages in real-time while having a conversation. As long as a particular language pack has been downloaded in advance, no Wi-Fi or cellular connection is required to use the feature. When you're speaking to someone in one of those languages, you'll hear what they say in your preferred language through the glasses in real time, and they can view a translated transcript of the conversation on your phone. To get started, just say, "Hey Meta, start live translation." Live translate will translate what the person speaking in another language is saying, while responses can be seen on a connected smartphone. In the near future, Meta plans to introduce live AI, a feature where the Meta smart glasses can see whatever the wearer sees through the built-in camera, allowing for real-time AI conversations. Meta says the glasses will be able to provide hands-free help with meal prep, gardening, exploring, and more. Questions can be asked without the need to say a wake word, and the AI can understand context between requests for referencing prior queries. Meta's smart glasses should be of interest to Apple users because they provide some insight into what we might see from Apple in the future. Rumors suggest that Apple is considering developing smart glasses that are similar to Meta's Ray-Bans. Apple glasses could feature AI, microphones, and cameras, though there wouldn't be augmented reality capabilities.
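The offline behavior described above (translation works without Wi-Fi or cellular only for language packs downloaded in advance, among the four supported languages) might be modeled like the sketch below. The class and method names are invented for illustration and are not part of any Meta software.

```python
# Illustrative sketch (not Meta's code) of the language-pack gating the
# articles describe: offline live translation requires that the packs for
# both sides of the conversation are already on the device.
SUPPORTED = {"en", "fr", "it", "es"}  # English, French, Italian, Spanish

class LiveTranslator:
    def __init__(self) -> None:
        self.downloaded_packs: set[str] = set()

    def download_pack(self, lang: str) -> None:
        """Fetch a language pack in advance (requires connectivity)."""
        if lang not in SUPPORTED:
            raise ValueError(f"unsupported language: {lang}")
        self.downloaded_packs.add(lang)

    def can_translate_offline(self, heard: str, preferred: str) -> bool:
        """True when both languages' packs are already on-device."""
        return {heard, preferred} <= self.downloaded_packs

t = LiveTranslator()
t.download_pack("es")
t.download_pack("en")
assert t.can_translate_offline("es", "en")       # both packs present
assert not t.can_translate_offline("fr", "en")   # French pack not downloaded
```

The subset check captures the reported caveat: hearing Spanish in English offline works only if you grabbed both packs before losing connectivity.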
[12]
Meta Now Collects More Data From Ray-Bans to Bolster AI
Meta this week sent out an email (via The Verge) to Meta Ray-Ban customers informing them about upcoming privacy changes to the smart glasses, which will increase the amount of data that Meta collects by default. Meta says that voice recordings are stored by default when using Meta AI, and are used to improve Meta products. Meta has eliminated the option to disable voice recording storage, and recordings need to be manually deleted in settings. Further, unless Meta AI is disabled on the Ray-Ban glasses, photos and videos taken with the built-in camera will be analyzed by AI and used to improve Meta's products. From Meta's email: The Meta Ray-Ban glasses are not continually recording footage that's accessible to AI, but Meta AI will store and use photos, videos, and voice recordings from Ray-Ban users who interact with Meta AI or use voice commands. With cloud processing on, media is also sent to Meta's servers, where it can be used to improve Meta services. Of course, uploading images and video to Instagram and other Meta apps gives Meta the exact same access. So if you say "Hey Meta, record a video," Meta records the voice command and stores the recording and an audio transcript of it by default, and that storage can't be turned off. If cloud processing is also on, or if you ask Meta AI a question about the video, Meta can access and use the video for AI training purposes. Turning off Meta AI entirely on the Ray-Ban glasses and using manual controls for snapping photos and videos is the best method to ensure that Meta isn't collecting excessive data. More information is available in Meta's privacy policy.
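The conditions in that walkthrough can be summarized as a small decision sketch. The flag and label names below are hypothetical, chosen only to restate the policy as the article describes it; this is not Meta's code or API.

```python
# Hedged sketch of the data-collection conditions described above:
# voice commands while Meta AI is on mean stored recordings and
# transcripts; cloud processing (or asking Meta AI about a capture)
# means the media itself can also be used.
def data_meta_may_store(meta_ai_on: bool, used_voice_command: bool,
                        cloud_processing_on: bool,
                        asked_ai_about_media: bool) -> set[str]:
    stored: set[str] = set()
    if meta_ai_on and used_voice_command:
        stored |= {"voice_recording", "voice_transcript"}
    if cloud_processing_on or asked_ai_about_media:
        stored.add("photos_and_videos")
    return stored

# Meta AI off, manual controls only: nothing stored in this model.
assert data_meta_may_store(False, False, False, False) == set()
# "Hey Meta, record a video" with cloud processing on: everything stored.
assert data_meta_may_store(True, True, True, False) == {
    "voice_recording", "voice_transcript", "photos_and_videos"}
```

The two cases mirror the article's advice: disabling Meta AI and sticking to manual capture is the path that keeps the model's output empty.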
[13]
Your Ray-Ban Meta smart glasses just became an even better travel companion thanks to live translation
Live translation tools are rolling out everywhere the glasses are available Following the rollout of enhanced Meta AI features to the UK earlier this month, Meta has announced yet another update to its Ray-Ban smart glasses - and it's one that will bring them closer to being the ultimate tourist gadget. That's because two features are rolling out more widely: look and ask, and live translation. Thanks to their cameras, your glasses (when prompted) can snap a picture and use that as context for a question you ask them, like "Hey Meta, what's the name of that flower?" or "Hey Meta, what can you tell me about this landmark?" This tool was available in the UK and US, but it's now arriving in countries in Europe, including Italy, France, Spain, Norway and Germany - you can check out the full list of supported countries on Meta's website. On a recent trip to Italy I used my glasses to help me learn more about Pompeii and other historical sites as I travelled, though it could sometimes be a challenge to get the glasses to understand what landmark I was talking about because my pronunciation wasn't stellar. I also couldn't find out more about a landmark until I learnt what it was called, so that I could say its name to the glasses. Being able to snap a picture instead and have the glasses recognize landmarks for me would have made the specs so much more useful as a tour guide, so I'm excited to give them a whirl on my next European holiday. The other tool everyone can get excited for is live translation, which is finally rolling out to all countries that support the glasses (so the US, UK, Australia, and those European countries getting look and ask). Your smart specs will be able to translate between English, French, Italian, and Spanish. Best of all you won't need a Wi-Fi connection, provided you've downloaded the necessary language pack. What's more, you don't need to worry about conversations being one-sided. 
You'll hear the translation through the glasses, but the person you're talking to can read what you're saying in the Meta View app on your phone. Outside of face-to-face conversations I can see this tool being super handy for situations where you don't have time to get your phone out, for example to help you understand public transport announcements. Along with the glasses' sign-translation abilities, these new features will make your specs even more of an essential travel companion - I certainly won't be leaving them at home the next time I take a vacation.
[14]
The best smart glasses you can buy just got a lot creepier
Meta rolled out a critical change to its privacy policy that makes the best smart glasses a little less desirable. On April 29, Meta implemented a drastic change to its privacy policy for Meta Ray-Ban smart glasses by removing the option to disable storage of your voice command data in the cloud. This means you now have no choice but to allow Meta to store and analyze your voice command recordings if you choose to use Meta AI with your Ray-Ban smart glasses. Unfortunately, the move sacrifices user privacy to get more data for training Meta's AI, which raises the question: Is a smarter AI assistant worth compromising your data privacy? In a notice effective as of April 29, Meta stated, "We will store voice recordings even if you unintentionally activate a voice interaction. If our systems detect that you didn't intend to activate a voice interaction, we will label these voice interactions as 'false wakes' or misactivations, and delete them within 90 days of detection. Voice transcripts and stored audio recordings are otherwise stored for up to one year to help improve Meta's products." When Meta says your data is being used to "help improve Meta's products," it's most likely referring to its AI platform, which requires massive amounts of data to analyze and learn from. Meta noted that you can still delete your voice recording data at any time, which will prevent Meta from using that data to train its AI. However, it's frustrating that Meta is forcing users to do this manually. If you use your Meta smart glasses frequently and don't want your voice data stored, you now have to remember to delete your data on a regular basis. As if storing all of your voice recordings weren't creepy enough, Meta also tweaked its policy around video recording on the Ray-Ban smart glasses. 
According to The Verge, Meta sent an email to current Ray-Ban smart glasses users stating that it will analyze any photos and videos you take with the glasses' built-in camera if you have Meta AI voice commands turned on. The only way to keep those photos and videos to yourself, according to The Verge, is to turn off Meta AI voice commands completely, meaning you'll be limited to the glasses' physical touch controls. Laptop Mag has reached out to Meta to clarify its collection policy and will update this story with any further information.

These privacy updates come as rumors heat up about the next generation of Meta's Ray-Ban smart glasses, which are expected to arrive by the end of the year. Meta has been putting increasing emphasis on smart glasses and AR glasses recently, but as impressive as some of the demos have been, privacy updates like this should give you pause before slipping on a pair. Any time you consent to letting a company store voice recordings, you run the risk of your data being misused or of sensitive information being picked up by mistake. For instance, imagine you order a pizza while wearing your Ray-Ban Meta glasses and Meta's AI accidentally records your credit card info, or you answer a phone call from your doctor and Meta AI wakes up by accident. There are countless situations where voice recording with no real opt-out can pose a serious risk to your privacy and your data. As impressive as smart glasses can be, you might want to think twice about them after these policy updates.
[15]
Meta is now selling a conversation translator that is worn on your face
TL;DR: Meta's updated Ray-Ban smart glasses now feature live translation for Spanish, English, French, and Italian, enabling real-time multilingual conversations. Through the power of artificial intelligence, two people who cannot understand each other's languages can now communicate fluidly. Well, almost. Meta has rolled out a new update to its Ray-Ban smart glasses that adds a live translation feature: the wearer simply says, "Hey Meta, start live translation," and the feature is enabled. Once enabled, the conversation can begin, and the wearer of the glasses will hear the conversation translated into their preferred language. As always, there are some caveats. Currently, only four languages are supported: Spanish, English, French, and Italian. Another caveat is that for live translation to work in the ideal setting, both people should be wearing a pair of Meta's Ray-Ban smart glasses. However, the feature does work if one person is wearing the glasses and the other is using the Meta app to receive the translation. What's nice is that users can download each language, making the translation feature available offline. This means you won't need an active internet connection to take advantage of live translation. Meta writes in the announcement: "You'll also soon be able to send and receive direct messages, photos, audio calls, and video calls from Instagram on your glasses. This joins being able to make calls and send messages through WhatsApp and Messenger, as well as the native messaging app on your iPhone or Android phone. No matter where your crew communicates, you can stay plugged in and connected effortlessly. Just say, 'Hey Meta, send a message to Lisa on Instagram.'"
[16]
Meta adds real-time AI features to Ray-Ban smart glasses
Meta has rolled out a significant update to its Ray-Ban Smart Glasses, introducing a "Hey Meta" voice prompt that gives users direct access to Meta's AI assistant. The feature lets users ask the AI chatbot for help, identify objects and places, and perform various tasks. The "Hey Meta" prompt is part of the glasses' "live AI" capability, powered by machine learning. Users can ask Meta AI to identify people, objects, or places visible through the glasses, and to seek help with problems or alternatives. The AI assistant can also translate languages in real time, supporting English, French, Italian, and Spanish, using the glasses' built-in microphone and open-ear speakers. Users can also use voice commands to post on Instagram, send messages on Messenger, and play music via streaming platforms. The Ray-Ban Meta Smart Glasses have received several updates since their release, including integration with Apple Music, which can be controlled by voice. The latest update builds on the introduction of Meta AI to the smart glasses last year, which brought a multimodal AI experience using the Llama model. Meta continues to expand the capabilities of its wearable device, offering users a more comprehensive and integrated experience.
[17]
Meta expands AI access on Ray-Ban smart glasses in Europe
Meta is expanding its AI assistant on Ray-Ban smart glasses to seven more European countries. Users can interact via voice, with new features like live translation and object recognition. Initially delayed due to EU privacy rules, the rollout reflects Meta's push to enhance wearables and boost its AI presence in Europe.

Meta Platforms said on Wednesday it is expanding access to its artificial intelligence assistant, Meta AI, on Ray-Ban smart glasses to seven additional European countries. People in Germany, Austria, Belgium, Denmark, Norway, Sweden and Finland will now be able to interact with Meta AI using voice prompts to get answers to general questions, the Facebook and Instagram parent said. Meta launched its AI technology in Europe in March, a rollout that was initially announced in June last year but was delayed by regulatory concerns over data protection and privacy. While Meta AI launched in the U.S. in 2023, its release in Europe faced several hurdles due to the European Union's stringent privacy and transparency rules. Sprucing up its wearable technology with AI capabilities could help Meta attract new users at a time when the company is investing billions of dollars in its AI infrastructure. Meta said the expansion will also include a live translation feature, which is being broadly rolled out across its markets. Starting next week, it will also release a feature in supported EU countries that lets people ask Meta AI about the things they are looking at and get real-time responses. The company updated Ray-Ban Meta smart glasses with AI video capability and real-time language translation in December 2024, features it had first announced during its annual Connect conference in September last year.
Meta has updated its privacy policy for Ray-Ban Meta smart glasses, enabling AI features by default and expanding AI capabilities across Europe, while also raising concerns about data collection and user privacy.
Meta has announced significant updates to its Ray-Ban Meta smart glasses, introducing new AI capabilities and expanding their availability across Europe. These changes, however, come with updated privacy policies that have raised concerns among users and privacy advocates.
Meta is rolling out several AI-powered features for Ray-Ban Meta smart glasses users in the United States, Canada, and now seven additional European countries [5]. These features include:
Live AI: Users can activate Meta AI with the voice command "Hey Meta, start live AI," allowing the AI to analyze the wearer's surroundings and answer questions about what it sees [2].
Live Translation: The glasses can now translate conversations in real-time between English, French, Italian, and Spanish [2].
Object Recognition: Meta AI can identify and provide information about objects the wearer is looking at [2].
Social Media Integration: Users can post to Instagram or send messages on Messenger using voice commands [2].
Along with these new features, Meta has updated its privacy policy for Ray-Ban Meta glasses, which has raised some privacy concerns:
Default AI Activation: AI features are now enabled by default when the "Hey Meta" wake word is active [14].
Data Collection: Meta will analyze photos and videos taken with the glasses while certain AI features are switched on [1].
Voice Recording Storage: Meta has removed the option to disable voice recording storage. Recordings can be stored for up to one year to improve Meta's products [14].
Opt-Out Difficulties: Users who don't want their data used for AI training must manually delete each recording from the companion app [1].
The updated privacy policy has sparked discussions about user privacy and data collection:
Consent Issues: Inna Tokarev Sela, CEO of AI data company illumex, raised concerns about filming without consent and suggested that users should be able to control what information is exposed when captured by the glasses [3].
AI Training Data: Meta's policy changes allow the company to collect more data to train its AI models, potentially improving features like accent and dialect recognition [1].
Comparison to Other Tech Giants: The move is similar to Amazon's recent policy change for Echo devices, which now process all voice commands through the cloud [1].
Meta's focus on AI-powered wearables indicates a strategic direction for the company:
Premium Model: Reports suggest Meta is planning a higher-end pair of Ray-Ban Meta glasses for release later in 2025, potentially priced around $1,000 [4].
Competitive Landscape: These updates position Meta to compete with other tech giants in the AI and wearable technology space, such as Google's revived Google Glass concept [2].
As Meta continues to invest billions in AI infrastructure, the integration of AI capabilities into wearable technology could attract new users and solidify the company's position in the evolving landscape of AI-powered devices [5].
Reference
[1]
Meta confirms that images and videos analyzed by AI features in Ray-Ban Meta smart glasses may be used to train its AI models, raising privacy concerns for users and bystanders.
The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved