Curated by THEOUTPOST
On Thu, 26 Sept, 12:05 AM UTC
16 Sources
[1]
4 exciting Ray-Ban smart glasses features Meta just announced at Connect 2024
Improvements to the Meta Ray-Bans' multimodal AI bring unique AI features to the smart glasses, including the ability to 'remember' things for you.

At the Meta Connect event earlier today, Mark Zuckerberg showed off a host of new features on the company's flagship Meta Ray-Ban smart glasses. Calling the glasses "the perfect form factor for AI," he framed the new quality-of-life improvements around the glasses' multimodal AI, aiming for more natural interaction (similar to what we saw with Google's Gemini and OpenAI's GPT-4o).

Beyond improvements to communication, the multimodal AI enables some interesting new interactions, giving the glasses the ability to "see" what you see and "hear" what you hear with less context needed from the user. One of the most useful features is the glasses' ability to "remember" things for you, taking note of specific numbers or visual indicators to file away for later. Here's a breakdown of everything that will be rolling out soon.

Similar to other live translation technologies that have emerged this year, the Meta Ray-Bans are getting a live translation feature designed to work in real time (or close to it) with Spanish, French, and Italian. During the event, Zuckerberg demonstrated a conversation with a Spanish speaker, and the glasses translated each speaker's words between Spanish and English within seconds of each line. Of course, not every conversation will involve two people wearing smart glasses, so the company will let users sync the output with the Meta companion app, leveraging the smartphone to display translations.

In addition to the glasses' new features, Meta also teased a translation AI tool for Instagram Reels that automatically translates audio into English and then uses AI to sync the speaker's mouth movements to match the English translation. The result -- in the demo at least -- was a natural-looking video in English using a sample of the speaker's own voice. The feature is in its early stages and available only in Spanish for now on Instagram and Facebook while Meta continues to test the technology.

The demo also showed off the glasses' "photographic memory" by solving a problem we've all had: remembering where we parked. The user looked at the number on the parking spot and simply said, "Remember where I parked." Later, asking the glasses, "Hey Meta, where did I park?" prompted the AI to respond with the parking space number. This kind of on-the-fly "filing away" of knowledge plays to what the AI is best at: recalling specific data in a pre-defined context (a toy sketch of this interaction appears at the end of this article). We'll have to test for ourselves how reliable the feature is for information with fewer visual cues. Additional uses are easy to imagine, from grocery lists to event dates or phone numbers.

Previously, you had to say "Hey Meta" to invoke the glasses' AI, then wait for the prompt to begin your inquiry. Now, you can simply ask the glasses questions in real time, even while in motion, and the multimodal AI will analyze what you're seeing or hearing. One demo showed a user peeling an avocado and asking, "What can I make with these?", without specifying what "these" referred to.
Another demo showed a user searching through a closet and pulling out multiple items of clothing at once, asking the AI to help style an outfit in real time. And as with other popular voice assistants, you can always interrupt Meta AI while conversing with it.

Along the same lines, the glasses' multimodal capabilities extend beyond analyzing a static view. The glasses will recognize things like URLs, phone numbers you can call, and QR codes you can scan instantly.

Lastly, Zuckerberg demoed a clever new accessibility feature: blind and vision-impaired people can use the glasses to broadcast what they see to a volunteer on the other end, who can talk them through the details of what they're looking at. Be My Eyes is an existing program that connects vision-impaired people with sighted volunteers through live video. The demo showed a woman looking at a party invitation with dates and times, but real-world uses could be anything from reading signs to shopping for groceries to navigating a tech gadget.

Finally, Zuckerberg showed off some new designs, including a limited edition of the Ray-Bans with clear, transparent frames, as well as new transition lenses, effectively doubling their usability as both sunglasses and prescription glasses. The Meta Ray-Bans start at $300 and come in nine frame designs, plus the new limited-edition transparent style.
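None of the coverage explains how the "photographic memory" works under the hood, but the interaction pattern is easy to model. Below is a purely illustrative Python sketch of an on-the-fly memory store, with a stubbed describe_scene() standing in for the glasses' vision model; this is a thought experiment under those assumptions, not Meta's implementation.

```python
from datetime import datetime, timezone


def describe_scene() -> str:
    """Stub for the glasses' vision model; a real system would caption the current camera frame."""
    return "parking spot B27, level 2"


class GlassesMemory:
    """Illustrative on-the-fly memory: store what the camera sees, keyed by the user's phrase."""

    def __init__(self):
        self._notes = []  # list of (timestamp, phrase, observation) tuples

    def remember(self, phrase: str) -> None:
        """File away the current scene under the user's own wording."""
        self._notes.append((datetime.now(timezone.utc), phrase.lower(), describe_scene()))

    def recall(self, query: str) -> str:
        """Return the observation whose stored phrase best overlaps the query's words."""
        q = set(query.lower().split())
        scored = [(len(q & set(phrase.split())), obs) for _, phrase, obs in self._notes]
        score, obs = max(scored, default=(0, ""))
        return obs if score > 0 else "I don't have a note about that."


memory = GlassesMemory()
memory.remember("Remember where I parked")
print(memory.recall("Hey Meta, where did I park?"))  # -> parking spot B27, level 2
```

The naive keyword overlap here is just for demonstration; a production assistant would more plausibly match queries to stored notes with embeddings or a language model.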
[3]
Meta Ray-Ban smart glasses just got a huge upgrade with live translation, multi-modal video and more
Meta Connect 2024 is all about AI on your face. Meta CEO Mark Zuckerberg stepped onto the stage to introduce the newest version of the company's AI-powered smart glasses. The big announcement for Meta's smart glasses is much deeper AI integration, along with new AI-powered features coming to the glasses in the future.

Zuckerberg announced that with the updated AI, you'll be able to use more natural and conversational prompts with the smart glasses. One example he gave was asking, "Hey Meta, what kind of smoothie can I make with these?" while showing ingredients. From there, you don't need to repeat "Hey Meta" and can just continue the conversation.

The smart glasses are also becoming a memory bank for your day-to-day life: you can ask the glasses to remember where you parked, read QR codes on flyers, or call numbers you see.

Zuckerberg said that Meta AI on the glasses will be capable of multimodal video, meaning it can give "real-time advice." For example, the demo showed someone getting ready for a party with a Roaring Twenties theme, and the AI helped them pick out appropriate pieces for their outfit.

Probably the most interesting feature Zuckerberg showed off was live translation, with the smart glasses translating into English right in your ears. He said you could use a companion mobile app to translate for someone who isn't wearing smart glasses, and he tested the feature by having an English-Spanish conversation with Mexican MMA fighter Brandon Moreno.

For people with low vision or blindness, Zuckerberg announced a forthcoming partnership with Be My Eyes, which connects users with volunteers over video to help them understand what they're looking at. It will soon work through the smart glasses, so users can show a volunteer their view and hear the responses in the glasses.

Additionally, some smaller new features include voice control for Spotify and Amazon Music, as well as new iHeartRadio and Audible integrations.

In a throwback to my see-through-N64-loving heart, Zuckerberg announced a clear, limited-edition version of the Ray-Ban Meta glasses that lets you see the technology inside the frames. It does appear these will be in short supply if you're interested. Meta also announced that it is teaming up with EssilorLuxottica to create a variety of lenses, from prescription to optical and transition.
[4]
Ray-Ban Meta Smart Glasses are getting even more AI features, like live language translations and Meta AI for live video
The Ray-Ban Meta Smart Glasses might be the most popular smart glasses around, and that's likely thanks to their feature set and housing. Meta partnered with Ray-Ban to bring the iconic Wayfarers into the technological age with slightly thicker frames, two cameras, speaker tech, microphones, and connectivity. These smart glasses started out as a unique way to capture photos or videos, at times even more 'in the moment', given that you didn't need to take your phone out and launch the camera.

In recent months, Meta has infused these smart glasses with Meta AI, enabling you to look at something and simply say, "Hey Meta, what's this?" and let it look, analyze, and then provide you an answer. It's pretty neat. Now, at Meta Connect 2024, the team working on the Smart Glasses wants to make them even smarter - and if you guessed that they're doing that with AI, you'd be correct.

Kicking things off with what might be the most helpful new feature, the Ray-Ban Meta Smart Glasses will get live language translation later this year. Similar to what Samsung accomplished with the Galaxy Buds 2 Pro or Google with the Pixel Buds Pro 2, the Ray-Ban Metas will be able to translate languages near-live, initially between English and Spanish, Italian, and French. This could prove pretty helpful, and more natural than attempting to do this with earbuds in, as it's baked into smart glasses you might already be wearing daily if you've opted to install prescription lenses.

Furthermore, beyond asking Meta to set a reminder verbally, you can now set up reminders based on things that you see - and that Meta is therefore viewing. That could be as you're getting milk out of the fridge and realize you're almost out, or a package near your front door that you need to take with you. This feature should be rolling out sooner rather than later.

Similarly, you'll now be able to scan QR codes for events, phone numbers, and even full contact information. If a QR code is visible, you'll be able to ask Meta via the Ray-Ban Meta Smart Glasses to scan it - we imagine the information will then appear in the Android or iOS companion app (a flow sketched in code at the end of this article).

Likely the most ambitious forthcoming feature, also set to arrive later this year, is Meta AI for video, meaning Meta can view what you're looking at in real time, not just an image snapshot, and provide clarity or answer questions. This could be helpful for navigating around a city, cooking a meal, completing a math problem, or finishing a Lego set. It's a big step and would raise some privacy concerns, as it's a live view from your glasses being processed immediately. You'll also need the Ray-Ban Meta Smart Glasses connected to the internet via an iPhone or Android for this to work, since the information needs to be processed in real time. Still, it gives us an idea of where Meta is headed with the smart glasses category, and it's great to see Meta continuing to roll out new features.

And that's the good news here - these updates don't, so far as we know, require new hardware. They are set to arrive as over-the-air updates for folks who already have the Ray-Ban Meta Smart Glasses, or who purchase them. Another update on the way is integration with Amazon Music, Audible, iHeart, and Spotify, which will give you easier access to your favorite songs, artists, podcasts, and books hands-free.
You'll also see new Transitions lens options arriving from EssilorLuxottica, the eyewear giant behind brands ranging from Dolce & Gabbana to Oakley. So, if you haven't loved the available looks enough to get a pair yet, or want to freshen yours up, once those hit the scene it'll be a good time to consider them again. We'll be going hands-on with these new features, from language translation to Meta AI for video, as soon as we can, so stay tuned to TechRadar.
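TechRadar's guess at the QR flow (the glasses spot a code, the phone opens the result) maps onto a standard detect-and-decode step. Here is a minimal Python sketch using OpenCV's built-in QR detector on a single captured frame; handoff_to_phone is a hypothetical placeholder for the companion-app handoff, and nothing here reflects Meta's actual pipeline.

```python
import cv2  # pip install opencv-python


def handoff_to_phone(url: str) -> None:
    """Hypothetical placeholder: a real device would push the decoded link to the companion app."""
    print(f"Opening on phone: {url}")


def scan_frame_for_qr(frame) -> None:
    """Detect and decode a QR code in a single camera frame with OpenCV's stock detector."""
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(frame)
    if points is not None and data:
        handoff_to_phone(data)


# Usage: feed one frame grabbed from any camera source (here, a saved snapshot).
frame = cv2.imread("snapshot_from_glasses.jpg")
if frame is not None:
    scan_frame_for_qr(frame)
```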
[5]
Meta updates Ray-Ban smart glasses with real-time AI video, reminders, and QR code scanning
Meta CEO Mark Zuckerberg announced updates to the company's Ray-Ban Meta smart glasses at Meta Connect 2024 on Wednesday. Meta continued to make the case that smart glasses can be the next big consumer device, announcing new AI capabilities and familiar smartphone features coming to Ray-Ban Meta later this year.

Some of Meta's new features include real-time AI video processing and live language translation. Other announcements -- like QR code scanning, reminders, and integrations with iHeartRadio and Audible -- seem to give Ray-Ban Meta users the features from their smartphones that they already know and love.

Meta says its smart glasses will soon have real-time AI video capabilities, meaning you can ask the Ray-Ban Meta glasses questions about what you're seeing in front of you, and Meta AI will verbally answer you in real time. Currently, the Ray-Ban Meta glasses can only take a picture and describe it to you or answer questions about it, but the video upgrade should make the experience more natural, in theory at least. These multimodal features are slated to come later this year.

In a demo, users could ask Ray-Ban Meta questions about a meal they were cooking, or about city scenes taking place in front of them. The real-time video capabilities mean that Meta's AI should be able to process live action and respond audibly. This is easier said than done, however, and we'll have to see how fast and seamless the feature is in practice. We've seen demonstrations of these real-time AI video capabilities from Google and OpenAI, but Meta would be the first to launch such features in a consumer product (a schematic sketch of the idea follows this article).

Zuckerberg also announced live language translation for Ray-Ban Meta. English-speaking users can talk to someone speaking French, Italian, or Spanish, and their Ray-Ban Meta glasses should be able to translate what the other person is saying into their language of choice. Meta says this feature is coming later this year, with more languages to follow.

The Ray-Ban Meta glasses are also getting reminders, which will allow people to ask Meta AI to remind them about things they look at through the smart glasses. In a demo, a user asked their Ray-Ban Meta glasses to remember a jacket they were looking at so they could share the image with a friend later on.

Meta announced that integrations with Amazon Music, Audible, and iHeart are coming to its smart glasses. This should make it easier for people to listen to music on their streaming service of choice using the glasses' built-in speakers.

The Ray-Ban Meta glasses will also gain the ability to scan QR codes or phone numbers. Users can ask the glasses to scan something, and the QR code will immediately open on the person's phone with no further action required. The smart glasses will also be available in a range of new Transitions lenses, which respond to ultraviolet light to adjust to the brightness of your surroundings.
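Mechanically, "real-time AI video" boils down to sampling camera frames and shipping them to a multimodal model along with the spoken question. The Python sketch below shows that loop in schematic form; both functions it calls are hypothetical stubs, and the real system's frame rate, batching, and on-device-versus-cloud split are unknowns the coverage doesn't settle.

```python
import time


def capture_frame() -> bytes:
    """Stub for grabbing the current camera frame from the glasses."""
    return b"<jpeg bytes>"


def ask_multimodal_model(question: str, frames: list) -> str:
    """Stub standing in for a multimodal model call (e.g., a vision-capable Llama endpoint)."""
    return f"(answer based on {len(frames)} frames)"


def live_video_query(question: str, seconds: float = 2.0, fps: int = 2) -> str:
    """Sample a short burst of frames, then send them with the question in one request."""
    frames = []
    for _ in range(int(seconds * fps)):
        frames.append(capture_frame())
        time.sleep(1 / fps)  # pace the sampling at the chosen frame rate
    return ask_multimodal_model(question, frames)


print(live_video_query("What can I make with these?"))
```

Even this toy version makes the latency trade-off visible: more frames give the model more context but delay the spoken answer, which is presumably why "fast and seamless in practice" remains the open question.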
[6]
Ray-Ban Meta Smart Glasses are more of an AI device than ever with new updates
Here's what's coming soon to Ray-Ban Meta Smart Glasses: a lot of AI features.

If you do a quick online search for Ray-Ban Meta Smart Glasses right now, you'll find that the wearable is mostly marketed for its quick photo capturing and livestreaming capabilities. However, at the Meta Connect 2024 event on Wednesday, Meta founder and CEO Mark Zuckerberg didn't have much to say about photos and videos during the Ray-Ban Meta Smart Glasses section of the presentation. In fact, Zuckerberg introduced the Ray-Ban Meta Smart Glasses primarily as an AI device.

"Glasses are a new AI device category," Zuckerberg said, noting that his company has just caught up with consumer demand for Meta smart glasses after sales took off faster than he expected.

Aside from a new limited-edition Ray-Ban Meta Smart Glasses device with clear transparent frames, there weren't any new smart glasses hardware announcements from Meta. However, Zuckerberg did share several new features that he said were coming to the Meta smart glasses in a set of updates releasing over the next couple of months -- all of them AI-related.

Meta AI is already integrated into Ray-Ban Meta Smart Glasses in much the same way other companies' voice assistants are integrated into their devices. But, according to Zuckerberg, new updates will make these interactions "more natural and conversational." For example, users currently have to prompt their Ray-Ban Meta Smart Glasses with the phrase "look and tell me" when they have a question. Zuckerberg's demo showcased how users will no longer have to do that. Users will just need to activate the feature with the "Hey Meta" prompt and then ask their question; Meta AI will automatically know the question refers to whatever the user is looking at through the glasses. Furthermore, after the initial "Hey Meta," Meta AI will no longer require users to start each prompt with that phrase and will simply continue the conversation.

Translation works similarly to what's been seen in other smart glasses: a user can access live, real-time audio translations of another language through the glasses when conversing with another person. The demo seemed to work nearly perfectly at Meta Connect when translating from Spanish to English and English to Spanish.

Zuckerberg explained the multimodal video AI feature through a demo showing a user trying on outfits. Through this feature, Meta AI was able to offer fashion advice and suggestions based on the user's outfit and their specific question about it.

Ray-Ban Meta Smart Glasses will also soon be able to automatically remember things for users. The example showcased at Meta Connect involved Meta AI recalling the parking space number where the user parked their car. The user did not have to prompt Meta AI to do so; it appeared to remember the number simply because the user viewed it through the glasses. Adding on to that is a similar capability where users will soon be able to look at a flier or advertisement and ask the smart glasses to call the phone number or scan the relevant QR code. The glasses can also remember those things if a user wants to return later to something previously viewed through the glasses.
Other updates coming to Ray-Ban Meta Smart Glasses include the ability to voice control Spotify and Amazon Music through the device as well as new integrations with apps like Audible and iHeartRadio. Meta also announced a partnership with Be My Eyes, a mobile app that connects blind and low-vision people with volunteers via live video to talk through what's in front of them. The app will work directly through Ray-Ban Meta Smart Glasses and volunteers will be able to see through the user's glasses in order to provide assistance.
[7]
Meta unveiled a huge update to its Ray-Ban Smart Glasses | Digital Trends
Ray-Ban Meta Smart Glasses have been a big success, and the company is continuing to expand the capabilities of these stylish tech shades, which include a camera and speakers. You'll soon get live translation, reminders, and more, along with a new clear style.

Since these Ray-Bans can see and hear, Meta is leveraging the advanced AI capabilities of its new Llama 3.2 model to enable live translation. In a live demo, Meta founder Mark Zuckerberg spoke with Brandon Moreno, one speaking English and the other Spanish, while their Meta glasses translated for each person. Despite a small delay -- about one to three seconds before the AI spoke the translation -- it's a great addition for owners of these smart glasses. Zuckerberg mentioned live translations of Spanish, French, and Italian, with more languages to follow.

Another Meta AI improvement gives it memory. You'll be able to ask your smart glasses to remind you where you parked or to pick up apples at the store. With Ray-Ban Meta Smart Glasses becoming more helpful in everyday life, you might find yourself reaching for them more often.

The Ray-Ban smart glasses can already describe and answer questions about what you see, snapping a photo with the integrated camera. Now that multimodal AI is gaining more practical capabilities: with the latest update, you can scan QR codes with your Ray-Ban Meta Smart Glasses and call phone numbers you see in print or on a billboard just by asking. Your stylish Ray-Bans will also give fashion advice when you're selecting clothes and jewelry.

Be My Eyes is a game-changing service that connects people with low vision or blindness to sighted volunteers for assistance. Until now, a phone served as the live-stream camera, but soon Ray-Ban Meta Smart Glasses will be all that's needed.

Ray-Ban Meta Smart Glasses also serve as earbud replacements, letting you take calls and play music. Meta has improved integrations with Spotify and Amazon Music, and added Audible and iHeart. There will be more integrations over time and ongoing improvements to the AI, making the Ray-Ban Meta Smart Glasses a device that keeps getting better.

To celebrate the success of this step toward an AR future, Meta is launching a new limited-edition version of Ray-Ban Meta Smart Glasses in clear frames. We've seen these worn by Meta CTO Andrew "Boz" Bosworth in the past, and now you'll be able to get them too.
[8]
Meta Teaches Its Ray-Ban Smart Glasses Some New AI Tricks
The on-board Meta AI agent can now set a reminder to buy whatever the camera is pointed at, among other enhancements.

The Ray-Ban Meta glasses are the first real artificial intelligence wearable success story. In fact, they're quite good. They've got that chic Ray-Ban styling, meaning they don't look as goofy as some of the bulkier, heavier attempts at mixed reality face computers. The on-board AI agent can answer questions and even identify what you're looking at using the embedded cameras. People also love using voice commands to capture photos and videos of whatever is right in front of them without whipping out their phone. Soon, Meta's smart glasses are getting more of these AI-powered voice features.

Meta CEO Mark Zuckerberg announced the newest updates to the smart glasses' software at his company's Meta Connect event today. The company also used Connect to announce its new Meta Quest 3S, a more budget-friendly version of its mixed reality headsets, and unveiled a host of other AI capabilities across its various platforms, with new features being added to its Meta AI and Llama large language models.

As far as the Ray-Bans go, Meta isn't doing too much to mess with a good thing. The smart spectacles got an infusion of AI tech earlier this year, and now Meta is adding more capabilities to the pile, though the enhancements here are pretty minimal. You can already ask Meta AI a question and hear its responses directly from the speakers embedded in the frames' temple pieces. Now there are a few new things you can ask or command it to do. Probably the most impressive is the ability to set reminders. You can look at something while wearing the glasses and say, "Hey, remind me to buy this book next week," and the glasses will understand what the book is, then set a reminder. In a week, Meta AI will tell you it's time to buy that book.

Meta says live translation is coming to the glasses soon, meaning people speaking different languages could hear translated speech in the moment -- or at least in a somewhat timely fashion. It's not clear exactly how well that will work, given that the Meta glasses' previous written translation abilities have proven hit or miss.

There are new frame colors and lens colors being added, and customers now have the option to add transition lenses that increase or decrease their shading depending on the current level of sunlight. Meta hasn't said exactly when these additional AI features will be coming to its Ray-Bans, except that they will arrive sometime this year. With only three months of 2024 left, that means very soon.
[9]
Meta's Ray-Ban branded smart glasses are getting AI-powered reminders and translation features
Barring some new transition lenses, there's no updated hardware this time. Meta's AI assistant has always been the most intriguing feature of its second-generation Ray-Ban smart glasses. While the generative AI assistant had fairly limited capabilities when the glasses launched last fall, the addition of real-time information and multimodal capabilities offered a range of new possibilities for the accessory. Now, Meta is significantly upgrading the Ray-Ban Meta smart glasses' AI powers. The company showed off a number of new abilities for the year-old frames onstage at its Connect event, including reminders and live translations.

With reminders, you'll be able to look at items in your surroundings and ask Meta to send a reminder about them. For example: "Hey Meta, remind me to buy that book next Monday." The glasses will also be able to scan QR codes and call a phone number written in front of you. In addition, Meta is adding video support to Meta AI so that the glasses will be better able to scan your surroundings and respond to queries about what's around you.

There are other, more subtle improvements. Previously, you had to start a command with "Hey Meta, look and tell me" in order to get the glasses to respond to a command based on what you were looking at. With the update, though, Meta AI will be able to respond to more natural queries about what's in front of you. In a demo with Meta, I was able to ask several questions and follow-ups, like "Hey Meta, what am I looking at?" or "Hey Meta, tell me about what I'm looking at."

When I tried out Meta AI's multimodal capabilities on the glasses last year, I found that Meta AI was able to translate some snippets of text but struggled with anything more than a few words. Now, Meta AI should be able to translate longer chunks of text. And later this year the company is adding live translation abilities for English, French, Italian and Spanish, which could make the glasses even more useful as a travel accessory.

And while I still haven't fully tested Meta AI's new capabilities on its smart glasses just yet, it already seems to have a better grasp of real-time information than what I found last year. During a demo with Meta, I asked Meta AI to tell me who is the Speaker of the House of Representatives -- a question it repeatedly got wrong last year -- and it answered correctly the first time.
[10]
Meta Connect 2024: Ray-Ban Meta Glasses Get an AI Overhaul
Meta officially launched its much-anticipated Quest 3S at Meta Connect 2024. Alongside the highlight product, however, the tech giant also announced AI improvements for the Ray-Ban smart glasses. For those unaware, this Meta and Ray-Ban collaboration is powered by Meta AI and the new multimodal Llama 3.2 models. Ray-Ban smart glasses users can summon the AI assistant by saying, "Hey Meta," then ask questions and receive audio responses in return. Using the integrated camera system, users can also take images and ask questions about them. Until now, however, that was about it. A slew of new AI features now makes the smart glasses a bit more versatile.

For starters, there's Reminders, which lets you ask Meta AI to remind you of important things that you see in your day-to-day life. Additionally, you can now scan QR codes directly from the glasses.

Moving on, the Ray-Ban smart glasses are also getting support for video in Meta AI, thanks to the latest Llama model. Using this, users will be able to feed Meta AI video of their surroundings and ask related questions, for a more natural conversation with the assistant. This feature is coming later this year.

The Meta AI-backed glasses will also be getting Live Language Translation, which will let users communicate in English, French, Italian, or Spanish. This, too, is slated to land later this year. Additionally, users will be able to enjoy integrations with Spotify, Amazon Music, Audible, and iHeart. Finally, the glasses are getting support for a new range of Transitions lenses from EssilorLuxottica.

As for technical specifications, the Ray-Ban Meta Smart Glasses feature a 12MP ultra-wide sensor that takes photos at 3024 x 4032 resolution and captures 1440 x 1920 video at 30FPS. Alongside the voice assistant, you can also use touch controls on the glasses. In the battery department, the glasses are claimed to deliver 4 hours of use on a single charge. You also get 32GB of internal storage, along with Bluetooth 5.2 and Wi-Fi 6, plus two open-ear speakers (one on each side) and five microphones. As for pricing, Ray-Ban's Meta series of smart glasses starts at $299.
[11]
Meta Ray-Bans are getting live translation | TechCrunch
During Wednesday's Meta Connect event, CEO Mark Zuckerberg announced a slew of new AI-fueled features for the company's Ray-Ban collaboration. The most interesting of the bunch is the addition of real-time translation through the glasses' speakers. Meta explains:

Soon, your glasses will be able to translate speech in real time. When you're talking to someone speaking Spanish, French or Italian, you'll hear what they say in English through the glasses' open-ear speakers. Not only is this great for traveling, it should help break down language barriers and bring people closer together. We plan to add support for more languages in the future to make this feature even more useful.

The companies have yet to announce a timeline for this specific AI addition, but depending on implementation, it could be a tremendously useful addition to the livestreaming glasses. Live translation has been a kind of holy grail for established hardware firms and startups alike. Google, notably, introduced a pair of concept glasses with a heads-up display capable of translating in real time. That, however, never made it past the prototype stage. Meta has yet to announce which languages will be initially available, though judging from the above statement, it seems the feature will launch with English plus Romance languages like Spanish, French, and Italian.
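A live translation feature like the one described is conventionally built as a three-stage pipeline: speech recognition, machine translation, then speech synthesis into the open-ear speakers. The Python sketch below wires stubbed stages together to show the data flow and why a delay of a second or more per line is plausible; this is an assumed architecture, not Meta's disclosed design, and every function here is a placeholder.

```python
def transcribe(audio_chunk: bytes, language: str) -> str:
    """Stub: speech-to-text on one short utterance."""
    return "¿Dónde está la estación de tren?"


def translate(text: str, source: str, target: str) -> str:
    """Stub: machine translation of the recognized text."""
    return "Where is the train station?"


def speak(text: str) -> None:
    """Stub: text-to-speech routed to the glasses' open-ear speakers."""
    print(f"[speaker] {text}")


def live_translate(audio_stream, source: str = "es", target: str = "en") -> None:
    """Each utterance flows STT -> MT -> TTS; the audible delay is the sum of all three stages."""
    for chunk in audio_stream:
        text = transcribe(chunk, source)
        speak(translate(text, source, target))


live_translate([b"<utterance 1>"])  # -> [speaker] Where is the train station?
```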
[12]
Meta Ray-Bans Are Getting AI Improvements and I Got to Test Them Out
In a year of promising AI gadgets, Meta's second-gen Ray-Ban smart glasses ended up being one of the best, and something I kept on my face far longer than I ever expected. They've been an unexpected hit for Meta, too, and a vehicle for exploring camera-enabled AI features that can analyze the real world on the fly.

At Meta's Connect conference, a show dedicated to VR, AR and a lot of AI, Meta announced new updates Wednesday for the glasses, though no new hardware. Instead, last year's versions will see new features such as live translation, camera recognition of QR codes and phone numbers, and support for AI analysis of live video recording.

I got to check out a few of those features at a press event before the announcement, using a pair of Ray-Bans perched over my own glasses (I didn't bring contact lenses). The demos I walked through showed how I could look at a QR code with my glasses and automatically open the linked website on my phone after my glasses snapped a photo. I looked at a model of a street full of toy cars and asked the glasses to describe the cars in front of me, and a photo was snapped without me having to say the usual "look" trigger word. I also tried deeper music control in the glasses, making specific requests for tracks in Spotify (a feature that's also coming with support for Apple Music and Amazon Music). All of these features will work on iOS and Android using the Meta View app.

I'm most interested in the live translation feature and how quickly it might respond to actual conversations. I also wonder how AI assistance with recorded video clips might work. At some point, AI-assisted glasses might have a much more continuous camera-based awareness of my world. But taking photos and videos also drains battery life, something the Meta Ray-Bans already struggle with over the course of a full day. According to Meta, battery life improvements are also in the works, which I'd love to see, although I'm not sure how it'll manage that.

My demos did have a few connection hiccups, too, something I've experienced at times using Meta Ray-Bans connected to my iPhone. For example, voice requests for music playback sometimes hang in a way that Siri requests on AirPods don't. Regardless, expect the Meta Ray-Bans to keep getting better. That's good news for anyone who already has a pair, although I wonder when Meta will make strides toward a third-gen version -- maybe next year.
[13]
The Ray-Ban Meta Smart Glasses Will Now Remind You to Restock the Fridge
They'll also receive live language translation abilities later this year. Last year, we saw Meta slightly rebrand its smart glasses lineup, adding a host of new features to the previously unexciting 'Stories' and renaming them Ray-Ban Meta Smart Glasses. This time, the Menlo Park company has improved its smart glasses further with a few AI additions. Some of these updates will not be available at the time of the Meta Connect announcement but will roll out later this year.

Meta is bringing Reminders to its smart glasses. This means you can ask your glasses to remind you about something you see at that moment. Meta uses the example of running low on your favorite cereal: your glasses will take note when you mention it in the kitchen and remind you when you're at the grocery store.

Like it or not, more and more things are ending up behind QR codes. Ray-Ban Meta glasses can now scan QR codes directly, alleviating the need to pull out your phone every time.

One of the features coming later this year is Meta AI's ability to support video. This will let you share your point of view so the assistant sees what you see in real time. Meta says this could be helpful in situations such as exploring a new city or preparing a meal.

Ray-Ban Meta glasses will also receive live language translation abilities later this year. Initially, they will only translate between English and French, Italian, or Spanish, with more languages hopefully coming soon.

The glasses will also gain integration with Spotify, Amazon Music, Audible, and iHeart. Last year, the glasses received redesigned speakers that were apparently 50% louder and more bass-heavy than their predecessors; I'm excited to test Spotify on the revamped audio output. Meta is also introducing five new Transitions lenses with its longtime lens partner and Ray-Ban owner, EssilorLuxottica.
[14]
'Hey Meta' Now Kicks Off Conversations With Ray-Ban AI Glasses
New AI features for Ray-Ban Meta smart glasses put an emphasis on conversational abilities. The wearer can now kick off a conversation by saying "Hey Meta" and ask follow-up questions without saying "Hey Meta" again. You also no longer have to say "look" to ask the glasses about something you're seeing.

The glasses can act as a conversation partner while you're on the go, thanks to a newly added video feed that provides "continuous real-time help," Meta says. For example, they can point out landmarks on a walking tour, or help with meal planning while the wearer walks around the grocery store. You can also ask the glasses to remember where you parked, or to record and send voice messages on WhatsApp and Messenger.

Meta is adding real-time translation capabilities "soon." If the wearer is talking with someone speaking Spanish, French, or Italian, they will hear what that person is saying in English through the tiny, open-ear speakers.

A slew of new partnerships integrates the glasses with popular apps like Spotify, Amazon Music, Audible, and iHeart. Meta is also partnering with Be My Eyes, a free app that "connects blind and low-vision people with sighted volunteers." The volunteer can step into the vision-impaired person's point of view and explain their surroundings.

The Ray-Ban updates are in theme with the other features announced today at Meta Connect. The event was all about virtual reality, including the launch of the budget-friendly Quest 3S and the ability to have conversations with the Meta AI assistant in Meta-owned apps.
[15]
Ray-Ban Meta Glasses Hands On: More Assistant-like, Just Not Yet
Meta Connect announced new features for the smart glasses, but they're all still very basic. The Ray-Ban Meta glasses are the best way to discreetly snap the world around you without fumbling for the phone in your pocket. But Meta wants its glasses to be legitimized as more than just an accessory with solid camera capabilities for documenting the day-to-day. It wants you to consider those glasses as your gateway to AI interaction; those stylish sunglasses could also be the digital assistant in your ear. Unfortunately, that still seems a world away from where the Ray-Ban Meta glasses currently are.

At Meta Connect, the company announced several new AI-enhanced features for the Ray-Ban glasses. The ones I got to try during a demonstration include reminders, which let you ask Meta AI to remind you of things, and the ability to scan QR codes. I didn't get to use what Meta won't release until later this year, including Meta AI's ability to translate live speech between English and French, Italian, or Spanish. It's a bummer that those features aren't available yet, as they're the most assistant-like and could truly show the potential of Meta AI.

After I asked it to, the Ray-Ban Meta glasses reminded me that bread was in the oven. Granted, I had to set a time for it to remind me. I initially requested a reminder about the bread when I got to a particular place, but Meta AI clapped back that it couldn't tap into location data quite yet -- again, a missed opportunity for a decidedly assistant-like feature.

You can recall reminders you've left yourself as breadcrumbs for information. One example offered to me was asking Meta AI to remember where you parked in the garage at the airport. It can then recall that information when you ask it to fetch it. But wouldn't it be better if you didn't have to ask? Reminders are the kind of feature that would benefit from location data, something Google's and Apple's assistants can already use. While it's nice that reminders are baked into this specific set of glasses, they'll have to do more soon.
[16]
Meta Platforms : Ray-Ban | Meta Glasses Are Getting New AI Features and More Partner Integrations
We're advancing our partnerships with Spotify and Amazon Music and adding new ones with Audible and iHeart to give you more ways to listen.

Since we first launched our Ray-Ban Meta glasses, they've been so popular that we've had trouble keeping them on the shelves until recently. People have captured more than 100 million photos and videos with their glasses and shared millions of moments with friends and family since we first introduced them. Whether you're exploring a new city, sitting on the sidelines of a sporting event, or simply trying to be more present in your daily life, Ray-Ban Meta glasses are the perfect companion to help you experience the world, share your perspective and capture moments, completely hands-free. And now we're adding new AI features, expanding partnerships and more.

First, we're making it easier to have a conversation with Meta AI. Kick off your conversation with "Hey Meta" to ask your initial question and then you can ask follow-up questions without saying "Hey Meta" again. And you no longer need to say "look and" to ask Meta AI questions about what you're looking at.

We're adding the ability for your glasses to help you remember things. Next time you fly somewhere, you don't have to sweat forgetting where you parked at the airport - your glasses can remember your spot in long-term parking for you. And you can use your voice to set a reminder to text your mom in three hours when you land safely.

You can now ask Meta AI to record and send voice messages on WhatsApp and Messenger while staying present. This comes in especially handy when your hands are full or when you can't get to your phone easily to write out a text.

We're adding video to Meta AI, so you can get continuous real-time help. If you're exploring a new city, you can ask Meta AI to tag along, and then ask it about landmarks you see as you walk or get ideas for what to see next - creating your own walking tour hands-free. Or, if you're at the grocery store and trying to plan a meal, you can ask Meta AI to help you figure out what to make based on what you're seeing as you walk down the aisles, and if that sauce you're holding will pair well with that recipe it just suggested.

Soon, your glasses will be able to translate speech in real time. When you're talking to someone speaking Spanish, French or Italian, you'll hear what they say in English through the glasses' open-ear speakers. Not only is this great for traveling, it should help break down language barriers and bring people closer together. We plan to add support for more languages in the future to make this feature even more useful.

We're partnering with Be My Eyes, a free app that connects blind and low-vision people with sighted volunteers so they can talk you through what's in front of you. Thanks to the glasses and POV video calling, the volunteer can easily see your point of view and tell you about your surroundings or give you some real-time, hands-free assistance with everyday tasks, like adjusting the thermostat or sorting and reading mail.

In addition, we're advancing our integrations with Spotify and Amazon Music, and adding new partnerships with Audible and iHeart. You can use your voice to search, discover and play content on the go. Ask to play by song, artist, album, or audiobook. And you can get more information about the content your glasses are playing ("Hey Meta, what album is this from?").
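The "say Hey Meta once, then keep talking" behavior Meta describes reads like a wake-word gate that opens a timed conversation window. Here is a hedged Python sketch of that session logic; the 30-second timeout is invented purely for illustration, since Meta hasn't published how (or whether) its session actually expires.

```python
import time

CONVERSATION_WINDOW_S = 30  # invented value for illustration; Meta hasn't published a timeout


class WakeWordSession:
    """Once the wake word opens a session, follow-ups skip the wake word until the window lapses."""

    def __init__(self):
        self._last_active = float("-inf")

    def handle(self, utterance: str) -> bool:
        """Return True if the utterance should be routed to the assistant."""
        now = time.monotonic()
        in_session = (now - self._last_active) < CONVERSATION_WINDOW_S
        if utterance.lower().startswith("hey meta") or in_session:
            self._last_active = now  # each accepted utterance extends the window
            return True
        return False


session = WakeWordSession()
print(session.handle("Hey Meta, what am I looking at?"))  # True: wake word opens the session
print(session.handle("And what album is this from?"))     # True: follow-up inside the window
```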
Meta has announced a range of new AI-driven features for its Ray-Ban smart glasses, including live translation, multi-modal AI, and enhanced video capabilities. These updates aim to make the glasses more versatile and useful in everyday life.
Meta's Ray-Ban smart glasses are set to receive a groundbreaking update with the introduction of live translation capabilities. This feature will allow users to have real-time conversations with people speaking different languages, breaking down communication barriers [1]. Additionally, the glasses will offer a language learning mode, helping users practice and improve their language skills in real-world scenarios [2].
The smart glasses will now feature multi-modal AI, enabling users to interact with their surroundings more intuitively. Users can ask questions about objects in their field of view, and the AI will provide relevant information [3]. Meta AI will be integrated into the glasses, allowing for more natural conversations and assistance during live video streaming [4].
Meta is upgrading the video features of the Ray-Ban smart glasses. Users will now be able to stream live video directly from their glasses to platforms like Facebook and Instagram. The new "see what I see" feature allows for more immersive sharing experiences [1]. The glasses will also support ultra-wide field of view video capture, providing a more comprehensive visual experience [3].
One of the most intriguing features is the AI-powered memory assistance. The glasses can now help users remember important information, such as where they parked their car or placed their keys. This feature utilizes the glasses' cameras and AI to create a searchable log of the user's experiences [2].
Meta has added QR code scanning functionality to the smart glasses, allowing users to quickly access information or websites by simply looking at a QR code. This feature enhances the glasses' utility in various scenarios, from shopping to accessing digital content [5].
These new features will be rolled out to existing Ray-Ban Meta smart glasses through software updates. The company has stated that most features will be available by fall 2024, with some rolling out earlier. The updates demonstrate Meta's commitment to enhancing the capabilities of its wearable technology, positioning the Ray-Ban smart glasses as a versatile tool for daily life [4].