Curated by THEOUTPOST
On Fri, 4 Oct, 12:02 AM UTC
3 Sources
[1]
After using Meta's Ray-Ban glasses, I'm willing to bet AI is the key to their future
On top of Orion -- Meta's prototype for the world's first pair of full-featured holographic glasses -- the company also announced a major update to its Meta Ray-Ban glasses last week during Connect. Instead of telling you what's new, I'll give you one guess what the biggest addition to the Meta Ray-Ban smart glasses was... If your answer is AI, congratulations, you won tech's easiest guessing game. And sure, it's no surprise that Meta, like pretty much every tech company, is introducing AI into its glasses, but just because the move is expected doesn't mean it's not significant. AI may not be appropriate for every product, but it could be an absolute game-changer for smart glasses.

In case you haven't noticed, AI is showing up everywhere: in Google search, in Photoshop, and in your email inbox. Naturally, some of those applications aren't going to pan out -- right now, there's a "throw AI at the wall and see if it sticks" mentality. But some of those AI uses will stick, and after briefly using Meta's Ray-Ban glasses, I'm willing to bet smart glasses are one of those arenas.

There are many hurdles to making smart glasses -- miniaturization is one that comes to mind -- but devising a new input method will be among the top priorities. If smart glasses are to be the "next thing" after smartphones, they'll need to be fully featured but also as intuitive to use as our beloved glass slabs, with a UI people immediately understand. The problem? There is no touchscreen on a pair of smart glasses. In the future, smart glasses could adopt hand- and eye-tracking input similar to the Apple Vision Pro, but right now, that tech is nowhere near viable in a form factor so small. For now, we have one thing -- voice assistants.

Meta has already tapped that potential, since its newest Ray-Ban update comes pre-loaded with Meta AI. I got a brief chance to use Meta AI at a recent event, and though it's still very much a work in progress, I can see the appeal. The infusion of large language models (LLMs) promises to make voice assistants much more savvy. By nature, LLMs are better at understanding natural language prompts and are capable of executing more complex, multi-step commands. That means, for example, you can put on your Ray-Ban smart glasses, look at a recipe written in French, and say, "Hey Meta, translate this recipe for me." I actually did this in Meta's demo space. The results were maybe not perfect -- I had to really focus my vision on the card, and the AI didn't quite hear my prompt in a loud room -- but with some tinkering, we got there.

With a more capable, AI-supercharged voice assistant, smart glasses might not immediately need a more fine-tuned, complex UI. Want to launch music? Take a picture? Set a reminder or an alarm? Just shout. Meta AI is already promising to go beyond the banal alarm-setting and app-launching of our current crop of voice assistants. For instance, Meta AI can identify objects in your field of view and give you information about them. That means you could have your "what are those?!" moment with a pair of shoes, or look at a piece of art and have your glasses give you more background on who made it. It's still early days, but this kind of connection between computer vision and our never-ending stream of information on the web feels genuinely novel and, with refinement, actually practical. There's a lot to do between now and our Orion future, but Meta AI will clearly be a stepping stone.
In the same way that Meta AI promises to simplify the UI on headsets like the Quest, it also stands to pave the way for early-stage smart glasses like the Meta Ray-Bans. I'm not sold on the usefulness of LLMs across the board, but if there's one thing they're exceptional at, it's understanding us. It's taken a long time for voice assistants to mature, but with LLMs, we could finally get there. Both Amazon and Apple have promised far more functional versions of their Alexa and Siri voice assistants, respectively, and Meta's AI has already started incorporating some of that LLM firepower. AI might not open the floodgates for smart glasses, but it might just convince us that they're legitimately worth our investment.
[2]
Meta Ray-Bans' New AI Camera Features Are Arriving Now
Meta sees the future of smart glasses as being always-ready AI assistants. Its Meta Ray-Bans have already been pushing in that direction, and a number of features announced a week ago by Mark Zuckerberg, which I tried out on Meta's campus, are becoming available starting today. The extra camera-based AI features look like they'll further blur the line between explicit AI requests and the glasses' proactive use of the camera.

A new app and firmware update rolling out now promises a more natural set of requests the glasses will respond to when taking photos. Instead of relying on "look" as a general trigger for the glasses to take a photo, you can now just ask about something in front of you, and the glasses can use that question as a trigger to take a photo, as I already tried. The glasses will also recognize QR codes and open them on your phone, and make phone calls based on a phone number seen by the camera (if you ask). A reminder feature can be used to jog your memory later; the one that remembers where you've parked isn't something I've tried yet, but I'm curious how it works. It's the start of Meta's push into using future AR glasses as assistive memory devices. The update also lets the glasses record and send voice messages on Messenger and WhatsApp, but the improved music controls I tried at Connect aren't here yet. Neither is the live translation feature Mark Zuckerberg showed off on stage, nor the AI assistant feature that works while recording live video.

The future of Meta's AI-based camera features is clearly going to grow, and privacy questions will grow along with them. A group of students recently found a way to use the Ray-Bans to identify faces by relaying photos via Instagram to another AI tool, and with Meta opening up camera access on Quest headsets to developers next year, the future of advanced camera-based AI may start heading in surprising directions fast.
[3]
New Meta Ray-Ban AI features roll out, making the smart glasses even more tempting
The Meta Ray-Bans' host of new multimodal AI features announced last week will start rolling out today in software updates. At the Meta Connect event last week, Mark Zuckerberg showed off many new features on the company's flagship Meta Ray-Ban smart glasses. Calling the glasses "the perfect form factor for AI," Zuckerberg centered the new quality-of-life improvements on the glasses' multimodal AI for more natural interaction (similar to what we saw with Google's Gemini and OpenAI's GPT-4o). Many of the features that were announced will start to roll out today, including the glasses' much-hyped ability to "remember" things for you. In an Instagram post yesterday, Zuckerberg showed off this new feature while looking hopelessly lost in a parking garage. He asks the glasses to remind him where he parked, to which the AI cheerfully replies, "You parked in spot 304."

If you already have the glasses, you just need the latest version of the Meta View app (version 186) to start getting the new features in rolling updates. Once they go live, users can harness the multimodal AI for some promising new interactions, including live translation, the ability for the glasses to "see" and "remember" things like where you parked, phone numbers, or business hours, and QR code scanning. Here's a rundown of all the new features rolling out today.

Similar to other live translation technologies that have emerged this year, the Meta Ray-Bans will be getting a live translation feature designed to work in real time (or at least close to it) with Spanish, French, and Italian. During Meta Connect, Zuckerberg demonstrated a conversation with a Spanish speaker, and the glasses translated each speaker's words between Spanish and English within seconds. Of course, not every conversation will involve two people wearing smart glasses, so the company is letting users sync the output with the Meta companion app, leveraging the smartphone to display translations.

In addition to the glasses' new features, Meta also teased a new translation AI tool for Instagram Reels that automatically translates audio into English and then uses AI to sync the speaker's mouth movements to match the English translation. The result -- in the demo at least -- was a natural-looking video in English using the speaker's own voice. So far, this feature is in its early stages and only available in Spanish on Instagram and Facebook while Meta continues to test the technology.

Along with Zuckerberg's recent Instagram post, the demo also showed off the glasses' "photographic memory" by solving a problem we've all had: remembering where we parked. The user looked at the number on the parking spot and simply said, "Remember where I parked." Later, asking the glasses, "Hey Meta, where did I park?" prompted the AI to respond with the parking space number. This kind of "filing away" of knowledge on the fly is an example of what the AI is best at: recalling specific data in a pre-defined context. We'll have to test for ourselves how reliable the feature is with less visually obvious information.
Additional uses of this feature are easy to imagine, from grocery lists to event dates or phone numbers. Previously, you'd have to say "Hey Meta" to invoke the glasses' AI, then wait for the prompt before beginning your inquiry. Now, you just need to say it once to kick off the conversation, and you won't have to keep repeating it for follow-up questions. Also, since the multimodal AI is "always on," you won't need to specifically tell the glasses to "look at" a specific object. One demo showed a user peeling an avocado and asking, "What can I make with these?", without specifying what "these" referred to. Another showed a user searching through a closet, pulling out multiple items of clothing at once, and asking the AI to help style an outfit in real time. And as with other popular voice assistants, you can always interrupt Meta AI while conversing with it.

Along the same lines, the multimodal capabilities of the glasses extend beyond simply analyzing what's in view in a static sense. The glasses will recognize things like URLs, phone numbers you can call, and QR codes you can scan instantly.

Lastly, Zuckerberg demoed a clever new accessibility feature. Blind and vision-impaired people can use the glasses to broadcast what they see to a volunteer on the other end, who can talk them through the details of what they're looking at, building on Be My Eyes, an existing program that connects vision-impaired folks with virtual volunteers through live video. The demo showed a woman looking at a party invitation with dates and times, but real-world uses could be anything from reading signs to shopping for groceries to navigating a tech gadget.

Finally, Zuckerberg showed off some new designs, including a limited edition of the Ray-Bans with clear, transparent frames, as well as new transition lenses, effectively doubling their usability as both sunglasses and prescription glasses. The Meta Ray-Bans start at $300 and come in nine different frame designs plus the new limited-edition transparent style. The new updates are rolling out today for both Android and iOS users with the latest version of the Meta View app.
Meta's Ray-Ban smart glasses receive a significant AI update, introducing multimodal features that enhance user interaction and functionality, potentially revolutionizing the smart glasses market.
Meta has rolled out a significant update to its Ray-Ban smart glasses, introducing a suite of AI-powered features that promise to revolutionize the wearable tech landscape. The update, which began rolling out recently, brings multimodal AI capabilities to the glasses, enhancing user interaction and functionality [1][2][3].
The new update introduces several AI-driven capabilities:
Natural Language Interaction: Users can now interact with the glasses more naturally, without repeatedly using wake words like "Hey Meta" for follow-up questions [3].
Visual Recognition and Translation: The glasses can now identify objects in the user's field of view and provide information about them. Additionally, they offer real-time translation capabilities for Spanish, French, and Italian [1][3].
Memory Assistance: A new feature allows users to "remember" information, such as parking locations, by simply asking the glasses to store the information for later recall [2][3].
QR Code and Phone Number Recognition: The glasses can now recognize and process QR codes and phone numbers visible to the camera [2][3].
Improved Photo Capture: Users can now trigger photo capture more naturally by asking about objects in view, rather than using specific commands [2].
Industry experts believe that AI integration could be a game-changer for smart glasses. The infusion of large language models (LLMs) is expected to make voice assistants more capable of understanding natural language prompts and executing complex, multi-step commands [1].
Mark Zuckerberg, CEO of Meta, referred to smart glasses as "the perfect form factor for AI," highlighting the company's vision for the future of wearable technology [3].
As AI capabilities in smart glasses expand, privacy concerns are likely to grow. Recent incidents have already highlighted potential misuse, such as students using the glasses to identify faces through Instagram [2].
Meta plans to open up camera access on Quest headsets to developers next year, which could lead to rapid advancements in camera-based AI applications [2].
The update also introduces new accessibility features. For instance, visually impaired users can use the glasses to broadcast their view to volunteers who can provide real-time assistance [3].
Other practical applications include live translation during conversations, outfit styling assistance, and instant information retrieval about objects in view [1][3].
The Meta Ray-Ban smart glasses start at $300 and are available in nine different frame designs, including a new limited-edition transparent style. The AI features are being rolled out through software updates for both Android and iOS users [3].
As AI continues to evolve, it's clear that smart glasses like Meta's Ray-Bans are poised to become more than just a novelty, potentially offering a glimpse into the future of how we interact with technology in our daily lives.
Meta has announced a range of new AI-driven features for its Ray-Ban smart glasses, including live translation, multi-modal AI, and enhanced video capabilities. These updates aim to make the glasses more versatile and useful in everyday life.
16 Sources
Ray-Ban Meta smart glasses are outselling traditional Ray-Bans in many stores, with new AI features rolling out globally. The glasses' success has led to an extended partnership between Meta and EssilorLuxottica.
4 Sources
Meta rolls out significant AI-powered updates to Ray-Ban Meta Smart Glasses, including real-time visual AI, live translation, and Shazam integration, enhancing user experience and functionality.
22 Sources
Meta CEO Mark Zuckerberg envisions a future where AI-powered smart glasses become the primary personal computing device. He believes this transition could happen within the next few years, but challenges remain.
3 Sources
Meta's Ray-Ban smart glasses combine AI capabilities with stylish design, offering features like hands-free photography, AI assistance, and audio playback. While current models have limitations, future versions promise more advanced AR functionality.
4 Sources