3 Sources
[1]
Meta Is Upgrading One of the Worst Parts of Its Smart Glasses
Computer vision might be one of the defining features of Meta's smart glasses, but that doesn't mean it works like one. Having used Meta's Ray-Ban AI glasses at length, I can say with certainty that the ability to use the camera to identify things in your surroundings is very hit or miss: sometimes it gets things right, but just as often it falls flat. According to Meta, an upgrade to that tumultuous experience may be on the way thanks to a new AI model.

Muse Spark, which Meta officially unveiled this week, is what the company is calling the "first in a series" of large language models built by Meta's Superintelligence Labs. Though it's not out yet, Meta says it plans to integrate Muse Spark into quite a few of its products, including AI glasses, in the coming weeks. What exactly will it be doing there? Seeing stuff better, apparently. "When Meta AI powered by Muse Spark comes to our AI glasses, the assistant will be able to better see and understand the world around you," Meta says in a blog post. Many of Muse Spark's strengths, in fact, appear to center on "multimodal perception," which is shorthand for seeing things in your environment and being able to understand them.

According to Meta, one of Muse Spark's strengths is health. "Meta AI is now able to help you navigate health questions with more detailed responses, including some questions involving images and charts," according to the company. It also says that Muse Spark excels at "visual coding," which allows people to "create custom websites and mini-games straight from a prompt." While Meta doesn't say that either of those abilities is necessarily coming to AI glasses, it's worth noting that the company has leaned into health recently, expanding the nutrition coaching capabilities on its AI glasses. Computer vision and AI are still two of the core pillars of Meta's smart glasses, so I'm going to assume that any way Meta can squeeze Muse Spark into the equation, it'll try.
How much Muse Spark improves the experience of using Ray-Ban Meta AI glasses is still an open question, but if it makes computer vision less prone to mistakes and hallucinations, it could be a significant improvement. Now, if Meta could just spend a little time getting its abysmal privacy standards in order...
[2]
Meta's New 'Personal Superintelligence' AI Is Coming to Its Smart Glasses
Meta claims the model matches its last model's performance while using over 10x less computing power. Calling it a step toward "superintelligence," Meta announced it is releasing Muse Spark, an overhauled and improved AI. This "natively multimodal reasoning model" goes way beyond a chatbot, and it will soon live in your glasses and your social feeds. It's available now in the Meta AI app, with plans to roll out with a smart glasses update in the next few weeks.

Instead of a one-size-fits-all approach, there are three levels to Muse Spark's "thinking," and users will be able to control how deep the intelligence goes. Meta says Muse Spark's performance matches or exceeds its Llama 4 Maverick model while using over an order of magnitude less computing power. That means, theoretically, high-level reasoning without excessive server use.

While Muse Spark will be accessible in a variety of places, its ground-up integration of visual material seems made for smart glasses. Here are some of the ways Ray-Ban Meta and Oakley Meta users will be able to use the new AI.

One of Muse Spark's main improvements over Meta's previous model is the way the new AI integrates visual information across different tools. So, theoretically, you could point your glasses at a mess of wires and electronic boxes and ask, "How do I hook up this home theater system?" Or get step-by-step coaching on assembling a piece of IKEA furniture without opening the booklet. The AI would read the instructions and make sure you're not screwing anything in upside down.

Meta said its Superintelligence Labs collaborated with over 1,000 physicians to develop the AI's health reasoning capabilities. Users will be able to do things like generate an interactive display that unpacks the nutritional information of a food and maps out which muscles are activated during a workout. All of the above is "in theory."
Artificial intelligence hasn't always lived up to its hype, even when it's being hyped in front of a massive audience. It's one thing to perform well in laboratory benchmark tests, but how the tech works in the real world, where lighting is spotty, Wi-Fi is slow, and furniture instructions can be extremely complicated, is the real challenge. While I haven't dug deeply into the tech, I did give it a quick test by turning on "thinking" mode and sending Meta AI a picture of a random assortment of audio gear. It not only correctly identified everything in the picture, it gave me a couple of different options for possible ways to hook it all together, and told me (correctly) which cords I needed. So I look forward to having it on my glasses. If you want to test it yourself, Muse Spark is already running on meta.ai and in the Meta AI app, and smart glasses firmware and social media integrations are expected to follow shortly.
[3]
Meta Rolls Out Muse Spark AI Upgrade for Ray-Ban and Oakley Glasses
Meta introduced Muse Spark as the first model from its Superintelligence Labs. The company describes it as a "natively multimodal" system that can process different types of input in one model. In practical use, Meta says this should help its smart glasses better understand what users are looking at and respond with more accurate answers. The company said Muse Spark will roll out to Ray-Ban and Oakley smart glasses in the United States in the coming weeks. Meta also plans to add the model to Facebook, Messenger, and WhatsApp. The broader rollout shows that the company is tying its wearable devices more closely to its AI products across its apps and services.
Meta unveiled Muse Spark, the first model from its Superintelligence Labs, designed to dramatically improve computer vision on Ray-Ban and Oakley smart glasses. The natively multimodal AI uses over 10x less computing power than Meta's previous model while delivering better visual understanding. The upgrade rolls out to U.S. users in the coming weeks.
Meta AI is getting a significant upgrade with Muse Spark, the first model released by Meta's Superintelligence Labs, designed to address one of the most frustrating aspects of the company's smart glasses: unreliable computer vision capabilities [1]. The personal superintelligence AI represents a fundamental shift in how Meta smart glasses process and understand visual information, moving from inconsistent performance to what the company promises will be substantially more accurate environmental recognition [2].
The multimodal reasoning model will roll out to Ray-Ban and Oakley glasses in the United States within the coming weeks, alongside integration into Facebook, Messenger, and WhatsApp [3]. This broader deployment across Meta's AI ecosystem signals the company's strategy to tie its wearable technology more closely to its core social media apps.
What sets Muse Spark apart is its efficiency. Meta claims the natively multimodal AI matches the performance of its Llama 4 Maverick model while consuming over 10x less computing power [2]. This dramatic reduction in resource requirements could enable more sophisticated on-device processing without draining batteries or overwhelming servers. The model offers three levels of "thinking," allowing users to control how deeply the AI analyzes their context.

Meta's Superintelligence Labs collaborated with over 1,000 physicians to develop the AI's health reasoning capabilities [2]. Users wearing Ray-Ban and Oakley glasses will be able to generate interactive nutritional displays for food and receive detailed information about muscle activation during workouts. The company has already expanded nutrition coaching features on its AI glasses, and Muse Spark builds on this foundation.

The model also excels at visual coding, enabling users to create custom websites and mini-games directly from prompts [1]. While Meta hasn't explicitly confirmed that all of these capabilities will come to smart glasses, the natively multimodal design suggests visual features are central to the upgrade.
In practical testing, Muse Spark demonstrated impressive accuracy. When presented with an image of various audio equipment, the AI correctly identified each component, suggested multiple connection options, and specified the required cables [2]. This capability translates to real-world scenarios like receiving step-by-step guidance for assembling IKEA furniture or troubleshooting home theater wiring just by looking at the setup.
"When Meta AI powered by Muse Spark comes to our AI glasses, the assistant will be able to better see and understand the world around you," Meta stated in its blog post [1]. The company emphasizes that multimodal perception, seeing and comprehending elements of the environment, sits at the core of Muse Spark's strengths.

Whether Muse Spark can deliver on its promises in messy real-world conditions remains to be seen. Laboratory benchmark performance doesn't always translate to environments with poor lighting, slow Wi-Fi, or complex instructions [2]. The technology must prove it can reduce the hallucinations and errors that have plagued previous iterations.

Privacy also remains a concern that Meta has yet to adequately address [1]. As the AI upgrade for Ray-Ban glasses becomes more sophisticated at analyzing visual information, questions about data collection and user privacy will intensify. The integration across Facebook, Messenger, and WhatsApp means visual data from glasses could potentially flow through Meta's entire platform ecosystem.

For now, users can test Muse Spark through meta.ai and the Meta AI app, with smart glasses firmware updates expected shortly [2].
Summarized by Navi