2 Sources
[1]
Meta Is Upgrading One of the Worst Parts of Its Smart Glasses
Computer vision might be one of the defining features of Meta's smart glasses, but that doesn't mean it works like one. Having used Meta's Ray-Ban AI glasses at length, I can say with certainty that the ability to use the camera for identifying stuff in your surroundings is very hit or miss: sometimes it gets things right, but just as often it falls flat. According to Meta, an upgrade to that tumultuous experience may be on the way thanks to a new AI model.

Muse Spark, which Meta officially unveiled this week, is what the company is calling the "first in a series" of large language models built by Meta's Superintelligence Labs. Though it's not out yet, Meta says it plans to integrate Muse Spark into quite a few of its products, including AI glasses, in the coming weeks. What exactly will it be doing there? Seeing stuff better, apparently. "When Meta AI powered by Muse Spark comes to our AI glasses, the assistant will be able to better see and understand the world around you," says Meta in a blog post. Many of Muse Spark's strengths, in fact, appear to be centered on "multimodal perception," which is shorthand for seeing stuff in your environment and being able to understand it.

According to Meta, one of Muse Spark's strengths is health. "Meta AI is now able to help you navigate health questions with more detailed responses, including some questions involving images and charts," according to the company. It also says that Muse Spark excels at "visual coding," which allows people to "create custom websites and mini-games straight from a prompt." While it doesn't say either of those abilities is necessarily coming to AI glasses, it's worth noting that Meta has leaned into health recently, expanding the nutrition coaching capabilities on its AI glasses.

Computer vision and AI are still two of the core pillars of Meta's smart glasses, so I'm going to assume that wherever Meta can squeeze Muse Spark into the equation, it'll try. How much impact Muse Spark has on the experience of using Ray-Ban Meta AI glasses is still an open question, but even if all it does is make computer vision less prone to mistakes and hallucinations, that could be a significant improvement. Now, if they could just spend a little time getting their abysmal privacy standards in order...
[2]
Meta's New 'Personal Superintelligence' AI Is Coming to Its Smart Glasses
Meta claims the model matches its last model's performance while using over 10x less computing power.

Calling it a step towards "superintelligence," Meta announced it is releasing Muse Spark, an overhauled and improved AI. This "natively multimodal reasoning model" goes way beyond a chatbot, and it will soon live in your glasses and your social feeds. It's available now in the Meta AI app, with plans to roll out with a smart glasses update in the next few weeks.

Instead of a one-size-fits-all approach, there are three levels to Muse Spark's "thinking," and users will be able to control how deep the intelligence goes. Meta says Muse Spark's performance compares to or exceeds its Llama 4 Maverick model while using over an order of magnitude less computing power. That means, theoretically, high-level reasoning without excessive server use.

While Muse Spark will be accessible in a variety of places, its ground-up integration of visual material seems made for smart glasses. Here are some of the ways Ray-Ban Meta and Oakley Meta users will be able to use the new AI. One of Muse Spark's main improvements over Meta's previous model is the way the new AI will integrate visual information across different tools. So, theoretically, you could point your glasses at a mess of wires and electronic boxes and ask, "How do I hook up this home theater system?" Or get step-by-step coaching on assembling a piece of IKEA furniture without opening the booklet. The AI would read the instructions and make sure you're not screwing anything in upside down.

Meta said its Meta Superintelligence Lab collaborated with over 1,000 physicians to develop the AI's health reasoning capabilities. Users will be able to do things like generate an interactive display that unpacks the nutritional information of a food and maps out what muscles are activated during a workout.

All of the above is "in theory." Artificial intelligence hasn't always lived up to its hype, even when it's being hyped in front of a massive audience. It's one thing to perform well in laboratory benchmark tests, but how the tech works in the real world, where lighting is spotty, Wi-Fi is slow, and furniture instructions can be extremely complicated, is the real challenge.

While I haven't dug deeply into the tech, I did give it a quick test by turning on "thinking" mode and sending Meta AI a picture of a random assortment of audio gear. It not only correctly identified everything in the picture, it gave me a couple of different options for possible ways to hook it together, and told me (correctly) what cords I needed. So I look forward to having it on my glasses. If you want to test it yourself, Muse Spark is already running on meta.ai and in the Meta AI app, and smart glasses firmware and social media integrations are expected to follow shortly.
Meta announced Muse Spark, a new AI model designed to dramatically improve the computer vision capabilities of its Ray-Ban AI glasses. The upgrade promises better visual understanding and health reasoning while using over 10x less computing power than previous models. The rollout begins in the Meta AI app, with smart glasses integration expected within weeks.
Meta has officially unveiled Muse Spark, calling it the first in a series of language models developed by the company's Meta Superintelligence Lab [1]. The new AI model directly addresses one of the most frustrating limitations of Meta's Ray-Ban AI glasses: inconsistent computer vision that often fails to accurately identify objects and scenes in real-world environments [1]. According to Meta, the upgrade will enable the assistant to "better see and understand the world around you" when integrated into smart glasses in the coming weeks [1].
What sets Muse Spark apart is its foundation as a "natively multimodal reasoning model" that processes visual information from the ground up, rather than treating it as an afterthought [2]. Meta claims the model matches the performance of its Llama 4 Maverick model while consuming over an order of magnitude less computing power, theoretically enabling high-level reasoning without excessive server demands [2]. This efficiency gain could prove critical for wearable devices, where battery life and response time matter.

The strength of Muse Spark lies in its multimodal perception capabilities, which allow it to process and understand visual information alongside text and voice commands [1]. Users wearing Ray-Ban Meta or Oakley Meta glasses could point their device at complex scenarios, such as a tangle of home theater wires and electronic boxes, and receive step-by-step guidance on proper setup [2]. The AI model can read instructions and provide real-time verification, ensuring users don't make assembly mistakes with furniture or equipment [2].
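To make the "natively multimodal" idea concrete, the sketch below shows roughly what such a request could look like: a single message carrying both an image and a question, rather than separate vision and text pipelines. Muse Spark has no public API yet, so the model id, message schema, and helper function here are hypothetical, modeled on common vision-language chat interfaces.

```python
# Hypothetical sketch only: Muse Spark exposes no public API, so the
# model id and message schema below are assumptions modeled on common
# vision-language chat interfaces, not Meta's actual interface.
import base64
from pathlib import Path

def build_multimodal_request(image_path: str, question: str) -> dict:
    """Bundle an image and a text question into one chat message,
    the shape a natively multimodal model consumes directly."""
    image_b64 = base64.b64encode(Path(image_path).read_bytes()).decode("ascii")
    return {
        "model": "muse-spark",  # assumed model id
        "messages": [{
            "role": "user",
            "content": [
                {"type": "image", "data": image_b64},
                {"type": "text", "text": question},
            ],
        }],
    }

# Example: the home theater scenario described above.
request = build_multimodal_request(
    "wires.jpg",  # a photo taken through the glasses' camera
    "How do I hook up this home theater system?",
)
```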
Source: Gizmodo
Meta AI powered by Muse Spark also introduces three levels of "thinking," giving users control over how deeply the intelligence processes their requests [2]. This tiered approach represents a shift from one-size-fits-all AI interactions toward more nuanced, context-appropriate responses.
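Meta hasn't said how the three tiers are implemented; in other reasoning models, a depth setting like this typically maps to an inference-time budget. The sketch below illustrates that common pattern under that assumption; the tier names and token numbers are invented for illustration.

```python
# Hypothetical sketch: Meta has not documented how the three "thinking"
# levels work; this models the common pattern of mapping a user-facing
# tier to an inference-time reasoning budget. All names and numbers
# here are illustrative, not Meta's.
from enum import Enum

class ThinkingLevel(Enum):
    LIGHT = "light"        # fastest replies, minimal reasoning
    STANDARD = "standard"  # balanced default
    DEEP = "deep"          # slowest, most thorough reasoning

# Assumed per-tier cap on hidden reasoning tokens.
REASONING_BUDGET = {
    ThinkingLevel.LIGHT: 256,
    ThinkingLevel.STANDARD: 1024,
    ThinkingLevel.DEEP: 4096,
}

def request_params(level: ThinkingLevel) -> dict:
    """Translate a user-selected tier into model request parameters."""
    return {
        "thinking": level.value,
        "max_reasoning_tokens": REASONING_BUDGET[level],
    }

print(request_params(ThinkingLevel.DEEP))
# {'thinking': 'deep', 'max_reasoning_tokens': 4096}
```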
The Meta Superintelligence Lab collaborated with over 1,000 physicians to develop Muse Spark's health reasoning capabilities [2]. This focus builds on Meta's recent expansion of nutrition coaching features in its AI glasses [1]. Users will be able to generate interactive displays that break down nutritional information about food and map which muscles activate during specific workouts [2]. Meta states that the assistant can now "help you navigate health questions with more detailed responses, including some questions involving images and charts" [1].

The visual coding capability represents another dimension of Muse Spark's versatility, allowing people to create custom websites and mini-games directly from prompts [1]. While Meta hasn't confirmed whether this specific feature will reach smart glasses, the company appears committed to integrating Muse Spark capabilities wherever feasible [1].
Early testing shows promise. When presented with a random assortment of audio gear, Muse Spark correctly identified all components, suggested multiple connection options, and specified the necessary cables [2]. However, the true measure will be performance in real-world conditions, where lighting varies, Wi-Fi connections fluctuate, and instructions grow complex [2]. Artificial intelligence has frequently fallen short of its hype, even during high-profile demonstrations [2].
If Muse Spark can reduce mistakes and hallucinations in computer vision applications, it could mark a significant improvement for Meta's wearable AI strategy [1]. The model is already available through meta.ai and the Meta AI app, with smart glasses firmware updates and social media integrations expected shortly [2]. As Meta positions this as a step toward what it calls personal superintelligence, questions about privacy standards for these increasingly capable devices remain unresolved [1].