10 Sources
[1]
Meta's AI glasses can now help you hear conversations better | TechCrunch
Meta announced on Tuesday an update to its AI glasses that will allow you to better hear people talking when you're in a noisy environment. The feature will initially become available on Ray-Ban Meta and Oakley Meta HSTN smartglasses in the U.S. and Canada, the company says. In addition, the glasses are getting another update that lets you use Spotify to play a song that matches what's in your current view. For instance, if you're looking at an album cover, the glasses could play a song by that artist. Or if you're looking at your Christmas tree with a pile of gifts, you could play holiday music. This addition is more of a gimmick, of course, but it demonstrates how Meta is thinking about connecting what people see with actions they can take in their apps. The conversation-focus feature, meanwhile, seems more practical. First announced at Meta's Connect conference earlier this year, the feature uses the AI glasses' open-ear speakers to amplify the voice of the person you're talking to. Meta says smartglasses wearers will also be able to adjust the amplification level by swiping the right temple of their glasses, or via the device settings. This will allow them to set the level more precisely to match their current environment, whether that's a busy restaurant or bar, club, commuter train, or anything else. How well the feature works, of course, will still need to be tested. However, the idea of using smart accessories as tools to help with hearing isn't limited to Meta. Apple's AirPods already offer a Conversation Boost feature designed to help you focus on the person you're talking to, and the Pro models more recently added support for a clinical-grade Hearing Aid feature as well. While the conversation-focus feature is limited to the U.S. 
and Canada, the Spotify feature is offered in English in a larger number of markets, including Australia, Austria, Belgium, Brazil, Canada, Denmark, Finland, France, Germany, India, Ireland, Italy, Mexico, Norway, Spain, Sweden, the United Arab Emirates, the U.K., and the U.S. The software update (v21) will first become available to those who are enrolled in Meta's Early Access Program, which requires first joining a waitlist and being approved. It will later roll out more broadly.
[2]
Meta's Glasses Can Focus in on Your Conversations With a New Update
Nearly 20 years writing about tech, and over a decade reviewing wearable tech, VR, and AR products and apps.
Sorry, can you say that again? Meta's smart glasses are introducing a new hearing-assist function called Conversation Focus that, in noisy places, could make it easier to hear whoever's talking to you. It's part of a year-end update to Meta's smart glasses lineup that was announced today. Conversation Focus was promised in September at Meta's glasses-focused developer conference, and I heard it demoed there, too. The feature works by using the glasses' beam-forming microphones to directionally focus on whoever's in front of you, then filtering out other noise. The idea is similar in theory to what an FDA-approved pair of hearing assistance glasses made by Nuance Audio does. The update should be arriving for Meta's Ray-Ban and Oakley HSTN smart glasses, but it's available as an Early Access feature you have to opt into first. The V21 update also adds a new camera-enabled AI feature with Spotify that launches playlists inspired by whatever you're looking at; Google has a similar feature coming for its glasses next year. Meta is also adding voice-command shortcuts and extra programmable workouts using Garmin watches with its Oakley Vanguard glasses. Meta's rolling firmware updates sometimes take time to reach everyone, though, so there's a chance you may not see it today at all. I'm curious to give Conversation Focus a go soon, if not while wandering around the crowds at CES.
[3]
Meta rolls out AI glasses update with conversation boosting feature
Back at Meta Connect, Meta teased that its AI glasses would soon support a conversation boosting feature to help folks hear better in noisy environments. That feature is rolling out starting today for Ray-Ban Meta and Oakley Meta HSTN glasses owners who've opted into the company's Early Access Program. The feature, dubbed Conversation Focus, is straightforward. Using the glasses' directional mics, it amplifies the voice of the person you're speaking to. You can also adjust it by swiping the right arm of the glasses or in device settings. This isn't overtly an accessibility feature, but it's a nice update for folks who use these glasses primarily as headphones. Also rolling out is a new Spotify integration with Meta AI... so you can have Meta AI play songs based on what you're looking at. The company's blog ties this in with the holiday season -- as in you can prompt Meta to play festive tunes if you're looking at a Christmas tree by saying something like "Hey Meta, start a playlist that matches this environment." (Or, you could just ask it to play a holiday playlist.) It might sound silly, but to be fair, Google also recently demoed a similar feature for me at a sneak peek of Project Aura and Android XR prototype glasses.
[4]
Meta is rolling out Conversation Focus and AI-powered Spotify features to its smart glasses
Back in September during Meta Connect, the company previewed a new ability for its smart glasses lineup called Conversation Focus. The feature, which is able to amplify the voices of people around you, is now starting to roll out in the company's latest batch of software updates. When enabled, the feature is meant to make it easier to hear the people you're speaking with in a crowded or otherwise noisy environment. "You'll hear the amplified voice sound slightly brighter, which will help you distinguish the conversation from ambient background noise," Meta explains. It can be enabled either via voice commands ("hey Meta, start Conversation Focus") or by adding it as a dedicated "tap-and-hold" shortcut. Meta is also adding a new multimodal AI feature for Spotify. With the update, users can ask their glasses to play music on Spotify that corresponds with what they're looking at by saying "hey Meta, play a song to match this view." Spotify will then start a playlist "based on your unique taste, customized for that specific moment." For example, looking at holiday decor might trigger a similarly-themed playlist, though it's not clear how Meta and Spotify may translate more abstract concepts into themed playlists. Both updates are starting to roll out now to Meta Ray-Ban glasses (both Gen 1 and Gen 2 models), as well as the Oakley Meta HSTN frames. The update will arrive first to those enrolled in Meta's early access program, and will be available "gradually" to everyone else. Meta's newest mode of smart glasses, the Oakley Meta Vanguard shades, are also getting some new features in the latest software update. Meta is adding the option to trigger specific commands with a single word, rather than having to say "hey Meta." For example, saying "photo" will be enough to snap a picture and saying "video" will start a new recording. The company says the optional feature is meant to help athletes "save some breath" while on a run or bike ride.
[5]
Meta Ushers Spotify Integration, Kannada & Telugu Support, Noise Filtration to AI Glasses | AIM
Meta's v21 update adds voice amplification, multimodal music via Spotify, and Telugu and Kannada support, deepening its AI glasses push in India. Meta has rolled out a new software update for its AI-powered smart glasses, improving audio quality through noise filtration and enhancing the audio streaming experience with Spotify using multimodal AI. The v21 update introduces "Conversation Focus," a feature that amplifies a speaker's voice in noisy environments. First announced at Meta Connect earlier this year, the feature uses open-ear speakers built into Ray-Ban Meta and Oakley Meta HSTN glasses to enhance speech clarity. The system selectively amplifies the voice of the person a user is speaking with, helping distinguish conversation from background noise in places such as busy restaurants, trains, or crowded events. Users can adjust amplification levels directly from the glasses or through device settings, depending on their surroundings. The update also introduced Meta's first multimodal AI music experience in partnership with Spotify. By combining on-device vision with Spotify's personalisation engine, users can ask Meta AI to play music that matches what they are looking at, blending visual context with individual listening preferences to create moment-specific soundtracks. In a move that strengthens its India-focused AI strategy, Meta has added Telugu and Kannada language support to Ray-Ban Meta and Oakley Meta HSTN glasses. The rollout enables fully hands-free interaction with Meta AI in two additional regional languages, making the devices more accessible and natural to use for millions of users across the country. With this addition, Meta AI's multilingual footprint in India now extends beyond English and Hindi, reflecting a broader push to localise AI-powered wearables for diverse linguistic communities. The new features are rolling out gradually, starting with users enrolled in Meta's Early Access Programme, with wider availability expected over time.
[6]
Meta AI Glasses Get Two Sweet New Features
If you didn't hear about this, the AI glasses amplify the voice of the person you're talking to, helping you distinguish the conversation from other noise around you. You can adjust the volume of the conversation by swiping the right temple of the glasses or from device settings. Also hitting the eyewear via the Early Access Program is a new Spotify feature. Once enabled, you can say, "Hey Meta, play a song to match this view." Using computer vision coupled with Spotify, the music streaming service will create a playlist based on your taste and whatever you're looking at. If you're looking at an album cover while shopping or staring down holiday scenery, Spotify is sure to have a playlist for you. Both of these features are rolling out now inside the Early Access Program, so if you haven't already signed up, you can do so now.
[7]
Meta glasses now play Spotify songs based on what you see
Meta announced an update to its AI glasses on Tuesday that enhances conversation hearing in noisy environments and adds a Spotify integration matching songs to visual views. The conversation feature launches initially on Ray-Ban Meta and Oakley Meta HSTN smart glasses in the U.S. and Canada, while the Spotify function expands to additional markets. The conversation-focus feature employs the glasses' open-ear speakers to amplify the voice of the person the wearer is speaking with. This capability targets situations where background noise interferes with clear communication. Meta first introduced the feature at its Connect conference earlier this year. Wearers activate and control it through specific interactions with the device. Users adjust the amplification level by swiping the right temple of the glasses or accessing the device settings directly. This flexibility enables precise calibration suited to varying surroundings. Examples of applicable settings include a busy restaurant filled with chatter and clinking dishes, a crowded bar with overlapping voices, a pulsating club environment with loud music, or a commuter train rumbling with announcements and passenger murmurs. Such adjustments ensure the amplified voice stands out against ambient sounds in these specific scenarios. The performance of this conversation-focus feature requires real-world testing to determine its reliability across diverse conditions. Comparable technologies appear in other devices. Apple's AirPods include a Conversation Boost function that prioritizes the speaker in front of the user. AirPods Pro models incorporate a clinical-grade Hearing Aid capability, providing structured auditory assistance approved for medical use. A separate update integrates Spotify functionality, allowing the glasses to play a song aligned with the wearer's current view. For example, viewing an album cover triggers playback of a track by the associated artist. 
Similarly, looking at a Christmas tree surrounded by gifts prompts a holiday music selection. This connection links visual input captured by the glasses directly to audio actions within the Spotify application. The Spotify feature operates in English across an extensive list of markets. The software update, designated as version 21, deploys initially to participants in Meta's Early Access Program. Enrollment requires joining a waitlist followed by company approval. Subsequent phases extend availability to the wider base of users with compatible Ray-Ban Meta or Oakley Meta HSTN glasses.
[8]
Meta AI Glasses Learn To Focus, Listen, And DJ Your Life - Meta Platforms (NASDAQ:META)
Meta Platforms Inc. (NASDAQ:META) is accelerating its push into artificial intelligence-powered wearables, updating its smart glasses and adjusting its broader hardware strategy to focus on smarter, more premium devices. Clearer Conversations With AI Glasses: Meta is rolling out its v21 software update for Ray-Ban Meta and Oakley Meta AI glasses, adding features designed to improve everyday use, including clearer conversations in noisy places and a new Spotify Technology SA (NYSE:SPOT)-powered music experience. The update introduces Conversation Focus, now available through Meta's Early Access Program in the U.S. and Canada, which amplifies the voice of the person you're speaking with while reducing background noise in settings like busy restaurants, trains, or live events. Users can adjust voice amplification by swiping the glasses' right temple or through device settings. Spotify-Powered Music That Matches the Moment: Meta is also launching its first multimodal AI music feature in partnership with Spotify. By combining visual recognition with Spotify's personalization, users can ask Meta AI to play a song that matches what they're looking at, such as an album cover or a holiday scene, creating a moment-specific playlist hands-free. AI Glasses Take Priority As Hardware Strategy Shifts: The software push coincides with Meta's plans to ramp up production of Ray-Ban Meta smart glasses, with annual production likely to reach 10 million units by the end of 2026. Meta stock has gained just over 12% year-to-date amid investor concerns over its AI and metaverse spending, with forecasts for higher capital expenditures for AI infrastructure overshadowing strong core business results. Growing Reality Labs' losses added to downward pressure.
Reportedly, Meta plans to shift its virtual reality hardware toward higher-priced, more premium devices as it cuts back on metaverse spending and redirects resources toward AI-powered glasses, while competing with Apple Inc. (NASDAQ:AAPL) and Sony Group Corp (NYSE:SONY) in the high-end wearable market. Rising tariffs are pushing up costs, prompting Meta to slow new hardware launches and focus on better software, even as Quest sales soften and projects like its Phoenix mixed-reality glasses face delays. At the same time, Meta is reorganizing its AI strategy and leadership to counter growing competition from Apple's premium devices and Sony's PlayStation VR lineup. META Price Action: Meta Platforms shares were up 0.28% at $658.99 during premarket trading on Wednesday, according to Benzinga Pro data.
[9]
Meta AI glasses get Conversation Focus, Spotify integration, and regional language support
Meta has released the v21 software update for its Ray-Ban Meta and Oakley Meta HSTN AI glasses, introducing features for improved audio, music interaction, and regional language support. The update introduces Conversation Focus, which amplifies the voice of the person a user is speaking with in noisy surroundings. The feature works through the open-ear speakers on the glasses, allowing users to distinguish conversation from background noise. The amplification level can be adjusted either by swiping the right temple or via device settings. This feature is suitable for environments such as busy restaurants, trains, or crowded events. v21 also adds a multimodal music feature in collaboration with Spotify. Users can request, "Hey Meta, play a song to match this view," and the glasses will suggest music based on what is being observed, such as an album cover or a scene. The system combines computer vision with Spotify's personalization engine to generate playlists tailored to the specific moment. Meta has expanded regional language support on its AI glasses to include Telugu and Kannada, adding to the previously available English and Hindi options. This update enhances multilingual access for users in India. The rollout follows the launch of Ray-Ban Meta Gen 2 and Oakley HSTN AI glasses in India, offering users options across lifestyle and performance categories. Meta says additional features and broader availability will roll out gradually.
[10]
Meta smart glasses get Conversation Focus and AI-powered Spotify controls
The update is available on Ray-Ban Meta and Oakley Meta HSTN smart glasses. Meta has announced a new software update for its AI-powered smart glasses, adding features aimed at improving everyday usability. The v21 update introduces Conversation Focus, designed to make voices easier to hear in noisy environments, and a new AI-driven Spotify feature that plays music based on what the wearer sees. The update is rolling out gradually, starting today, to users enrolled in Meta's Early Access Program. The v21 software update focuses on audio clarity and contextual AI use. Conversation Focus, first previewed at Meta Connect earlier this year, is now rolling out to Early Access users of Ray-Ban Meta and Oakley Meta HSTN smart glasses in the US and Canada. The feature uses the glasses' open-ear speakers to amplify the voice of the person you are speaking to while reducing the impact of surrounding background noise. The amplified voice sounds slightly louder, making it easier to follow conversations in places like restaurants, public transport, or crowded events. Users can adjust the level of amplification by swiping the right temple of the glasses or through device settings. Meta is also introducing a new multimodal music experience in partnership with Spotify. Using Meta AI, wearers can ask the glasses to play a song that matches what they are viewing by saying, 'Hey Meta, play a song to match this view.' The feature combines on-device computer vision with Spotify's recommendation system to generate music tailored to the scene and the user's listening preferences. This feature is available in English across multiple markets, including India, the UK, the US, and several parts of Europe, the Middle East, and Latin America. Software updates are being released in phases, with Early Access members receiving the features first.
Users who are not enrolled can join the Early Access waitlist through Meta. Meta has been steadily expanding the capabilities of its smart glasses through software rather than new hardware. Earlier updates have focused on voice assistance, visual recognition, and hands-free capture. Conversation Focus builds on Meta's push to make smart glasses more practical in real-world settings, while the Spotify integration highlights the company's broader strategy around multimodal AI, where voice, vision, and context work together. For users already invested in Meta's smart glasses ecosystem, the v21 update adds meaningful quality-of-life improvements without requiring a hardware upgrade. For prospective buyers, it reinforces Meta's approach of improving value over time through regular software updates rather than rapid model refreshes.
Meta rolled out its v21 software update for AI glasses, introducing Conversation Focus to help users hear conversations better in crowded spaces. The update also brings AI-powered Spotify integration that plays music matching your view, plus Telugu and Kannada language support for users in India.
Meta has launched a significant software update for its AI glasses lineup, bringing the long-awaited Conversation Focus feature to Ray-Ban Meta smartglasses and Oakley Meta HSTN frames. First previewed at Meta Connect in September, Conversation Focus uses beam-forming microphones to amplify voices in noisy environments, making it easier to hear the person you're speaking with in busy restaurants, bars, clubs, or commuter trains [1][3]. The feature works by directionally focusing on whoever is in front of you while filtering out background noise, creating a slightly brighter sound quality that helps distinguish conversation from ambient sounds [4].
Source: TechCrunch
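None of the sources detail Meta's actual algorithm, but the beam-forming idea they describe can be illustrated with its simplest form, delay-and-sum: a voice arriving from straight ahead reaches both microphones in phase and adds coherently, while diffuse background noise is uncorrelated between the mics and partially cancels. A minimal NumPy sketch of that principle (all signals and numbers here are invented for illustration, not Meta's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 16_000                      # sample rate (Hz)
t = np.arange(fs) / fs           # 1 second of audio

# "Voice" arriving from straight ahead: identical at both mics.
voice = np.sin(2 * np.pi * 220 * t)

# Diffuse background noise: uncorrelated between the two mics.
noise_l = rng.normal(0.0, 1.0, fs)
noise_r = rng.normal(0.0, 1.0, fs)

mic_l = voice + noise_l
mic_r = voice + noise_r

def snr_db(mix, target):
    """Signal-to-noise ratio of `target` inside `mix`, in dB."""
    residual = mix - target
    return 10 * np.log10(np.mean(target**2) / np.mean(residual**2))

# Delay-and-sum steered straight ahead (zero delay): the coherent
# voice adds in amplitude while the uncorrelated noise adds only
# in power, so the average improves the SNR.
beam = 0.5 * (mic_l + mic_r)

print(f"single mic SNR: {snr_db(mic_l, voice):.1f} dB")
print(f"beamformed SNR: {snr_db(beam, voice):.1f} dB")
```

Real products steer the beam with per-microphone delays and adaptive filtering, but even this naive two-microphone average buys roughly 3 dB of signal-to-noise ratio toward the person in front of the wearer.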
The noise filtration and voice amplification technology leverages the open-ear speakers already built into Meta's AI glasses to enhance speech clarity. Users can activate the feature through voice commands by saying "hey Meta, start Conversation Focus" or by adding it as a dedicated tap-and-hold shortcut [4]. What makes this particularly practical is the ability to adjust amplification levels by swiping the right temple of the glasses or through device settings, allowing wearers to fine-tune the experience based on their specific environment [1]. While not explicitly marketed as an accessibility feature, the hearing-assistance functionality positions Meta's wearables alongside similar offerings from Apple, whose AirPods include Conversation Boost and, in the Pro models, a clinical-grade Hearing Aid feature [1].
Source: CNET
The v21 software update also introduces Meta's first multimodal AI music experience through an AI-powered Spotify integration. This new feature allows users to ask Meta AI to play songs that match what they're looking at, combining on-device vision with Spotify's personalization engine [5]. For example, looking at an album cover could trigger music by that artist, while viewing a Christmas tree with gifts might start a holiday playlist [1]. Users simply say "hey Meta, play a song to match this view" and Spotify generates a playlist based on their unique taste, customized for that specific moment [4]. Google has announced similar functionality coming to its Project Aura glasses next year, suggesting this represents an emerging trend in AI-powered wearables [3].
Source: Benzinga
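Neither Meta nor Spotify has published how the "match this view" pipeline works; conceptually it is a two-stage flow in which a vision model labels the scene and a recommender blends that label with the user's taste profile. A deliberately toy sketch of that flow, with every name (`SCENE_TO_SEEDS`, `match_view_to_playlist`) invented for illustration and no relation to any real Meta or Spotify API:

```python
# Hypothetical mapping from a vision model's scene label to music
# seed tags. A real system would use learned embeddings, not a dict.
SCENE_TO_SEEDS = {
    "christmas_tree": ["holiday", "festive"],
    "album_cover":    ["artist_lookup"],
    "beach":          ["chill", "summer"],
}

def match_view_to_playlist(scene_label: str, user_taste: list[str]) -> list[str]:
    """Blend scene-derived seeds with the user's taste profile.

    A production recommender would feed these seeds into a
    personalization engine; here we just concatenate the two
    signals to show the shape of the flow.
    """
    seeds = SCENE_TO_SEEDS.get(scene_label, ["ambient"])
    return seeds + user_taste[:2]

print(match_view_to_playlist("christmas_tree", ["indie_pop", "jazz", "rock"]))
# → ['holiday', 'festive', 'indie_pop', 'jazz']
```

The interesting design point is that the scene label only seeds the request; the final playlist still leans on the listener's history, which matches the sources' description of results "based on your unique taste, customized for that specific moment."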
In a strategic move to deepen its presence in the Indian market, Meta added Kannada and Telugu support to its AI glasses, enabling fully hands-free interaction with Meta AI in these regional languages [5]. This expansion beyond English and Hindi makes the devices more accessible to millions of users across diverse linguistic communities in India. The update also brings new AI-powered features to the Oakley Meta Vanguard model, including single-word voice commands that let athletes say "photo" or "video" to capture content without the full "hey Meta" prompt, plus programmable workouts using Garmin watches [4][2]. The software update is initially rolling out to users enrolled in Meta's Early Access Program, which requires joining a waitlist and approval, before becoming available more broadly [1]. While Conversation Focus is limited to the U.S. and Canada, the Spotify feature is available in English across 19 markets including Australia, Austria, Belgium, Brazil, Canada, Denmark, Finland, France, Germany, India, Ireland, Italy, Mexico, Norway, Spain, Sweden, the United Arab Emirates, the U.K., and the U.S. [1]. The update affects both Gen 1 and Gen 2 Ray-Ban Meta models as well as Oakley Meta HSTN frames [4]. As competition intensifies with Apple and Google both developing hearing enhancement features for their respective wearables, Meta's move signals how smart glasses are evolving from novelty items into practical tools that address real-world challenges in audio clarity and contextual computing.
Summarized by Navi