5 Sources
[1]
Ray-Ban Meta Glasses Get Smarter for Low Vision Users. Here's How
If you haven't heard, AI now has eyes, and Meta has announced some enhancements for its Ray-Ban Meta glasses. You can now customize Meta AI to give detailed responses based on what's in your smart glasses' surrounding environment, Meta said in a blog post for Global Accessibility Awareness Day. Artificial intelligence is opening up a whole new world for accessibility, with new features arriving in droves. Tech giants like Google, Apple and Meta are putting a ton of effort into creating a world where people with disabilities, such as low or no vision, can more easily interact with the world around them. While Live AI for the Meta glasses has been around for a while, the additional enhancements for low vision users will undoubtedly be welcomed. Below are some of the other highlights from Meta's accessibility-focused blog post. While not an AI-focused feature, Call a Volunteer, a collaboration between Meta and Be My Eyes, will soon expand to all 18 countries where Meta AI is currently available. Launched in November 2024 in the US, Canada, UK, Ireland and Australia, the expanded Call a Volunteer will be a very handy (and hands-free) feature for low vision users. Once set up, a Meta glasses user can simply ask the AI to "Be My Eyes." From there, you'll be connected to one of over 8 million volunteers, who will be able to view the live camera stream from your glasses and provide real-time assistance with whatever you need. The feature will roll out to all supported countries later this month. Meta also detailed some of its existing features and ongoing research in its effort to expand accessibility across its products, especially in the extended reality space.
[2]
Meta Ray-Bans get two new accessibility features for low-vision users
The smart glasses can now help users call a volunteer for assistance and get an audio description of the world around them. Meta Ray-Bans already provide plenty of utility for people with low vision, but the glasses are about to get even better. Meta has announced two new accessibility advancements for its smart glasses designed to help users navigate the world around them. The first feature is available now: users can get descriptive responses about what their glasses see. Meta explained that this feature is useful for anyone, especially those with vision impairments. In a demonstration video, a user with a vision impairment asked her glasses what they saw. "I see a pathway leading to a grassy area with trees and a body of water in the distance," the glasses responded, describing tree height, giving more details about the pathway, and mentioning that the grassy area was "well manicured." Later in the video, another user asked the glasses what they saw and received a breakdown of food items on a kitchen countertop. A feature called Live AI already provides conversation support, but it is focused on answering follow-up questions and requests for additional information; this new feature better explains what's being seen. To start using it, head to your device settings in the Meta AI app and turn on "detailed responses" under Accessibility. The second accessibility feature arrives later this month. A new "Call a Volunteer" feature will let users place a call through their glasses to a service called Be My Eyes. The app will connect blind or low-vision users with sighted volunteers around the world for help with everyday tasks like choosing a shirt color or reading a label at the grocery store. Users will be able to sign up for the service through the app, and volunteers will see through the user's glasses to provide guidance.
[3]
Your Meta Ray-Bans just got two seriously helpful upgrades for free - how they work
As part of the new features, Meta said it partnered with Be My Eyes. Here's why that's a big deal.
[4]
Your Meta Ray-Bans just got two major accessibility upgrades for free - how they work
The smart glasses can now call a volunteer for assistance and get detailed audio descriptions about the world around them.
[5]
Latest Ray-Ban Meta Glasses Update Adds Descriptive Responses and Live AI
Besides descriptive responses, Meta is also rolling out Live AI, Live Translation, and Sign-speak. The Ray-Ban Metas are great for those who love capturing moments or asking Meta AI for help at the touch of a button. Meta recently expanded Meta AI to more countries, including India, alongside launching the glasses there. One of the underrated uses of the smart glasses is that they can serve as eyes for people with disabilities, and that aspect of the device just got better with Meta rolling out a few extremely useful features, including descriptive responses on the Ray-Ban Meta. Meta announced in a blog post that users can now customize Meta AI on the Ray-Ban smart glasses to be more descriptive. For those unaware, a popular use case of the glasses is that they can describe what's in front of you. While it may seem like a standard AI feature, it comes in handy for people with low vision or blindness: more descriptive AI replies can help them navigate their surroundings much better. The feature is rolling out to users in the US and Canada and will expand to other countries soon. Users can enable it in the Settings > Accessibility section of the Meta AI app. The Be My Eyes service is also expanding to more countries. Meta recently rolled out version 15 of the glasses' software, bringing features like Live AI to the US and Canada, as well as Live Translation to all countries where Meta AI is available. This comes alongside a more personalized Meta AI experience and expansion into more European countries, including Austria, Ireland and Sweden. In the post, Meta also showcased a new wristband device for better human-computer interaction and a Live Speech feature that converts spoken words into real-time text, much like Google's Live Captions. Lastly, there's Sign-speak, a new WhatsApp chatbot that can translate American Sign Language. What are your thoughts on Meta AI's descriptive responses coming to the Ray-Ban smart glasses? Let us know in the comments below.
Meta has introduced new AI-driven accessibility features for its Ray-Ban smart glasses, including detailed visual descriptions and volunteer assistance, aimed at helping users with low vision navigate their surroundings more effectively.
Meta has announced significant enhancements to its Ray-Ban smart glasses, focusing on improving accessibility for users with low vision. These new features, unveiled on Global Accessibility Awareness Day, leverage artificial intelligence to provide more detailed responses about the user's surroundings and offer volunteer assistance [1].
The first major update allows users to receive more descriptive responses about what their glasses see. This feature, available now, is particularly beneficial for those with vision impairments. Users can activate this function by enabling "detailed responses" under Accessibility in the Meta AI app settings [2].
In a demonstration, the glasses provided detailed descriptions of environments, such as:
"I see a pathway leading to a grassy area with trees and a body of water in the distance," followed by specifics about tree height, pathway details, and the condition of the grassy area [3].
The second major accessibility enhancement, set to launch later this month, is the "Call a Volunteer" feature. This function connects users with the Be My Eyes service, allowing them to place calls through their glasses to a network of over 8 million volunteers worldwide [1].
Volunteers can view the live camera stream from the user's glasses, providing real-time assistance for various tasks such as choosing clothing colors or reading labels at grocery stores [4].
Meta is expanding these services to all 18 countries where Meta AI is currently available. The Call a Volunteer feature, initially launched in November 2024 in select countries, will be accessible to all supported regions by the end of the month [1].
Meta has also introduced other features in its version 15 update for the smart glasses, including Live AI in the US and Canada, Live Translation in all countries where Meta AI is available, and a more personalized Meta AI experience in additional European countries [5].
Meta is working on additional accessibility tools as well, including a wristband device for better human-computer interaction, a Live Speech feature that converts spoken words into real-time text, and Sign-speak, a WhatsApp chatbot that translates American Sign Language [5].
These advancements in AI-powered accessibility features demonstrate Meta's commitment to creating more inclusive technologies, potentially opening new possibilities for individuals with visual impairments to interact with their environment more effectively.
Summarized by
Navi