Meta Enhances Ray-Ban Smart Glasses with AI-Powered Accessibility Features


Meta has introduced new AI-driven accessibility features for its Ray-Ban smart glasses, including detailed visual descriptions and volunteer assistance, aimed at helping users with low vision navigate their surroundings more effectively.


Meta has announced significant enhancements to its Ray-Ban smart glasses, focused on improving accessibility for users with low vision. The new features, unveiled on Global Accessibility Awareness Day, use artificial intelligence to provide more detailed descriptions of the user's surroundings and to offer volunteer assistance [1].

Enhanced Visual Descriptions with AI

The first major update gives users more descriptive responses about what their glasses see. The feature, available now, is particularly useful for people with vision impairments; it can be activated by enabling "detailed responses" under Accessibility in the Meta AI app settings [2].

In a demonstration, the glasses provided detailed descriptions of environments, such as:

"I see a pathway leading to a grassy area with trees and a body of water in the distance," followed by specifics about tree height, pathway details, and the condition of the grassy area [3].

"Call a Volunteer" Feature with Be My Eyes

The second major accessibility enhancement, set to launch later this month, is the "Call a Volunteer" feature. This function connects users with the Be My Eyes service, allowing them to place calls through their glasses to a network of over 8 million volunteers worldwide [1].

Volunteers see a live stream from the glasses' camera and can provide real-time assistance with tasks such as choosing clothing colors or reading labels at the grocery store [4].

Expansion of Services

Meta is expanding these services to all 18 countries where Meta AI is currently available. The Call a Volunteer feature, which launched in November 2024 in select countries, will reach all supported regions by the end of the month [1].

Additional Features and Updates

Meta has also introduced other features in its version 15 update for the smart glasses:

  1. Live AI: Provides conversation support and additional information.
  2. Live Translation: Now available in all countries where Meta AI is supported.
  3. Personalized Meta AI experience: Expanding to more European countries [5].

Future Developments

Meta is working on additional accessibility tools, including:

  1. A new wristband device for improved human-computer interactions.
  2. Live Speech: A feature that converts spoken words into text in real time.
  3. Sign-speak: A WhatsApp chatbot capable of translating American Sign Language [5].

These AI-powered accessibility features underscore Meta's push toward more inclusive technology and could give people with visual impairments new ways to interact with their surroundings.
