2 Sources
[1]
This robotic dog talks with ChatGPT magic and guides the visually impaired
Researchers built a robotic guide dog that talks users through the journey

Robot dogs aren't a new innovation, but one that can talk back to you sounds like science fiction. Researchers at Binghamton University say they have already built one, and it is meant to help the blind. The team describes an AI-powered robotic guide dog system designed to help visually impaired users navigate indoor spaces while also communicating with them during the journey. The big twist is that it uses large language models (LLMs), specifically GPT-4, to make the robot more conversational and responsive than a traditional guide dog could be.

How does the AI guide dog work?

According to Binghamton University, the system was developed by Shiqi Zhang, an associate professor in the School of Computing, and his team. Zhang stated that the project shows how robotic guide dogs can go beyond the limits of actual guide dogs, which can only understand a small set of commands. Using GPT-4 with voice commands, the AI-powered robot dog gains much stronger conversational capabilities.

The setup isn't just about getting the user from one point to another. Before the trip even begins, the robot can describe the possible routes and estimated travel times. During the journey, it offers what the researchers call "scene verbalization," giving real-time spoken feedback about the environment and obstacles ahead. In one example shared in the report, the AI guide dog may say something like "this is a long corridor" while guiding the user to a conference room.

It's already being tested with blind participants

To evaluate the system, the researchers recruited seven legally blind participants and had them navigate a large, multi-room office environment. The participants then completed a questionnaire rating the system's helpfulness, usefulness, and ease of communication. And the results?
Users preferred the combined approach of route planning explanations along with live narration during travel. It isn't just about going from point A to point B -- it is about giving users more situational awareness and more control over how they move through a space. And just like how AI is being used to find pets, this is one of those positive headlines around AI.
[2]
These AI-Powered Guide Dogs Don't Just Lead - They Talk | Newswise
Newswise -- Guide dogs are powerful allies, leading the visually impaired safely to their destinations, but they can't talk with their owners -- until now. Using large language models, a team of researchers at Binghamton University, State University of New York has created a talking robot guide dog system that determines an ideal route and safely guides users to their destination, offering real-time feedback along the way.

"For this work, we're demonstrating an aspect of the robotic guide dog that is more advanced than biological guide dogs," said Shiqi Zhang, an associate professor at the Thomas J. Watson College of Engineering and Applied Science's School of Computing. "Real dogs can understand around 20 commands at best. But for robotic guide dogs, you can just put GPT-4 with voice commands. Then it has very strong language capabilities."

Zhang and his team had previously trained robot guide dogs to lead the visually impaired by responding to a tug on the leash. This new system takes their work a step further, creating a spoken back-and-forth between user and dog, and providing more control and situational awareness. The robot offers information about a route before departure (what the researchers call plan verbalization) and information during travel (scene verbalization).

"This is very important for visually impaired or blind people, because situational and scene awareness is relatively limited without vision," Zhang said.

To test the system, the team recruited seven legally blind participants to navigate a large, multi-room office environment. The robot would ask the user where they wanted to go (in this experiment, a conference room) and then present possible routes to the room and the time each would take. Once the user selected a preferred route, the robot would guide them to the conference room, verbalizing the surroundings and obstacles along the way (such as "this is a long corridor") until it reached the destination.
Following the test, the users completed a questionnaire about their experience, rating the system's helpfulness, ease of communication, and usefulness. Overall, a combined approach -- which included planning explanations and real-time narration from the robot -- was preferred among participants. A simulated study of the system also showed that this approach was successful.

Going forward, the team plans to conduct more user studies, increase the system's autonomy, and have the robots navigate longer distances, both indoors and outdoors. The goal of this research is to help integrate robotic guide dogs into everyday life. The study participants were enthusiastic about this possibility. "They were super excited about the technology, about the robots," Zhang said. "They asked many questions. They really see the potential for the technology and hope to see this working."

The paper, "From Woofs to Words: Towards Intelligent Robotic Guide Dogs with Verbal Communication," was presented at the 40th Annual AAAI Conference on Artificial Intelligence, one of the largest academic AI conferences in history.
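The interaction flow the articles describe -- plan verbalization before departure, a user route choice, then scene verbalization during travel -- can be sketched roughly as follows. This is a minimal illustrative sketch, not the researchers' actual code: all class, function, and route names here are hypothetical, and the real system pairs this flow with GPT-4 voice interaction and a physical robot.

```python
# Illustrative sketch of the plan/scene verbalization flow described
# in the article. All names (GuideDogRobot, Route, etc.) are assumptions.
from dataclasses import dataclass, field


@dataclass
class Route:
    name: str
    waypoints: list = field(default_factory=list)  # scenes passed en route
    est_minutes: float = 0.0


class GuideDogRobot:
    def __init__(self, routes):
        self.routes = routes

    def plan_verbalization(self):
        # Before departure: describe each candidate route and its travel time,
        # so the user can make an informed choice.
        return [f"Route {r.name}: about {r.est_minutes:.0f} minutes"
                for r in self.routes]

    def scene_verbalization(self, route):
        # During travel: narrate each scene as it is reached
        # (e.g. "this is a long corridor").
        for scene in route.waypoints:
            yield f"Now passing: {scene}"


# Hypothetical routes to a conference room, as in the experiment.
routes = [
    Route("A", ["long corridor", "double doors", "conference room"], 3),
    Route("B", ["elevator lobby", "conference room"], 5),
]
robot = GuideDogRobot(routes)
plan = robot.plan_verbalization()                        # spoken before the trip
narration = list(robot.scene_verbalization(routes[0]))   # spoken during travel
```

In the actual system, the narration strings would be generated by GPT-4 from the robot's perception of the environment rather than drawn from a fixed waypoint list.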
Researchers at Binghamton University developed an AI-powered robotic guide dog system that uses GPT-4 to communicate with visually impaired users. The robot provides route planning and real-time scene verbalization during navigation. Seven legally blind participants tested the system in a multi-room office environment, with results showing strong preference for the combined approach of planning explanations and live narration.
Researchers at Binghamton University have developed an AI-powered robotic guide dog system that goes beyond traditional guide dogs by talking to users throughout their journey [1][2]. The system was created by Shiqi Zhang, an associate professor at the Thomas J. Watson College of Engineering and Applied Science's School of Computing, along with his team. By integrating large language models, specifically GPT-4, the robotic guide dog can assist visually impaired users through verbal communication, offering capabilities that biological guide dogs cannot match [2].
Source: Newswise
While traditional guide dogs can understand around 20 commands at best, this AI system leverages GPT-4 with voice commands to deliver much stronger conversational capabilities [2]. The technology represents a significant step forward in how robotic systems can enhance situational awareness for those navigating without vision.

The system operates through two key features: plan verbalization and real-time scene verbalization [2]. Before the journey begins, the robotic guide dog asks users where they want to go and then describes possible routes along with estimated travel times [1]. This route planning capability gives users more control over their navigation choices, allowing them to make informed decisions about their path.

During travel, the robot provides real-time feedback about the environment and obstacles ahead through spoken interaction [1]. For example, the AI guide dog might say "this is a long corridor" while guiding a user to a conference room [1]. Zhang emphasized that this feature is particularly important for the visually impaired because "situational and scene awareness is relatively limited without vision" [2].

To evaluate the system's effectiveness, researchers recruited seven legally blind participants to navigate a large, multi-room office environment [1][2]. Participants completed a questionnaire rating the system's helpfulness, usefulness, and ease of communication. The results showed that users preferred the combined approach of route planning explanation along with live narration during travel [1]. A simulated study also demonstrated the approach's success [2].

Zhang noted that participants were enthusiastic about the technology's potential. "They were super excited about the technology, about the robots," he said. "They asked many questions. They really see the potential for the technology and hope to see this working" [2].

The research team has ambitious plans to expand the system's capabilities. Going forward, they aim to conduct more user studies, increase the system's autonomy, and have the robots navigate longer distances, both indoors and outdoors [2]. The ultimate goal is to integrate robotic guide dogs into everyday life for the visually impaired.

The research paper, titled "From Woofs to Words: Towards Intelligent Robotic Guide Dogs with Verbal Communication," was presented at the 40th Annual AAAI Conference on Artificial Intelligence, one of the largest academic AI conferences in history [2]. As the team continues to refine the technology, the focus remains on giving users more situational awareness and control over how they move through spaces, demonstrating how AI can address real accessibility challenges.