4 Sources
[1]
Speak! Robot guide dogs converse with their owners
The robotic dog used in the study was a Unitree Go2 model
Jonathan Cohen

Since the early 1900s, dogs have helped people who are blind or have low vision to navigate their world. Now, in a very 21st century twist, seeing-eye dogs have gone robotic and added a skill that not even the most well-trained canine could pull off: conversation.

Seeing-eye dogs are undoubtedly one of the clearest examples of human-canine bonding. Not only do they help keep their owners safe, but they also provide comfort and companionship to people who can often feel isolated. Yet these clever canines take a long time to train, with only 50-60% graduating from the programs that make them fit to work with people who are blind or have low vision. That makes them expensive, with costs ranging between US$20,000 and $50,000. As a result, only about 2-5% of the blind community are able to have a seeing-eye dog.

These facts led Shiqi Zhang, an associate professor at Binghamton University, to investigate an alternative. In 2022, he and his students went trick-or-treating with a quadruped robotic dog. In 2023, he decided to give that dog a more important role and trained it to respond to leash tugs so it could work more like a guide dog. Now, Zhang and his team have gone one step further and equipped a Unitree Go2 robotic dog with the large language model GPT-4 so it can question and respond to cues from the user and the environment.

"For this work, we're demonstrating an aspect of the robotic guide dog that is more advanced than biological guide dogs," said Zhang. "Real dogs can understand around 20 commands at best. But for robotic guide dogs, you can just put GPT-4 with voice commands. Then it has very strong language capabilities."

To test the robot dogs, Zhang's team recruited seven legally blind participants who were asked to navigate a large multi-room indoor environment. The bot first asked each participant where they wanted to go, and then, as it was guiding them there, provided clues about the environment such as: "this is a long corridor" or "you're passing by the main lobby, which is an open area with seating and information desks."

Based on questionnaire data collected at the end of each test, the participants indicated that they preferred the combination of verbal and physical guidance through the environment rather than just being pulled along. However, the participants did give the guide dog slightly lower marks for perceived safety, which the researchers say is likely due to the unfamiliarity of walking alongside a robot. That didn't dampen their enthusiasm for the bots though, says Zhang.

"They were super excited about the technology, about the robots," he said. "They asked many questions. They really see the potential for the technology and hope to see this working."

In additional testing, the team had GPT-4 use natural language commands to run the dog through 77 different navigation scenarios, each of which it was able to complete successfully. Now the researchers plan to carry out more studies in which the bots will navigate longer distances both indoors and out. They will also be working on increasing the autonomy of the system.

The paper describing the research was presented in January at the 40th Annual AAAI Conference on Artificial Intelligence in Singapore.
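To make the voice-command capability concrete, here is a minimal, purely illustrative Python sketch of how a spoken request might be mapped to a destination the robot already knows. The place names, prompt wording, and the call_llm stub are assumptions for illustration; the team's actual system uses GPT-4, and this is not their code.

# Hypothetical sketch: mapping a voice command to a known destination via an LLM.
KNOWN_PLACES = ["conference room", "main lobby", "water fountain", "elevator"]

def call_llm(prompt: str) -> str:
    # Placeholder for a GPT-4 call; returns a canned answer so the sketch runs.
    return "conference room"

def destination_from_speech(utterance: str) -> str | None:
    prompt = (
        "You are a robotic guide dog. Known destinations: "
        + ", ".join(KNOWN_PLACES)
        + f'. The user said: "{utterance}". '
        + "Reply with exactly one destination from the list, or NONE."
    )
    answer = call_llm(prompt).strip().lower()
    return answer if answer in KNOWN_PLACES else None

print(destination_from_speech("Can you take me to the meeting?"))  # -> "conference room"

Constraining the model's answer to a fixed list of mapped places is one simple way to keep an open-ended language model from steering the robot toward locations it cannot actually reach.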
[2]
AI-powered robotic guide dog that talks could become alternative to service animals
Researchers report that a robotic guide dog can now talk users through a trip, offering route choices before departure and live updates while walking. That new ability turns navigation into a shared conversation, giving blind travelers more control over where they go and what they know along the way.

Inside a large office suite, seven legally blind volunteers used the system on a trip to a conference room. There, Shiqi Zhang at Binghamton University in New York demonstrated a robot that outlined route options before it began to move. Once a participant chose a path, the machine kept the same conversation going by describing hallways and obstacles aloud instead of relying only on leash pressure. That result shows the system's promise, but it also raises the next question: how much safer and more useful spoken guidance becomes during a real walk.

Before each walk, the system turned a spoken request into several possible destinations and laid out more than one route. Route planning used a large language model to keep the exchange conversational instead of rigid. The robot also weighed practical details like travel distance and door openings, then summarized those tradeoffs so the handler could choose. Because route planning happened in plain speech, the machine gave users reasons to consider before moving.

During the trip, the robot provided spoken updates about nearby surroundings, allowing the handler to understand what was changing in real time. Corridors, doors, and obstacles became part of a running explanation, which helped users build a clearer mental map. Unlike a basic alert system, this one described context as well as danger, which supports safer choices at the next turn. Still, the paper notes that description alone is not enough unless the robot can keep pace and move safely.

When six questionnaires were tallied, the version combining route explanations with live narration scored best on utility and communication. Average scores reached 4.83 for usefulness and 4.50 for ease of communication on a five-point scale. Yet the safety rating slipped to 3.83, a reminder that strong communication does not remove trust concerns. The lower score set up the hardest problem ahead, making conversation impressive but full autonomy still unfinished.

Earlier versions from the same Binghamton University research group responded when a handler tugged the leash, letting the human signal direction without speech. That earlier work solved the physical side of teamwork, while the new system tackled conversation and shared planning. Real guide dogs are excellent at movement and safety, but they usually respond to short, trained commands rather than open conversation. Speech let the robot address the gap between moving safely and understanding what the human wants.

Language became useful only because the robot already knew the building map and the labeled places inside it. When someone said they were thirsty, the system could connect that request to a fountain or vending machine. Researchers call that link grounding, matching words to real objects and locations, and it keeps the conversation tied to action. Without that link, a chatty machine could sound capable while sending a user somewhere useless or impossible.

Safety limited what the researchers could test, so a hidden expert controlled the robot's movement during the real walks. That "Wizard of Oz" setup, where a person runs the machine behind the scenes, reduced risk while preserving the conversation test.
Because the robot was not yet fully autonomous, the results say more about communication value than complete real-world readiness. Further progress will depend on longer indoor trips and outdoor testing, which keeps the most difficult questions open.

Good navigation tools do more than avoid collisions; they also help people understand space, timing, and available choices. A related study found that blind participants wanted robotic guides to mirror familiar guide-dog communication. "This is very important for visually impaired or blind people, because situational and scene awareness is relatively limited without vision," said Zhang. Users in the trial seemed to value explanation itself, not just the robot's ability to steer.

Future versions from the SUNY lab will need longer routes, outdoor travel, better autonomy, and faster handling of complex speech. Participants asked many questions afterward and treated the robot guide dog as something they wanted in real life. User excitement points to the real test ahead, where daily trust will matter more than whether a system can complete a demonstration.

If those pieces come together, robotic guide dogs could become a practical option for people who cannot use animals. What this team built was not just a speaking robot, but a new kind of mobility aid that explains itself. Conversation will not replace safe movement, yet it may decide whether robotic guides feel like machines people can truly live with.

The study is published in the Proceedings of the AAAI Conference on Artificial Intelligence.
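The grounding idea described above can be pictured with a small illustrative Python sketch: tag a handful of labeled map locations with the needs they can satisfy, then match the user's words against those tags. The map, the tags, and the matching rule are assumptions made for the example, not the paper's method.

# Hypothetical grounding sketch: connect spoken needs to labeled map locations.
PLACE_AFFORDANCES = {
    "water fountain": {"thirsty", "drink", "water"},
    "vending machine": {"thirsty", "hungry", "drink", "snack"},
    "conference room": {"meeting"},
    "main lobby": {"exit", "seating", "information"},
}

def ground_request(utterance: str) -> list[str]:
    # Return map locations whose tags overlap the words the user spoke.
    words = {w.strip(".,!?").lower() for w in utterance.split()}
    return [place for place, tags in PLACE_AFFORDANCES.items() if tags & words]

print(ground_request("I am thirsty"))  # -> ['water fountain', 'vending machine']

However the matching is done in practice, the key point from the article holds: without some mapping from words to real, reachable places, a conversational robot could sound capable while proposing destinations it cannot deliver.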
[3]
This robotic dog talks with ChatGPT magic and guides the visually impaired
Researchers built a robotic guide dog that talks users through the journey.

Robot dogs aren't some new innovation, but one that can talk back to you sounds like science fiction. Researchers at Binghamton University say they have already built one, and it is meant to help the blind. The team from the university describes an AI-powered robotic guide dog system designed to help visually impaired users navigate indoor spaces while also communicating with them during the journey. The big twist is that it uses large language models (LLMs), specifically GPT-4, to make the robot more conversational and responsive than a traditional guide dog could be.

How does the AI guide dog work?

According to Binghamton University, the system was developed by Shiqi Zhang, an associate professor in the School of Computing, and his team. Zhang stated that the project shows how robotic guide dogs can go beyond the limits of actual guide dogs, which can only understand a small set of commands. Using GPT-4 with voice commands, the AI-powered robot dog gains much stronger conversational capabilities.

The setup isn't just about getting the user from one point to another. Before the trip even begins, the robot can describe the possible routes and estimated travel times. During the journey, it offers what researchers call "scene verbalization," giving real-time spoken feedback about the environment and obstacles ahead. In one example shared by the report, the AI guide dog might say something like "this is a long corridor" while guiding the user to a conference room.

It's already being tested with blind participants

To evaluate the system, the researchers recruited seven legally blind participants and had them navigate a large, multi-room office environment. The participants then completed a questionnaire rating the system's helpfulness, usefulness, and ease of communication. And the results? Users preferred the combined approach of route planning explanation along with live narration during the travel. It isn't about going from point A to point B -- it is about giving users more situational awareness and more control over how they move through a space. And just like how AI is being used to find pets, this is one of those positive headlines around AI.
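As a rough illustration of the "scene verbalization" idea mentioned above, the Python sketch below turns labels from a perception system into short spoken updates and spaces them out so the user is not flooded. The labels, phrases, and timing are invented for the example and are not taken from the study.

# Hypothetical scene-verbalization sketch: perception labels -> short spoken updates.
import time

PHRASES = {
    "corridor": "This is a long corridor.",
    "lobby": "You're passing by the main lobby, an open area with seating.",
    "door": "There is a door ahead.",
}
_last_spoken = 0.0

def verbalize(scene_labels: list[str], min_gap_s: float = 3.0) -> str | None:
    # Speak at most one update every few seconds; in a real system the
    # returned string would be sent to a text-to-speech engine.
    global _last_spoken
    now = time.monotonic()
    if now - _last_spoken < min_gap_s:
        return None
    for label in scene_labels:
        if label in PHRASES:
            _last_spoken = now
            return PHRASES[label]
    return None

print(verbalize(["corridor", "person"]))  # -> "This is a long corridor."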
[4]
These AI-Powered Guide Dogs Don't Just Lead - They Talk | Newswise
Newswise -- Guide dogs are powerful allies, leading the visually impaired safely to their destinations, but they can't talk with their owners -- until now. Using large language models, a team of researchers at Binghamton University, State University of New York has created a talking robot guide dog system that determines an ideal route and safely guides users to their destination, offering real-time feedback along the way.

"For this work, we're demonstrating an aspect of the robotic guide dog that is more advanced than biological guide dogs," said Shiqi Zhang, an associate professor at the Thomas J. Watson College of Engineering and Applied Science's School of Computing. "Real dogs can understand around 20 commands at best. But for robotic guide dogs, you can just put GPT-4 with voice commands. Then it has very strong language capabilities."

Zhang and his team had previously trained robot guide dogs to lead the visually impaired by responding to a tug on the leash. This new system takes their work a step further, creating a spoken back-and-forth between user and dog, and providing more control and situational awareness. The robot offers information about a route before departure (what the researchers call plan verbalization) and information during travel (scene verbalization). "This is very important for visually impaired or blind people, because situational and scene awareness is relatively limited without vision," Zhang said.

To test the system, the team recruited seven legally blind participants to navigate a large, multi-room office environment. The robot would ask the user where they wanted to go (in this experiment, a conference room) and then present possible routes to the room and the time it would take to reach it. Once the user selected a preferred route, the robot would guide them to the conference room, verbalizing the surroundings and obstacles along the way (such as "this is a long corridor") until it reached the destination.

Following the test, the users completed a questionnaire about their experience, rating the system's helpfulness, ease of communication, and usefulness. Overall, a combined approach -- which included planning explanations and real-time narration from the robot -- was preferred among participants. A simulated study of the system also showed that this approach was successful.

Going forward, the team plans to conduct more user studies, increase the system's autonomy, and have the robots navigate longer distances, both indoors and outdoors. The goal of this research is to help integrate robotic guide dogs into everyday life. The study participants were enthusiastic about this possibility. "They were super excited about the technology, about the robots," Zhang said. "They asked many questions. They really see the potential for the technology and hope to see this working."

The paper, "From Woofs to Words: Towards Intelligent Robotic Guide Dogs with Verbal Communication," was presented at the 40th Annual AAAI Conference on Artificial Intelligence, one of the largest academic AI conferences in history.
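For a sense of what "plan verbalization" could involve, here is a small hypothetical Python sketch that turns candidate routes into a spoken summary with rough walk-time estimates. The route names, lengths, and walking speed are assumed values chosen for illustration, not figures from the study.

# Hypothetical plan-verbalization sketch: candidate routes -> spoken summary.
WALKING_SPEED_M_PER_S = 0.8  # assumed comfortable guided-walking pace

def describe_routes(routes: dict[str, float]) -> str:
    # routes maps a route description to its length in meters.
    lines = []
    for i, (name, length_m) in enumerate(routes.items(), start=1):
        minutes = length_m / WALKING_SPEED_M_PER_S / 60
        lines.append(f"Option {i}: {name}, about {minutes:.1f} minutes.")
    return " ".join(lines)

print(describe_routes({
    "through the main lobby": 95.0,
    "along the back corridor": 120.0,
}))
# -> "Option 1: through the main lobby, about 2.0 minutes. Option 2: along the back corridor, about 2.5 minutes."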
Researchers at Binghamton University developed an AI-powered guide dog using GPT-4 that converses with blind users, offering route options and real-time environmental updates. Seven legally blind participants tested the system, rating combined verbal and physical guidance highly. The innovation addresses the shortage of traditional service animals, which cost $20,000-50,000 and serve only 2-5% of the blind community.
Researchers at Binghamton University have developed a robotic guide dog that goes beyond traditional service animals by adding a capability no canine could match: conversation. Led by associate professor Shiqi Zhang, the team equipped a Unitree Go2 robotic dog with large language models, specifically GPT-4, to create an assistive technology that can discuss routes, describe surroundings, and respond to complex voice commands [1][4].

The innovation addresses a critical gap in accessibility. Traditional seeing-eye dogs require extensive training, with only 50-60% graduating from programs, and cost between $20,000 and $50,000. As a result, only 2-5% of the blind community can access these animals [1]. The robotic alternative promises to expand access while offering enhanced situational awareness through verbal communication.

The AI robotic guide dog system operates through what researchers call "plan verbalization" and "scene verbalization." Before departure, the robot asks users where they want to go, then presents multiple route options with estimated travel times. During the journey, it provides real-time spoken feedback about the environment, such as "this is a long corridor" or "you're passing by the main lobby, which is an open area with seating and information desks" [1][2].

Shiqi Zhang emphasizes the advantage over biological guide dogs: "Real dogs can understand around 20 commands at best. But for robotic guide dogs, you can just put GPT-4 with voice commands. Then it has very strong language capabilities" [1][4].

The system relies on grounding: matching spoken words to real objects and locations within a mapped environment. When a user mentions thirst, the robot can connect that request to a fountain or vending machine, keeping conversation tied to actionable navigation [2].

Seven legally blind participants navigated a large, multi-room office environment to test the system. Following their experience, they completed questionnaires rating helpfulness, ease of communication, and usefulness on a five-point scale. The combined approach of route planning and navigation with real-time narration scored 4.83 for usefulness and 4.50 for ease of communication [2][3].

Participants preferred the combination of verbal and physical guidance through leash pressure rather than being pulled along without explanation. However, the safety rating slipped to 3.83, which researchers attribute to unfamiliarity with walking alongside a robot rather than a traditional guide dog [1][2].

Despite trust concerns about safety, participants showed enthusiasm for the technology. "They were super excited about the technology, about the robots," Zhang said. "They asked many questions. They really see the potential for the technology and hope to see this working" [1][4].

The research team plans to conduct more studies with longer indoor distances and outdoor testing. Increasing system autonomy remains a priority, as current demonstrations used a "Wizard of Oz" setup in which a hidden expert controlled the robot's movement during walks to reduce risk while preserving the conversation test [2]. In additional testing, GPT-4 successfully navigated the dog through 77 different scenarios using natural language commands [1]. The team presented their paper "From Woofs to Words: Towards Intelligent Robotic Guide Dogs with Verbal Communication" at the 40th Annual AAAI Conference on Artificial Intelligence in January [4].

To assist visually impaired users effectively, robotic guide dogs must prove they can handle complex environments beyond controlled office settings. The lower safety rating signals that building daily trust will matter more than demonstration success. Yet user autonomy through conversation represents a meaningful step forward, turning navigation into a shared dialogue rather than simple commands. If researchers can solve autonomy challenges and outdoor navigation, AI-powered guide dogs could become a practical mobility aid that explains itself while moving safely through the world.
Summarized by
Navi