Binghamton researchers create robotic guide dogs that walk — and talk

Scientists at Binghamton University have developed a robotic guide dog that communicates with the visually impaired and provides real-time feedback during travel.


Guide dogs are powerful allies, leading the visually impaired safely to their destinations, but they can’t talk with their owners — until now.

Using large language models (LLMs), a team of researchers at Binghamton University, part of the State University of New York, has created a talking robot guide dog. The system can determine an ideal route and safely guide users to their destinations, offering real-time feedback along the way.

“For this work, we’re demonstrating an aspect of the robotic guide dog that is more advanced than biological guide dogs,” said Shiqi Zhang, an associate professor at the Thomas J. Watson College of Engineering and Applied Science’s School of Computing.

“Real dogs can understand around 20 commands at best,” he noted. “But for robotic guide dogs, you can just put GPT-4 with voice commands. Then it has very strong language capabilities.”

Binghamton researchers teach robotic dogs new tricks

Zhang and his team had previously trained robot guide dogs to lead the visually impaired by responding to a tug on the leash. This new system takes their work a step further, creating a spoken exchange between user and dog, and providing more control and situational awareness.

Shiqi Zhang, an associate professor at Binghamton University's School of Computing, developed the robot guide dog system with his students. Image Credit: Jonathan Cohen


The quadruped robot offers information about a route before departure — what the researchers called “plan verbalization” — and information during travel, or “scene verbalization.”

“This is very important for visually impaired or blind people, because situational and scene awareness is relatively limited without vision,” Zhang said.

To test the system, the team recruited seven legally blind participants to navigate a large, multi-room office environment. The robot would ask each user where they wanted to go (in this experiment, a conference room) and then present possible routes along with the time each would take.

Once the user selected a preferred route, the robot would guide them to the conference room, verbalizing the surroundings and obstacles along the way, such as “this is a long corridor,” until it reached the destination.

Following the test, the users completed a questionnaire rating the system’s helpfulness, ease of communication, and usefulness. Overall, the participants said they preferred a combined approach that included both pre-departure planning explanations and real-time narration from the robot. A simulation study of the system supported the same conclusion.

Similar robot guide dogs have been developed at the University of Glasgow, and past RoboBusiness Pitchfire winner Glidance created a wheeled assistive system.

Editor’s note: At the 2026 Robotics Summit & Expo on May 27 and 28 in Boston, there will be sessions on embodied AI and physical AI. Registration is now open.


More studies to train dogs for daily life

The Binghamton University team said it plans to conduct more user studies, increase the system’s autonomy, and have the robots navigate longer distances, both indoors and outdoors.

The goal of this research is to help integrate robotic guide dogs into everyday life. The study participants were enthusiastic about this possibility, according to Zhang.

“They were super excited about the technology, about the robots,” he said. “They asked many questions. They really see the potential for the technology and hope to see this working.”

The paper, “From Woofs to Words: Towards Intelligent Robotic Guide Dogs with Verbal Communication,” was presented at the 40th Annual AAAI Conference on Artificial Intelligence, one of the largest academic AI conferences.

The post Binghamton researchers create robotic guide dogs that walk — and talk appeared first on The Robot Report.
