How thinking makes a robot dog walk

With just a thought, the machine autonomously plans its route, avoids obstacles and goes precisely to a designated location.


At Xi’an Jiaotong University in northwest China, Professor Xu Guanghua and his team have made this science fiction scene a reality by integrating electroencephalogram control into autonomous navigation.

At the heart of the achievement is the non-invasive brain-computer interface (BCI), which captures neuronal electrical signals to control mechanical devices, Xu explained.

He described the system as a kind of “remote control in your mind”. When the user forms an intention, such as “move forward”, the brain generates EEG signals that the system decodes, translates into commands and sends to the robot dog, which executes the movement.

The system supports 11 basic mental commands, such as forward, backward and turn, with potential for expansion. The recognition accuracy exceeds 95% and the delay between thought and action is about one second.

Amid a global surge in BCI research, invasive technologies offer high precision but rely on surgical implantation, carrying risks of trauma, infection, immunological rejection and signal degradation over time – factors that make them expensive and difficult to scale up.

In contrast, the non-invasive approach chosen by Xu’s team is safe, cost-effective, easy to use and suitable for a wide range of applications, particularly in rehabilitation medicine and consumer settings.

However, non-invasive signals are less precise, making continuous real-time control difficult. Xu noted that requiring the user to mentally direct each individual movement would be demanding and impose intense mental strain, defeating the purpose of technological empowerment.

To solve this, the team adopted a human-machine collaboration model with clearly defined roles. “Humans are only responsible for issuing high-level intentions, like where to go — the part of decision-making that the brain handles most easily,” Xu said.

“Meanwhile, high-precision, high-speed repetitive tasks such as autonomous navigation, environmental perception, dynamic obstacle avoidance and movement execution are handled entirely by the machine’s own intelligent systems,” he added.

The approach increases efficiency and stability, overcomes the limitations of non-invasive signals and maximizes the complementarity between human decision-making and machine execution, bringing BCI closer to practical application.

Xu said advancing BCI requires progress in core technologies and integration with areas such as AI, autonomous navigation and intelligent perception. His team follows this dual path: innovating to overcome the deficiencies of non-invasive interfaces and aligning development with real-world needs.

Xu envisions brain-computer interaction systems that seamlessly combine human decision-making with machine intelligence, ultimately making robots capable assistants in everyday life.

He stated that the robot dog holds promise as an aid for people with disabilities, as well as for applications in elderly care, medical assistance, rehabilitation training and intelligent companionship.


This text was published by Xinhua Agency at 1:20 pm on March 30, 2026 and adapted for publication by Poder360.