Shropshire Star

This robot performs actions controlled by human thought

Researchers have designed a system that allows machines to use a person’s brainwaves and hand gestures to do tasks.


Controlling robots with the mind isn’t as implausible as it sounds – if the technology developed by a team of US engineers is anything to go by.

Researchers at the Massachusetts Institute of Technology (MIT) have designed a system that allows a robot to use a person’s brainwaves and hand gestures to make decisions and act on them.

The team at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) used Baxter, a humanoid robot created by robotics company Rethink Robotics, for their research.

They developed a system that allows humans to direct machines to perform tasks just by thinking about them.

If the person notices an error as the robot performs a task, he or she can use hand gestures to “scroll through and select the correct option for the robot to execute”, according to the researchers.

The system uses a combination of electroencephalography (EEG), a test used to evaluate the electrical activity of the brain, and electromyography (EMG), a technique that measures the electrical signals produced by motor neurons.
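In rough terms, the EEG signal acts as a veto on the robot’s choice while the EMG signal supplies the correction. The following minimal Python sketch illustrates that idea only; it is not the MIT team’s code, and the detector functions, option names and probabilities are hypothetical stand-ins.

import random

# Hypothetical list of actions the robot could execute.
OPTIONS = ["drill target A", "drill target B", "drill target C"]

def read_eeg_error_signal():
    """Stand-in for an EEG error detector: returns True when the
    observer's brain activity suggests the robot chose wrongly."""
    return random.random() < 0.3  # simulated 30% chance of a flagged error

def read_emg_gesture():
    """Stand-in for an EMG gesture classifier on the forearm:
    -1 = gesture left, +1 = gesture right, 0 = no gesture."""
    return random.choice([-1, 0, 1])

def supervise_robot_choice(initial_index=0, max_corrections=5):
    """Let the human veto the robot's choice (EEG) and scroll to a
    different option with hand gestures (EMG)."""
    index = initial_index
    for _ in range(max_corrections):
        if not read_eeg_error_signal():
            break  # no error detected: accept the current option
        index = (index + read_emg_gesture()) % len(OPTIONS)
    return OPTIONS[index]

if __name__ == "__main__":
    print("Robot will execute:", supervise_robot_choice())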

Daniela Rus, director of CSAIL, said: “This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback.

“By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”

The researchers say their technology improved the accuracy of Baxter’s decision-making from 70% to 97%.

Joseph DelPreto, a PhD student at MIT and lead author of the study, said: “What’s great about this approach is that there’s no need to train users to think in a prescribed way.

“The machine adapts to you, and not the other way around.”

Controlling a robot with the mind (MIT CSAIL/YouTube)

The team believes the system could one day be useful for the elderly, or workers with language disorders or limited mobility.

Ms Rus said: “We’d like to move away from a world where people have to adapt to the constraints of machines.

“Approaches like this show that it’s very much possible to develop robotic systems that are a more natural and intuitive extension of us.”

The research is published in an open-access paper.
