Robot that reads your mind to train itself

Researchers at the University of Washington hope to use brain-computer interface technology to teach robots new skills directly via brain signals.

Researchers at the Neural Systems Laboratory, University of Washington, hope to take brain-computer interface (BCI) technology to the next level by attempting to teach robots new skills directly via brain signals.

Robotic surrogates that offer paralysed people the freedom to explore their environment, manipulate objects, or simply fetch things have been the holy grail of BCI research for a long time.

The researchers began by programming a humanoid robot with simple behaviours which users could then select with a wearable electroencephalogram (EEG) cap that picked up their brain activity.

The brain involuntarily generates what is known as a P300, or P3, signal each time it recognises an object. The signal is produced by millions of neurons firing together in a synchronised fashion.
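To make the idea concrete, the sketch below shows one common way a P300 can be pulled out of noisy EEG: average many stimulus-locked epochs so the background activity cancels while the time-locked deflection survives. The sampling rate, window, threshold, and synthetic data are illustrative assumptions, not details of the UW system.

```python
# A minimal sketch of P300 detection by epoch averaging, on synthetic data.
import numpy as np

FS = 250                  # assumed sampling rate (Hz)
WINDOW = int(0.6 * FS)    # 600 ms epoch following each stimulus flash

def average_epochs(epochs):
    """Average time-locked epochs: background EEG cancels toward zero,
    while a stimulus-locked P300 deflection (~300 ms) survives."""
    return np.mean(epochs, axis=0)

def detect_p300(epochs, threshold=0.5):
    """Return True if the averaged, z-scored epoch is elevated in the
    250-450 ms window where a P300 is expected."""
    avg = average_epochs(epochs)
    z = (avg - avg.mean()) / avg.std()
    lo, hi = int(0.25 * FS), int(0.45 * FS)
    return z[lo:hi].mean() > threshold

# Synthetic demo: 20 epochs of noise, with a P300-like bump added to targets.
rng = np.random.default_rng(0)
t = np.arange(WINDOW) / FS
bump = 3.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))   # peak at 300 ms
print(detect_p300(rng.normal(size=(20, WINDOW)) + bump))   # True  (target)
print(detect_p300(rng.normal(size=(20, WINDOW))))          # False (non-target)
```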

The team's initial goal was for the user to send a command that the robot would translate into a movement.

“But this required programming the robot with a pre-defined set of very basic behaviours,” the BBC quoted Rajesh Rao, senior investigator, as saying.

Rao's latest robot prototype is Mitra, meaning friend. It's a two-foot-tall humanoid that can walk, look for familiar objects, and pick up or drop off objects. The team is building a BCI that could be used to train Mitra to walk to different locations within a room.

The team reasoned that giving the robot the ability to learn might just be the trick to allow a greater range of movements and responses.

A truly adaptive brain-robot interface that allowed paralysed patients to teach a robot directly could be immensely helpful, freeing them from input devices such as a mouse, keyboard, or touch-screen that are designed for more able-bodied users.

Using BCIs can also be a time-consuming and clumsy process, since it takes a while for the system to accurately identify the brain signals.

Once a person puts on the EEG cap, she can choose, through a menu, either to teach the robot a new skill or to execute a known command.
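The flow might look like the hypothetical sketch below. The command names and the keyboard stand-in for selection are assumptions for illustration; in the real interface the choice would come from the EEG classifier, not typed input.

```python
# Hypothetical sketch of the two-mode menu. select_with_p300() is a
# keyboard stand-in: the real interface flashes each option and picks
# the one whose flash evokes a P300 in the user's EEG.

KNOWN_COMMANDS = ["walk forward", "turn left", "pick up object"]

def select_with_p300(options):
    """Stand-in for BCI selection; reads a number from the keyboard."""
    for i, option in enumerate(options):
        print(f"[{i}] {option}")
    return options[int(input("select> "))]

def main_menu():
    mode = select_with_p300(["teach a new skill", "execute a known command"])
    if mode == "teach a new skill":
        print("entering teaching mode...")
    else:
        print("executing:", select_with_p300(KNOWN_COMMANDS))

if __name__ == "__main__":
    main_menu()
```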

In the 'teaching' mode, machine-learning algorithms would map the sensor readings the robot receives to the appropriate commands.
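The article does not say which algorithms the team uses, but the flavour of 'teaching' can be sketched with a deliberately simple learner: record (sensor reading, command) pairs while the user guides the robot, then pick the command whose recorded examples lie closest to the current reading. Everything here, from the feature choice to the nearest-centroid rule, is an assumption for illustration.

```python
# A toy skill learner: maps sensor readings to commands from examples.
import numpy as np

class SkillLearner:
    def __init__(self):
        self.examples = {}          # command -> list of sensor vectors

    def record(self, sensors, command):
        """Store one (sensor reading, command) pair from a training episode."""
        self.examples.setdefault(command, []).append(np.asarray(sensors, float))

    def predict(self, sensors):
        """Pick the command whose training examples lie closest on average."""
        sensors = np.asarray(sensors, float)
        centroids = {c: np.mean(v, axis=0) for c, v in self.examples.items()}
        return min(centroids, key=lambda c: np.linalg.norm(sensors - centroids[c]))

# Toy demonstration: sensors = (distance to target, heading error).
learner = SkillLearner()
learner.record([2.0,  0.6], "turn left")
learner.record([2.1, -0.5], "turn right")
learner.record([1.5,  0.0], "walk forward")
learner.record([0.1,  0.0], "stop")

print(learner.predict([1.4, 0.1]))   # -> "walk forward"
```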

If the robot successfully learns the new behaviour, the user can ask the system to store it as a new high-level command that will appear in the list of available choices the next time.

"The resulting system is adaptive and hierarchical — adaptive because it learns from the user and hierarchical because new commands can be composed as sequences of previously learned commands," Rao said.

The major challenge at the moment is getting the system to be accurate, given how noisy EEG signals can be.

"While EEG can be used to teach the robot simple skills such as navigating to a new location, we do not expect to be able to teach the robot complex skills that involve fine manipulation, such as opening a medicine bottle or tying shoelaces," said Rao.

It may be possible to attain a finer degree of control either by using an invasive BCI or by allowing the user to select from videos of useful human actions that the robot could attempt to learn from.
