Scientists have used fMRI scans of a human in Israel to control the movements of a robot body in France. New Scientist reports the unprecedented achievement, the culmination of research that aims to give people who are “locked in” the chance to interact with the world through a surrogate body.
Scientists have made a robot move on a human’s behalf by monitoring thoughts about movement. Situated inside an fMRI scanner in Israel, Tirosh Shapira has controlled a humanoid robot some 2000 kilometers (1250 miles) away, at the Béziers Technology Institute in France, using just his mind. The person controlling the robot could also see through the eyes of his electronic surrogate.
The fMRI (functional magnetic resonance imaging) scanner reads his thoughts, a computer translates those thoughts into commands, and the commands are sent across the internet to the robot in France. The system requires training so that a particular “thought” (blood-flow pattern) maps to a specific command. In this case, when Shapira thinks about moving forward or backward, the robot moves forward or backward; when Shapira thinks about moving one of his hands, the robot surrogate turns in that direction.
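The decode-and-dispatch step described above can be sketched in a few lines. This is purely illustrative: the pattern IDs, the `decode_intent` function, and the command names are all hypothetical stand-ins, not the actual system used by the researchers.

```python
# Hypothetical sketch of the thought-to-command translation loop.
# A real classifier would label live fMRI activation patterns; here the
# trained mapping is stubbed out as a simple lookup table.

TRAINED_PATTERNS = {
    "pattern_forward": "MOVE_FORWARD",    # thinking of moving forward
    "pattern_backward": "MOVE_BACKWARD",  # thinking of moving backward
    "pattern_left_hand": "TURN_LEFT",     # thinking of moving the left hand
    "pattern_right_hand": "TURN_RIGHT",   # thinking of moving the right hand
}

def decode_intent(pattern_id: str) -> str:
    """Map a classified blood-flow pattern to a robot command.

    Unrecognised patterns fall back to HOLD so the robot does
    nothing rather than something unintended.
    """
    return TRAINED_PATTERNS.get(pattern_id, "HOLD")

def control_loop(classified_patterns):
    """Translate a stream of classified patterns into commands to send."""
    return [decode_intent(p) for p in classified_patterns]

if __name__ == "__main__":
    stream = ["pattern_forward", "pattern_left_hand", "pattern_unknown"]
    print(control_loop(stream))
```

In the real setup these commands would be sent over the internet to the robot in France, and the camera feed would close the loop back to the scanner.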
To complete the loop, the robot has a camera on its head, with the image being displayed in front of Shapira. Speaking to New Scientist, it sounds like Shapira really became one with the robot: “It was mind-blowing. I really felt like I was there, moving around,” he says. “At one point the connection failed. One of the researchers picked the robot up to see what the problem was and I was like, ‘Oi, put me down!’”
Using brain scanners is a step beyond current efforts to link humans and machines. Much recent work has involved teleoperated robots, in which humans manipulate controls such as joysticks to make a robot move.
The proof-of-concept experiment linked student Tirosh Shapira, in a lab at Bar-Ilan University in Israel, with a small two-legged robot thousands of kilometers away in France.
Prior to connecting the two, researchers had Mr Shapira think about different sorts of movements and developed software that could quickly spot his intention.
The result, reported the magazine, was that he could control the robot in almost real time.
The illusion of embodiment was tested by surprising Mr Shapira with a mirror so he could see his robot self – a test that convinced him he was present in the French lab.
The next step for the research is to refine it to use a type of scanning that works with a skull cap, rather than an fMRI machine a person has to lie inside. The robot representing the human is also to be upgraded to a version with a stature and gait closer to a real person's.
The research is part of an international project called Virtual Embodiment and Robotic Re-Embodiment that aims to refine ways to link people and surrogates in both virtual environments and the real world.
Work is being done on medical applications of the technology, but the researchers warned that it is a long way from being able to help anyone yet. In a few years, though, you might be able to put on a brain-computer interface, or another wearable device like a BCI-compatible Google Glass, and control a robot avatar in the physical world.