Researchers at the CNRS-AIST Joint Robotics Laboratory, a Japanese-French collaboration, are working on ways to control robots via brain-computer interfaces (BCI). Their goal is to create devices that allow people to feel embodied in the body of a humanoid robot. To achieve this, the scientists interpret users' brainwave signals to read their intentions. The research could one day give a paralyzed patient autonomy through a robotic avatar. The work is conducted in a joint laboratory established between CNRS (Centre National de la Recherche Scientifique), a French public research organization, and AIST (the National Institute of Advanced Industrial Science and Technology), located in Tsukuba, Japan, at AIST's Intelligent Systems Research Institute.
The researchers from both countries collaborate closely on increasing robots' functional autonomy, using a humanoid robot as their main platform. The robot performs preset actions such as walking forward or turning left or right. The BCI under development uses flashing symbols to control where the robot moves and how it interacts with its environment: EEG (electroencephalography) sensors read the user's brainwave activity, and a signal processor interprets the signals and sends the desired command to the robot.
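The flashing-symbol approach described above typically works by having each on-screen symbol flicker at a distinct frequency; the EEG response is strongest at the frequency of the symbol the user attends to. The sketch below is a minimal, illustrative decoder along those lines — the frequencies, command names, and sampling rate are assumptions for the example, not details of the lab's actual system.

```python
import math

# Hypothetical mapping from flicker frequency (Hz) to robot command.
# Both the frequencies and the command names are illustrative assumptions.
SYMBOLS = {6.0: "walk_forward", 8.0: "turn_left",
           10.0: "turn_right", 12.0: "grasp_object"}

def bandpower(signal, fs, freq):
    """Estimate signal power at `freq` via a single-bin DFT
    (correlation with sine and cosine at that frequency)."""
    s = sum(x * math.sin(2 * math.pi * freq * i / fs)
            for i, x in enumerate(signal))
    c = sum(x * math.cos(2 * math.pi * freq * i / fs)
            for i, x in enumerate(signal))
    return (s * s + c * c) / len(signal)

def decode_command(eeg_epoch, fs=256.0):
    """Return the command whose flicker frequency shows the
    strongest response in this one-channel EEG epoch."""
    powers = {f: bandpower(eeg_epoch, fs, f) for f in SYMBOLS}
    return SYMBOLS[max(powers, key=powers.get)]

# Synthetic 1-second epoch: user attends to the 8 Hz symbol.
fs = 256.0
epoch = [math.sin(2 * math.pi * 8.0 * i / fs) for i in range(int(fs))]
print(decode_command(epoch, fs))  # → turn_left
```

In a real system the epoch would come from multiple EEG channels after filtering and artifact rejection, and the decoded command would be sent to the robot's motion controller rather than printed.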
The main research subjects include task and motion planning and control, reactive behavior control, and human-robot cooperation through a multimodal interface integrating brain-computer interfaces, vision, and haptics.
The targeted applications would let tetraplegic or paraplegic users navigate through the robot; for instance, a paraplegic patient in Rome could pilot a humanoid robot for sightseeing in Japan.