True, a robotic arm parked next to his wheelchair did the touching, painstakingly, palm to palm. But Tim Hemmes made that arm move just by thinking about it. For the first time in the seven years since a motorcycle accident left him a quadriplegic, Mr. Hemmes was reaching out to someone – even if it was only temporary, part of a monthlong science experiment at the University of Pittsburgh, where the brain-computer interface research was funded with $800,000.
“It wasn’t my arm but it was my brain, my thoughts. I was moving something,” Mr. Hemmes says. “I don’t have one single word to give you what I felt at that moment. That word doesn’t exist.”
The goal is a “Star Trek”-like melding of mind and machine, combining one of the most humanlike bionic arms, whose fingers even bend like real ones, with tiny chips implanted in the brain. Those electrodes tap into electrical signals from brain cells that command movement. Bypassing a broken spinal cord, they relay those signals to the robotic arm. The research is years away from commercial use, but numerous teams are investigating different methods.
As we’ve reported before, at the University of Pittsburgh monkeys learned to feed themselves marshmallows by thinking a robot arm into motion. At Duke University, monkeys used their thoughts to move virtual arms on a computer and got feedback that let them distinguish the texture of what they “touched.”
“We really are at a tipping point now with this technology,” says Michael McLoughlin of the Johns Hopkins University Applied Physics Laboratory, which developed the humanlike arm in a $100 million project for DARPA, the Pentagon’s research agency.
Pittsburgh is helping to lead a closely watched series of government-funded studies over the next two years to put that claim to the test. A handful of quadriplegic volunteers will train their brains to operate the DARPA arm in increasingly sophisticated ways, even using sensors implanted in its fingertips to try to feel what they touch, while scientists explore which electrodes work best.
“Imagine all the joints that are in your hand. There’s 20 motions around all those joints,” says Pittsburgh neurobiologist Andrew Schwartz. “It’s not just reaching out and crudely grasping something. We want them to be able to use the fingers we’ve worked so hard on.”
Mr. Hemmes, 30, faced a much simpler first step. He was testing whether a new type of chip, which for safety reasons the Food and Drug Administration let stay on this initial volunteer’s brain for just a month, could allow for three-dimensional arm movement.
He surprised researchers the day before the electrodes were removed. The robotic arm whirred as Mr. Hemmes’ mind pushed it forward to hesitantly tap palms with a scientist.
Then his girlfriend beckoned. The room abruptly hushed. Mr. Hemmes painstakingly raised the black metal hand again and slowly rubbed its palm against hers a few times.
Tim Hemmes’ operation took two hours. He had practiced imagining arm movements inside brain scanners, to see where the electrical signals concentrated. That’s where neurosurgeon Elizabeth Tyler-Kabara cut, attaching the chip through an inch-wide opening on the left side of Mr. Hemmes’ skull.
Two days later, Mr. Hemmes was hooked to a computer, beginning simple cursor movements. The next week, it was time to test if he could trigger real-life movement using the arm. He reclined in his wheelchair, the robot arm bolted to a steel rod nearby. The task: make the arm reach out to grasp a ball mounted on a board.
The arm whirs forward, then stops, then goes again, then suddenly pulls back. “It’s doing the opposite of what I ask it to do,” Mr. Hemmes says in frustration. “When I think about reaching back, it goes forward.” Dr. Wei Wang, a member of the research team, watches Mr. Hemmes’ brain patterns on a nearby computer screen, trying to match them to the robotic movements. “Focus on your elbow,” he advises.
Mr. Hemmes takes a deep breath and tries. The arm whirs forward this time, reaching the ball. The fingers clench around it.