Brain implant revives some feelings of touch in a paralyzed man

Mind-controlled robot arms can now generate feelings of touch, based on new research from the University of Pittsburgh Medical Center. The study, published today in Science Translational Medicine, represents a first for brain-computer interfaces and marks a major step toward robotic prosthetic arms that tetraplegics can use to hold objects.

“One of the reasons providing sensation is really important is when you reach out to pick something up, it’s that sense of touch that allows you to hold the object properly,” Robert Gaunt, the project’s leader and a physical medicine and rehabilitation researcher at Pitt, told the NewsHour. That’s because to touch an object like an apple, your brain requires two things: movement and feeling. Movement wraps your hand around the apple, while feeling registers what’s happening.

A decade ago, one of Gaunt’s subjects — Nathan Copeland — broke his neck in a car accident on a rainy night. The crash caused severe spinal damage that left Copeland, who resides in western Pennsylvania, unable to feel or move his legs and lower arms. At the time, he was an 18-year-old freshman in college, and soon after the accident, he enrolled in clinical trials at Pitt Medical Center.

Copeland ultimately agreed to an experiment to revive touch in his hands. Here’s how it works. The brain region behind movement of the hand — the motor cortex — sits right next to the area in charge of feeling — the somatosensory cortex. In 2012, researchers in Gaunt’s department showed how a paralyzed person could control a robotic arm via a brain chip plugged into their motor cortex. (Remember the viral video of Jan Scheuermann eating chocolate?)

Brain scanning with magnetoencephalography identifies regions in the somatosensory cortex that were responsive when Nathan Copeland imagined and/or was touched on the palm (yellow), little finger (orange), index finger (purple) and thumb (red).


Gaunt’s team added a second brain chip, this one in the somatosensory cortex, to provide the touch feedback that accompanies movement. They scanned Copeland’s brain while researchers stroked his real fingers or while he imagined being touched.

The team noted which somatosensory areas subsequently lit up — creating a map so they knew where to insert the feedback implant.

This device sticks tiny electrodes into brain tissue that beam pulses of electricity into the somatosensory cortex when the scientists move or touch corresponding parts of a robotic hand. Gaunt said the first three weeks were nerve-racking because Copeland felt nothing beyond random sensations:

“So you can imagine, we’d been working on this project for years getting ready for it, and my apprehension was building. We have 64 electrodes, little contacts implanted in the brain, we scan through each one, stimulate, and say ‘Did you feel anything? Did you feel that?’ And then finally after about three weeks he paused for a second and instead of saying no, he said ‘Yeah, yeah I felt that.’ And I just felt a great sense of relief, and when that happened, a lot of people in the background were cheering, but he was very cool, calm and collected about the whole thing.”
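The electrode-by-electrode mapping Gaunt describes can be pictured as a simple loop: stimulate each of the 64 contacts in turn, ask whether the participant felt anything, and record the responsive sites. The sketch below is purely illustrative — the function names, the mock `stimulate` call, and the example site numbers are assumptions, not the lab’s actual software or data.

```python
def stimulate(electrode_id, responsive_sites):
    """Mock stimulation: returns True if the participant reports a sensation.

    In the real experiment this would deliver a pulse through one implanted
    electrode and record the participant's verbal yes/no response.
    """
    return electrode_id in responsive_sites


def map_electrodes(num_electrodes, responsive_sites):
    """Scan every electrode once and return those that evoked a sensation."""
    sensation_map = []
    for electrode_id in range(num_electrodes):
        if stimulate(electrode_id, responsive_sites):
            sensation_map.append(electrode_id)
    return sensation_map


# Example run: 64 electrodes with three made-up responsive sites.
responsive = {3, 17, 42}
print(map_electrodes(64, responsive))  # → [3, 17, 42]
```

The value of a map like this is that it tells the researchers which contacts to use later — and which body regions each one corresponds to — when pairing stimulation with the robotic hand.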

Copeland reported the feelings as being strongest when the researchers touched the bases of his index, middle, ring and pinky fingers. The electrodes failed to create sensations in the fingertips or the thumb. These feelings continued without fading for six months, but the sensations are not quite on par with natural touch. In study surveys, Copeland described them as being “almost like if you pushed there, but I didn’t quite feel…the touch,” or as “almost natural.”

Still, when the team blindfolded Copeland and asked him to identify fingers while they were being touched, Copeland successfully did so 84 percent of the time.

The project’s next step merges movements with feelings in the robotic arm, Gaunt said. Right now, they’re conducting experiments where Copeland reaches for objects, grasps them and then reports how they feel.

“One takeaway is even in cases where people have had these serious injuries, the brain itself remains healthy. It’s just a person’s ability to get that brain to communicate with the outside world that in some cases has been lost,” Gaunt said. “So with this technology we can start to bridge that gap and enable a person who hasn’t felt anything maybe in a long, long time to feel things again.”

Leigh Anne Tiffany contributed to reporting this story.