Most able-bodied people take their ability to perform simple daily tasks for granted—when they reach for a warm mug of coffee, they can feel its weight and temperature and adjust their grip accordingly so that no liquid is spilled. People with full sensory and motor control of their arms and hands can feel that they’ve made contact with an object the instant they touch or grasp it, allowing them to start moving or lifting it with confidence.
But those tasks become much more difficult when a person operates a prosthetic arm, let alone a mind-controlled one.
In a paper published in Science, a team of bioengineers from the University of Pittsburgh Rehab Neural Engineering Labs describes how adding brain stimulation that evokes tactile sensations makes it easier for the operator to manipulate a brain-controlled robotic arm. In the experiment, supplementing vision with artificial tactile perception cut the time spent grasping and transferring objects roughly in half, from a median of 20.9 seconds to 10.2 seconds. McGowan Institute for Regenerative Medicine affiliated faculty member Michael Boninger, MD, Professor in the Department of Physical Medicine & Rehabilitation at the University of Pittsburgh School of Medicine, with joint appointments in the Departments of Bioengineering and Rehabilitation Science and Technology and in the Clinical and Translational Science Institute, is a co-author on this study.
“In a sense, this is what we hoped would happen—but perhaps not to the degree that we observed,” said co-senior author Jennifer Collinger, PhD, associate professor in the Pitt Department of Physical Medicine and Rehabilitation. “Sensory feedback from limbs and hands is hugely important for doing normal things in our daily lives, and when that feedback is lacking, people’s performance is impaired.”
Study participant Nathan Copeland, whose progress was described in the paper, is the first person in the world to be implanted with tiny electrode arrays not just in his brain’s motor cortex but in his somatosensory cortex as well—a region of the brain that processes sensory information from the body. The arrays allow him not only to control the robotic arm with his mind, but also to receive tactile sensory feedback, similar to how neural circuits operate when a person’s spinal cord is intact.
“I was already extremely familiar with both the sensations generated by stimulation and performing the task without stimulation. Even though the sensation isn’t ‘natural’—it feels like pressure and gentle tingle—that never bothered me,” said Mr. Copeland. “There wasn’t really any point where I felt like stimulation was something I had to get used to. Doing the task while receiving the stimulation just went together like PB&J.”
After a car crash that left him with limited use of his arms, Mr. Copeland enrolled in a clinical trial testing the sensorimotor microelectrode brain-computer interface (BCI) and was implanted with four microelectrode arrays developed by Blackrock Microsystems (also commonly referred to as Utah arrays).
This paper is a step forward from an earlier study that described, for the first time, how stimulating sensory regions of the brain with tiny electrical pulses can evoke sensations in distinct regions of a person’s hand, even after that person has lost feeling in their limbs because of a spinal cord injury. In this new study, the researchers combined reading information from the brain to control the movement of the robotic arm with writing information back into the brain to provide sensory feedback.
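To make the bidirectional "read and write" idea concrete, the sketch below walks through one pass of such a closed loop in simplified Python: record motor-cortex activity, decode an intended arm movement, then map the robotic hand's contact sensors to stimulation delivered to the somatosensory cortex. All names and interfaces here (decode_intent, encode_touch, the stand-in hardware functions, the 96-channel decoder) are illustrative assumptions for explanation only, not the study's actual software.

```python
import numpy as np

# Illustrative closed-loop cycle for a bidirectional BCI (assumed interfaces,
# not the study's code). Read neural activity -> decode intent -> move arm ->
# sense contact -> stimulate somatosensory cortex.

def decode_intent(firing_rates, decoder_weights):
    """Decode an intended end-effector velocity from motor-cortex firing rates
    using a simple linear decoder (illustration only)."""
    return decoder_weights @ firing_rates

def encode_touch(contact_forces, max_amplitude_ua=60.0):
    """Map fingertip contact forces (0..1) to stimulation amplitudes in
    microamps for somatosensory-cortex electrodes (assumed scaling)."""
    return np.clip(contact_forces, 0.0, 1.0) * max_amplitude_ua

def control_loop_step(record_motor_cortex, move_arm, read_contact_sensors,
                      stimulate_somatosensory_array, decoder_weights):
    """One pass of the read (decode) / write (stimulate) loop."""
    firing_rates = record_motor_cortex()        # "read" from the brain
    velocity = decode_intent(firing_rates, decoder_weights)
    move_arm(velocity)                          # act on the decoded intent
    contact_forces = read_contact_sensors()     # sense grasp contact
    amplitudes = encode_touch(contact_forces)
    stimulate_somatosensory_array(amplitudes)   # "write" back to the brain

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    weights = rng.normal(size=(3, 96)) * 0.01   # 3D velocity from 96 channels

    control_loop_step(
        record_motor_cortex=lambda: rng.poisson(10, size=96).astype(float),
        move_arm=lambda velocity: None,                           # stand-in robot
        read_contact_sensors=lambda: rng.uniform(0, 1, size=5),   # 5 fingertips
        stimulate_somatosensory_array=lambda amps: print("stim (µA):",
                                                         np.round(amps, 1)),
        decoder_weights=weights,
    )
```

In the actual system this loop would run continuously at the interface's update rate; the sketch shows only a single iteration to highlight where information flows out of and back into the brain.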
In a series of tests in which the BCI operator was asked to pick up and transfer various objects from a table to a raised platform, tactile feedback delivered through electrical stimulation allowed the participant to complete the tasks twice as fast as in trials without stimulation.
In the new paper, the researchers wanted to test the effect of sensory feedback in conditions that would resemble the real world as closely as possible.
“We didn’t want to constrain the task by removing the visual component of perception,” said co-senior author Robert Gaunt, PhD, associate professor in the Pitt Department of Physical Medicine and Rehabilitation. “When even limited and imperfect sensation is restored, the person’s performance improved in a pretty significant way. We still have a long way to go in terms of making the sensations more realistic and bringing this technology to people’s homes, but the closer we can get to recreating the normal inputs to the brain, the better off we will be.”
Read more…
UPMC/University of Pittsburgh Schools of the Health Sciences News Release
University of Pittsburgh Swanson School of Engineering News Release
Abstract (A brain-computer interface that evokes tactile sensations improves robotic arm control. Sharlene N. Flesher, John E. Downey, Jeffrey M. Weiss, Christopher L. Hughes, Angelica J. Herrera, Elizabeth C. Tyler-Kabara, Michael L. Boninger, Jennifer L. Collinger, Robert A. Gaunt. Science, 21 May 2021: Vol. 372, Issue 6544, pp. 831-836.)