Researchers from the University of Pittsburgh School of Medicine and UPMC have developed a mind-controlled robot arm that recently allowed a female patient with quadriplegia to maneuver the device to feed herself and perform other complex motions of daily life. In a study published online in The Lancet, the researchers describe the brain-computer interface (BCI) technology and training programs that allowed patient Jan Scheuermann, aged 53 (pictured right), to intentionally move an arm, turn and bend a wrist, and close a hand for what researchers say was the first time in 9 years.

Andrew B. Schwartz, PhD, professor, department of neurobiology, Pitt School of Medicine, says, “This technology, which interprets brain signals to guide a robot arm, has enormous potential that we are continuing to explore. Our study has shown us that it is technically feasible to restore ability; the participants have told us that BCI gives them hope for the future.”

Scheuermann first heard about the Pitt/UPMC BCI research study in a video about patient Tim Hemmes, who sustained a spinal cord injury that left him with quadriplegia and ultimately used a robotic arm to reach out and touch his girlfriend. According to a recent news release, researchers performed screening tests to confirm Scheuermann’s eligibility for the current study. Following these tests, Elizabeth Tyler-Kabara, MD, PhD, assistant professor, department of neurological surgery, Pitt School of Medicine, reportedly positioned two quarter-inch-square electrode grids, each containing 96 contact points, in the regions of Scheuermann’s brain that would normally control right arm and hand movement.

In the release, Jennifer Collinger, PhD, assistant professor, department of physical medicine and rehabilitation (PM&R), research scientist for the VA Pittsburgh Healthcare System, notes that the electrode points are engineered to detect signals from individual neurons, and computer algorithms are used to identify the firing patterns linked to a particular observed or imagined movement, such as raising or lowering the arm or turning the wrist.
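The decoding step Collinger describes can be pictured as a mapping from neural firing rates to an intended movement command. The study does not publish its algorithm here, so the sketch below is only a generic, hypothetical illustration of one simple approach sometimes used in BCI work: fitting a linear decoder by least squares during a calibration session, then using it to turn a new firing-rate vector into a 3-D velocity command. All names, sizes, and simulated data are assumptions for illustration (only the 96 contacts per grid comes from the article).

```python
import numpy as np

# Hypothetical sketch: decode intended 3-D arm velocity from the firing
# rates of 96 recorded neurons (96 matches the contact points per grid
# mentioned in the article; everything else here is simulated).
rng = np.random.default_rng(0)
n_neurons, n_samples = 96, 500

# Each simulated neuron is "tuned" to a preferred 3-D direction.
true_weights = rng.normal(size=(n_neurons, 3))

# Simulated calibration data: firing rates observed while the patient
# imagines movements with known intended velocities.
rates = rng.poisson(5.0, size=(n_samples, n_neurons)).astype(float)
velocity = rates @ true_weights + rng.normal(scale=0.5, size=(n_samples, 3))

# Fit decoder weights by ordinary least squares.
weights, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode a new firing-rate vector into a 3-D velocity command.
new_rates = rng.poisson(5.0, size=n_neurons).astype(float)
decoded_velocity = weights.T @ new_rates
print(decoded_velocity.shape)  # 3-D command: (3,)
```

Real systems refine this considerably (spike sorting, smoothing, closed-loop recalibration as the user practices), but the core idea of learning a firing-pattern-to-movement mapping from calibration data is the same.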

In the 2 days following the procedure, researchers report that they connected the two terminals protruding from Scheuermann’s skull to the computer. Collinger adds that this allowed researchers to see the neurons fire on the computer screen when Scheuermann thought about closing her hand. The study results indicate that within a week, Scheuermann was able to reach in and out, left and right, and up and down with the arm.

After 3 months, Scheuermann was able to guide the arm from a position 4 inches above a table to pick up blocks and tubes of different sizes, a ball, and a stone, and set them on a nearby tray. The training methods and algorithms used in primate models and in working with Scheuermann indicate that it may be possible for individuals with long-term paralysis to recover natural, intuitive command signals to orient a prosthetic hand and arm to allow meaningful interaction with the environment, Schwartz emphasizes.

Photo Credit: UPMC

[Source: UPMC]