Engineering researchers from North Carolina State University report that they have developed new technology designed to decode neuromuscular signals and control powered, prosthetic wrists and hands.
In developing the technology, the researchers from the joint biomedical engineering program at North Carolina State University and the University of North Carolina at Chapel Hill relied on computer models that closely mimic the behavior of the natural structures in the forearm, wrist, and hand.
Current state-of-the-art prosthetics rely on machine learning to create a “pattern recognition” approach to prosthesis control. This approach requires users to “teach” the device to recognize specific patterns of muscle activity and translate them into commands—such as opening or closing a prosthetic hand.
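To make the pattern-recognition idea concrete, here is a minimal, purely illustrative sketch of how such a controller might map windows of multi-channel EMG data to discrete commands. Nothing here comes from the study itself: the feature choice, classifier, channel count, and gesture labels are all assumptions for illustration.

```python
import numpy as np

def mav_features(window):
    """Mean absolute value per EMG channel (a common, simple EMG feature)."""
    return np.abs(window).mean(axis=0)

class NearestCentroidEMG:
    """Toy nearest-centroid classifier standing in for the
    pattern-recognition stage described above (illustrative only)."""

    def fit(self, windows, labels):
        feats = np.array([mav_features(w) for w in windows])
        labels = np.array(labels)
        self.classes_ = np.unique(labels)
        # One centroid per gesture class, learned from labeled examples.
        self.centroids_ = np.array(
            [feats[labels == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, window):
        f = mav_features(window)
        d = np.linalg.norm(self.centroids_ - f, axis=1)
        return self.classes_[np.argmin(d)]

# Simulated 2-channel EMG: "open" gestures show more channel-0 activity,
# "close" gestures more channel-1 activity.
rng = np.random.default_rng(0)
open_w = [rng.normal(0, [1.0, 0.2], size=(200, 2)) for _ in range(10)]
close_w = [rng.normal(0, [0.2, 1.0], size=(200, 2)) for _ in range(10)]

clf = NearestCentroidEMG().fit(open_w + close_w,
                               ["open"] * 10 + ["close"] * 10)
print(clf.predict(rng.normal(0, [1.0, 0.2], size=(200, 2))))
```

The point of the sketch is the workflow, not the classifier: the user must first supply labeled examples of each gesture (the `fit` step), which is the "teaching" process the article describes as tedious and time-consuming.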
“Pattern recognition control requires patients to go through a lengthy process of training their prosthesis,” says He (Helen) Huang, a professor in the joint biomedical engineering program at North Carolina State University and the University of North Carolina at Chapel Hill. “This process can be both tedious and time-consuming,” she adds in a media release from North Carolina State University.
“We wanted to focus on what we already know about the human body,” adds Huang, senior author of a study about the technology, published recently in IEEE Transactions on Neural Systems and Rehabilitation Engineering. “This is not only more intuitive for users, it is also more reliable and practical.”
Instead, the researchers developed a user-generic musculoskeletal model. They placed electromyography sensors on the forearms of six able-bodied volunteers, tracking exactly which neuromuscular signals were sent when the volunteers performed various actions with their wrists and hands. This data was then used to create the generic model, which translates those neuromuscular signals into commands that manipulate a powered prosthesis, the release explains.
“When someone loses a hand, their brain is networked as if the hand is still there,” Huang says. “So, if someone wants to pick up a glass of water, the brain still sends those signals to the forearm. We use sensors to pick up those signals and then convey that data to a computer, where it is fed into a virtual musculoskeletal model.
“The model takes the place of the muscles, joints and bones, calculating the movements that would take place if the hand and wrist were still whole. It then conveys that data to the prosthetic wrist and hand, which perform the relevant movements in a coordinated way and in real time—more closely resembling fluid, natural motion.”
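The pipeline Huang describes (EMG signals in, coordinated joint motion out) can be sketched in simplified form. This is a guess at the general idea, not the study's actual model: it drives a single one-degree-of-freedom wrist joint with an antagonist flexor/extensor pair, and all constants are made up for illustration.

```python
import numpy as np

def wrist_angle_trajectory(flexor_act, extensor_act,
                           gain=5.0, damping=1.0,
                           inertia=0.05, dt=0.01):
    """Integrate a toy 1-DOF wrist driven by flexor/extensor activations.

    flexor_act, extensor_act: sequences of muscle activations in [0, 1],
    one sample per control tick. Returns the joint-angle trajectory.
    All parameters are illustrative, not from the study.
    """
    angle, velocity = 0.0, 0.0
    trajectory = []
    for f, e in zip(flexor_act, extensor_act):
        # Net torque from the antagonist pair, minus viscous damping.
        torque = gain * (f - e) - damping * velocity
        velocity += (torque / inertia) * dt
        angle += velocity * dt
        trajectory.append(angle)
    return np.array(trajectory)

# Constant flexor drive with a relaxed extensor flexes the wrist:
# the angle grows smoothly over time rather than jumping between
# discrete commands, loosely mirroring the "fluid, natural motion"
# the article describes.
flex = np.full(200, 0.5)
traj = wrist_angle_trajectory(flex, np.zeros(200))
```

The contrast with the pattern-recognition approach is that no per-user gesture training is needed here: the same physics-style mapping from activation to motion applies to any user, which is presumably what makes a user-generic model possible.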
The technology’s potential applications are not limited to prosthetic devices, per the release.
“This could be used to develop computer-interface devices for able-bodied people as well, such as devices for gameplay or for manipulating objects in CAD programs,” Huang says.
In preliminary testing, both able-bodied and amputee volunteers were able to use the model-controlled interface to perform all of the required hand and wrist motions—despite having very little training.
“We’re currently seeking volunteers who have transradial amputations to help us with further testing of the model to perform activities of daily living,” Huang says. “We want to get additional feedback from users before moving ahead with clinical trials.”
[Source(s): North Carolina State University, Science Daily]