Newswise — Reaching for something on the top shelf in the grocery store or brushing one's teeth before bed are tasks many people can do without thinking. But for an upper limb amputee using a prosthetic device, these same tasks can require considerably more mental effort.

Dr. Maryam Zahabi, assistant professor in the Department of Industrial and Systems Engineering at Texas A&M University, and her team are studying machine learning algorithms and computational models that provide insight into the mental demand placed on individuals using prosthetics. The goal is to use these models to improve the interfaces in these prosthetic devices.

The researchers are studying prosthetics that use an electromyography-based human-machine interface. Electromyography (EMG) is a technique that records the electrical activity in muscles. This electrical activity produces signals that the interface picks up and translates into a distinct pattern of commands. These commands allow the user to move their prosthetic device.
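To make the signal-to-command translation concrete, the sketch below shows one common approach from the EMG pattern-recognition literature: window the raw signal, extract simple time-domain features (mean absolute value and root mean square), and match the resulting feature pattern to the nearest stored command template. This is an illustrative toy example only; the feature names are standard, but the template values, command names, and nearest-centroid matching are assumptions, not the researchers' actual pipeline.

```python
import math

def mean_absolute_value(window):
    """Mean absolute value (MAV), a common time-domain EMG feature."""
    return sum(abs(s) for s in window) / len(window)

def root_mean_square(window):
    """Root mean square (RMS), another common time-domain EMG feature."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def extract_features(window):
    """Turn one window of raw EMG samples into a feature pattern."""
    return (mean_absolute_value(window), root_mean_square(window))

def classify(features, templates):
    """Nearest-centroid match: pick the command whose stored feature
    template is closest (Euclidean distance) to the observed pattern."""
    return min(templates, key=lambda cmd: math.dist(features, templates[cmd]))

# Toy command templates, as if learned during a calibration session
# (values are illustrative, not real calibration data).
templates = {
    "open_hand":  (0.2, 0.25),   # weak muscle activation
    "close_hand": (0.8, 0.9),    # strong muscle activation
}

# A simulated burst of strong muscle activity.
window = [0.7, -0.9, 0.85, -0.75, 0.8]
command = classify(extract_features(window), templates)
print(command)  # a strong burst matches the "close_hand" template
```

In a real interface this loop would run continuously over streaming EMG data, and the classifier would typically be a trained machine learning model rather than a hand-built template match, which is part of what makes the cognitive-load question interesting.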

Unfortunately, using such prosthetics can be mentally draining for upper limb amputees – even for accomplishing simple, daily tasks like operating a toothbrush.

“There are over 100,000 people with upper limb amputations in the United States,” Zahabi said. “Currently there is very little guidance on which features in EMG-based human-machine interfaces are helpful in reducing the cognitive load of patients while performing different tasks.”

Testing different interface prototypes through virtual reality and driving simulations will allow the researchers to provide guidance to the engineers who design these interfaces. This will lead to better prosthetics for amputees, as well as other technological advances that use EMG-based assistive human-machine interfaces.

This research is a collaboration between Texas A&M, North Carolina State University and the University of Florida, and is supported by the National Science Foundation.