Software tweak could give prosthetic users more options to carry out daily activities

02 July 2018

Updating the software in the chips used by modern prosthetic hands could raise the number of actions amputees can make with their artificial hands from two to eight. The technology, which enables people to move prosthetic hands using nerves in their forearms, is being unveiled at Newcastle and Keele Universities’ exhibit at the Royal Society Summer Science Exhibition in London.

“While the holy grail of prosthetic research – to develop a prosthetic hand that can offer control of individual finger movements, as well as sense pressure and temperature and transmit that information back to the brain – is not in the realms of science fiction, it is still decades away,” says Dr Kianoush Nazarpour, Reader in Biomedical Engineering at Newcastle University.

“There are steps that we can take in the shorter term, like updating prosthetic software to help users train their brain and learn new skills with the muscles of their stump. With the right training, this would enable higher levels of function for people who have lost their limbs.”

Human fingers are mostly bone, attached by tendons to muscles in the forearm. When we think about moving our hands, the brain sends electrical signals through the nervous system to contract those forearm muscles. This happens even if the hand itself is no longer there.

Prosthetic hands have evolved quickly in recent years: modern devices have the same number of joints as human hands, with individually moving fingers and thumbs. Yet while they look human, their movements remain limited and robotic. Users can open or close the fingers, bend or rotate the wrist, or move the thumb in and out – but not all at the same time. This makes using the hand unnatural and can result in a substantial loss of independence.

The average prosthetic user has one sensor placed on each of the two main muscles in the forearm. Flexing each muscle instructs the prosthetic to carry out a single action – normally opening or closing the hand. Scientists at Newcastle have now tweaked the prosthetic software so that, by contracting these muscles in different ways, users can trigger up to eight actions. When rolled out, the update could be downloaded over the internet, giving prosthetic users more choice in carrying out daily activities, such as extending the index finger or closing the fist.
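The release does not describe the decoding method, but a common way to get several actions from two sensors is to classify the pattern of activity across both channels, rather than treating each channel as a simple on/off switch. The Python sketch below illustrates that idea; the mean-absolute-value feature, the thresholds and the action names are all assumptions for illustration, not details of the Newcastle software.

```python
import numpy as np

# Hypothetical action labels: the release says eight actions but does not
# list them all, so most of these names are illustrative.
ACTIONS = {
    (0, 0): "rest",        (1, 0): "open_hand",
    (2, 0): "close_hand",  (0, 1): "extend_index",
    (0, 2): "thumb_out",   (1, 1): "thumb_in",
    (2, 1): "wrist_flex",  (1, 2): "wrist_rotate",
}

def mav(window):
    """Mean absolute value per channel, a standard surface-EMG feature."""
    return np.mean(np.abs(window), axis=0)

def quantise(level, lo=0.1, hi=0.4):
    """Map one channel's activity to 0 (relaxed), 1 (light) or 2 (strong).
    Thresholds are made up; in practice they would be calibrated per user."""
    return 0 if level < lo else (1 if level < hi else 2)

def classify(window):
    """Two channels x three contraction levels give nine patterns; eight
    are mapped to actions and the unused pattern falls back to rest."""
    a, b = (quantise(x) for x in mav(window))
    return ACTIONS.get((a, b), "rest")

# Example: 200 ms of synthetic two-channel EMG at 1 kHz, with a strong
# contraction on channel 0 while channel 1 stays at rest.
rng = np.random.default_rng(0)
window = rng.normal(0.0, [0.7, 0.05], size=(200, 2))
print(classify(window))  # -> "close_hand"
```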

Royal Society Summer Science Exhibition visitors will be able to attach the sensors to their forearm, control their very own robotic hand and play rock-paper-scissors. This is the first time the device has been used outside the lab. The hand is part of the Progressive prosthetics exhibit, a joint venture by Newcastle and Keele universities, which are working together to develop a number of technologies to create a limb that more closely mirrors the natural form.

Also as part of the exhibition, researchers at Keele University have developed a new biomechanical model that allows the signals sent from the brain to the nerves in the forearm muscles to be used to control a prosthetic hand. The model interprets these signals and relays the resulting commands to the prosthetic hand in real time.
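No implementation details of the Keele model are given in this release, so the sketch below only illustrates the general shape of such a pipeline: a sliding-window loop that reads muscle signals, runs them through a model, and relays the resulting commands to the hand. The `read_samples` and `send_command` callbacks and the placeholder model are hypothetical, not the Keele team’s code.

```python
from collections import deque

WINDOW = 200  # samples per decoding window (200 ms at 1 kHz)
STEP = 50     # slide the window every 50 samples for low-latency updates

def biomechanical_model(samples):
    """Stand-in for the Keele model, which is not published here:
    maps a window of recorded muscle signals to joint commands."""
    return {"wrist_angle": 0.0, "grip": 0.0}  # neutral placeholder output

def control_loop(read_samples, send_command):
    """Sliding-window decode-and-relay loop. `read_samples` and
    `send_command` are hypothetical I/O callbacks for the EMG amplifier
    and the prosthetic hand; real hardware APIs will differ."""
    buffer = deque(maxlen=WINDOW)
    while True:
        buffer.extend(read_samples(STEP))  # block until new samples arrive
        if len(buffer) == WINDOW:          # wait until the window is full
            send_command(biomechanical_model(list(buffer)))
```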

The next step in the team’s research is to help users regain sensation through their prosthetics, and to help the brain recover proprioception – the sense that tells us what space our limbs occupy.

"There is one crucial aspect missing from current state-of-the-art prosthetic hands: feedback,” says Dr Ed Chadwick, Senior Lecturer in Biomedical Engineering at Keele University. “When we use our natural hands, we can feel what they are doing, we know what we are touching, how hard we are squeezing, and what position are hands are in - this last aspect is known as proprioception. Current prosthetic devices don’t feature this information, and this means prosthesis users have to pay far more attention to their movements than people using their natural hands.”

“We have created a computer model that addresses the way prosthetics could feed sensations such as muscle length and tendon force back to the brain. Ultimately, we think this will lead to better performing devices for prosthesis users. With all the effort being put into this field around the world, continuous, natural control with integrated feedback is achievable and is not confined to the realms of science fiction.”

“We hope that one day this means prosthetic users will be able to reach out and pick up a glass while maintaining eye contact with a friend.”
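As a purely illustrative aside to Dr Chadwick’s comments, one of the proprioceptive quantities he mentions – tendon force – can be sketched with a toy spring model. Real biomechanical models use nonlinear muscle–tendon curves; none of the parameters below are taken from the Keele model.

```python
def tendon_force(tendon_length_m, slack_length_m=0.20, stiffness_n_per_m=2000.0):
    """Toy tendon model: force rises linearly once the tendon stretches
    beyond its slack length, and is zero otherwise. Hill-type models use
    a nonlinear force-strain curve; every parameter here is illustrative."""
    stretch = tendon_length_m - slack_length_m
    return stiffness_n_per_m * stretch if stretch > 0 else 0.0

print(tendon_force(0.21))  # about 20 N of feedback signal for 1 cm of stretch
```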