Hands-on at the exhibit
- Control the movements of a mechanical hand
- Use virtual reality to see inside your arm as you move your own hand
- Play rock-paper-scissors with a robotic arm driven by the electrical activity of your own muscles
Find out more
The quest to develop a prosthetic hand that looks, moves and feels like the real thing.
Modern prosthetic hands look human, but their movements are limited and robotic. Researchers are striving to transform prosthetics by recreating the natural brain-to-hand control signals. At this exhibit you’ll discover the complexity of movements and signals involved in hand control, and how these may be applied in prosthetics.
Current prosthetic hands have the same number of joints as human hands and could, in principle, perform equally complex movements. Achieving this, however, may not be possible until we understand the complete set of signals that a human hand normally receives. Researchers at Keele and Newcastle Universities are developing ways to convert the user’s desired action into movement of the prosthetic hand, and to generate sensory feedback from the prosthesis to the brain. One approach uses machine learning to combine muscle and motion signals from the residual limb. Another feeds electrical signals from residual muscles through a computer model to interpret the desired action. Their goal is that one day, prosthetic hands will move and feel like the real thing.
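To give a flavour of the machine-learning approach, here is a minimal, purely illustrative sketch of gesture recognition from muscle (EMG) signals. The feature choice (mean absolute value per channel) and the nearest-centroid classifier are simplifying assumptions for this example, not the method used by the Keele and Newcastle researchers; the synthetic signal windows and gesture names are made up.

```python
# Toy pattern-recognition pipeline: map windows of multi-channel EMG samples
# to an intended gesture. Illustrative assumptions only, not the researchers'
# actual system.
import math

def mav_features(window):
    """Mean absolute value per channel: a common, simple EMG feature."""
    return [sum(abs(s) for s in channel) / len(channel) for channel in window]

def train_centroids(labelled_windows):
    """Average the feature vectors of each gesture's training windows."""
    sums, counts = {}, {}
    for label, window in labelled_windows:
        feats = mav_features(window)
        acc = sums.setdefault(label, [0.0] * len(feats))
        for i, f in enumerate(feats):
            acc[i] += f
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def classify(window, centroids):
    """Assign the gesture whose centroid is nearest in feature space."""
    feats = mav_features(window)
    return min(centroids, key=lambda label: math.dist(feats, centroids[label]))

# Two hypothetical electrode channels; 'fist' activates channel 0 strongly,
# 'point' activates channel 1, 'rest' is low everywhere.
training = [
    ("rest",  [[0.1, -0.1, 0.1], [0.1, -0.1, 0.1]]),
    ("fist",  [[2.0, -2.0, 2.0], [0.2, -0.2, 0.2]]),
    ("point", [[0.2, -0.2, 0.2], [2.0, -2.0, 2.0]]),
]
centroids = train_centroids(training)
print(classify([[1.8, -1.9, 2.1], [0.1, -0.1, 0.2]], centroids))  # fist
```

A real system would use many more channels, richer features, and a trained model, but the shape of the problem is the same: turn raw muscle activity into a prediction of the intended hand movement.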
Find out more about the development of this exhibit, SenseBack, healthcare technologies research at Keele University and Newcastle University’s intelligent sensing laboratory.
Presented by: Keele University, Newcastle University, the New Vic Theatre and Wavemaker.