Faces are often considered the gateway to human emotion and identity. Our facial expressions allow us to convey social information to other humans and act as cues to the emotions we experience. Understanding and reacting appropriately to facial expressions and the emotions they communicate is an integral part of human culture.
Scientists believe that certain regions of the human brain are specialised for processing faces. If we can understand the processes our brains use to recognise faces, in particular moving faces and emotional expressions, we can start to build computer systems that mimic this useful ability.
Professor Peter McOwan, from Queen Mary, University of London, says:
“Robots are going to increasingly form part of our daily lives – for instance robotic aids used in hospitals or much later down the road sophisticated machines that we will have working in our homes. Our research aims to develop software, based on biology, that will allow robots to interact with humans in the most natural way possible - understanding the things we take for granted like personal space or reacting to an overt emotion such as happiness.”
The research takes a highly interdisciplinary approach, drawing on the expertise of both psychologists and computer scientists. Visitors to the exhibition will be able to take part in an experiment to try to recognise people’s faces when the gender is switched; discover what they look like when their facial movements are ‘gender swapped’ onto another face; and explore how robots with faces communicate with us socially.