Overview

Dr Rana el Kaliouby - Massachusetts Institute of Technology
Professor Peter Robinson - University of Cambridge

Can you read minds? The answer is most likely yes. You may not consider it mind reading, but our ability to understand what people are thinking and feeling from their facial expressions and gestures is just that. 'People express their mental states all the time through facial expressions, vocal nuances and gestures', says Peter Robinson of the Computer Laboratory at the University of Cambridge. 'We have built this ability into computers to make them emotionally aware.'

The ability to attribute mental states to others from their behaviour, and then to use that information to guide our own actions or predict those of others, is known as 'theory of mind'. Although research on this theory has been around since the 1970s, it has recently gained attention due to the growing number of people with autism spectrum conditions, who are thought to be 'mind-blind'. That is, they have difficulty interpreting others' emotions and feelings from facial expressions and other non-verbal cues.

Peter and his colleague Rana el Kaliouby based their computer program on the latest research in the theory of mind by Simon Baron-Cohen, Director of the Autism Research Centre, also at the University of Cambridge. 'Simon's research provided us with a taxonomy of facial expressions and the emotions they represent', explains Peter. In 2004, Simon published the Mind Reading DVD, an interactive computer-based guide to reading emotions from the face and voice. The DVD contains videos of people showing as many as 412 different mental states. Peter and Rana developed computer programs that can read facial expressions using machine vision, and then infer emotions using probabilistic machine learning trained on examples from the DVD.

Machine vision is about getting machines to 'see': giving them the ability to extract, analyse and make sense of information from images or video, in this case footage of facial expressions. Probabilistic machine learning enables a machine to learn, from training examples, an association between features of an image, such as a facial expression, and other classes of information, in this case emotions. The most likely interpretation of the facial expressions is then computed using probability theory.
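To make that final inference step concrete, here is a minimal sketch of it as a naive Bayes classifier in Python. Every feature name, emotion label and probability below is invented purely for illustration; the actual system works from a far richer taxonomy of expressions and a more sophisticated probabilistic model.

```python
# Hypothetical priors P(emotion), as might be estimated from labelled training examples
priors = {"happy": 0.4, "confused": 0.3, "bored": 0.3}

# Hypothetical likelihoods P(feature | emotion) for a few facial features
likelihoods = {
    "happy":    {"smile": 0.8, "raised_brows": 0.3, "head_tilt": 0.2},
    "confused": {"smile": 0.1, "raised_brows": 0.7, "head_tilt": 0.6},
    "bored":    {"smile": 0.1, "raised_brows": 0.1, "head_tilt": 0.4},
}

def most_likely_emotion(observed_features):
    """Score each emotion by prior * product of feature likelihoods (naive Bayes),
    then normalise the scores into a posterior distribution."""
    scores = {}
    for emotion, prior in priors.items():
        score = prior
        for feature in observed_features:
            # Small floor for features absent from the training data
            score *= likelihoods[emotion].get(feature, 0.01)
        scores[emotion] = score
    total = sum(scores.values())
    posterior = {e: s / total for e, s in scores.items()}
    return max(posterior, key=posterior.get), posterior

emotion, posterior = most_likely_emotion(["raised_brows", "head_tilt"])
print(emotion, posterior)  # 'confused' wins under these invented numbers
```

Given raised brows and a head tilt, the model multiplies each emotion's prior by the likelihood of those features and picks the emotion with the highest resulting probability: a miniature version of computing 'the most likely interpretation' using probability theory.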

'Machine versus people testing of this system has shown the computer to be as accurate as the top 6% of people. But would we want computers that can react to our emotions? Such systems do raise ethical issues', says Peter. 'Imagine a computer that could pick the right "emotional" moment to try to sell you something.' There are, however, applications with clear benefits, including an emotional hearing aid to assist people with autism, usability testing for software, feedback for on-line teaching, and informing the animation of cartoon figures.

'We have been working since 2004 on a wearable system that helps people with autism spectrum conditions and Asperger syndrome with emotional-social understanding and mind-reading functions', says Peter. Rana is currently implementing the first prototype of the system at the Massachusetts Institute of Technology's Media Lab.