Dr Sophie Scott, Professor Stuart Rosen, Dr Andrew Faulkner and Yi Yui Meng.
University College London.
Professor Richard Wise, Dr Jane Warren and Galina Spitsyna.
Hammersmith Hospital.
Charvy Narain.
John Radcliffe Hospital.
We live in a world of constant chatter. Speech, in the form of conversations, phone calls, songs, radio and TV, assails our ears, yet we somehow manage to make sense of it all without even thinking about it. How do our brains pick out words and meaning from this confusing cacophony of sound so effortlessly?
Researchers from University College London, the John Radcliffe Hospital in Oxford and Hammersmith Hospital in London have teamed up to find out how the brain performs this feat. As well as providing fascinating insights into how our brains work, the research could help us understand what happens when people have to re-learn their speech comprehension skills - for example, following a stroke or after being fitted with a kind of hearing aid called a cochlear implant.
'Speech is a really complex sound,' explains psychologist Dr Sophie Scott, who leads the project. As well as understanding words, the brain also uses the way in which words are spoken, such as intonation and melody, to turn spoken language into meaning. This system also has to be robust and flexible enough to deal with variations in speech sounds such as regional accents.
The team is trying to find out how the brain uses all these different components to turn speech sounds into meaning. To do this, they are performing brain scans on volunteers to see which areas of their brains become active when the volunteers hear speech. They have found that different areas of the brain are responsible for interpreting the different components of speech, such as words and intonation. Intriguingly, people who speak different sorts of languages use their brains to decode speech in different ways.
The team has found that a part of the brain called the left temporal lobe (located by the left temple) becomes active when English speakers hear English. The team thinks that this region of the brain links speech sounds together to form individual words. But when Mandarin Chinese speakers hear Mandarin, both the left and right temporal lobes are active.