
New mic to make robots better listeners

30 June 2015

Scientists have worked out a way to process sound from a microphone to give super-human hearing that can zoom in on conversations in busy rooms. The microphone and sound processors, developed to help improve interactions with artificial intelligence in noisy spaces, will be demonstrated by Imperial College scientists at the Royal Society’s Summer Science Exhibition.

The technology could one day be used in robotic assistants in hospitals and other busy environments to listen to patients and take instructions just like a human would, says Dr Patrick Naylor, lead researcher of the project at Imperial College London. Just as a camera can visually zoom in, the microphone system can audio-zoom with super-human hearing to listen to conversations in noisy spaces as if they were much closer, filtering out unwanted noise.

The zoom-in system combines 32 microphones arranged around a sphere, letting it listen in on sounds coming from any chosen direction. The zoom mic measures the tiny differences in how long sound takes to reach each of the microphones around the device and uses an algorithm to compare them. These differences let the mic work out which sound is coming from where, so that it can then tune in to sounds coming from one particular spot.
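The idea of aligning tiny arrival-time differences and summing the channels is the classic delay-and-sum beamformer. The sketch below is a minimal illustration of that principle, not the team's actual algorithm: it simulates a small line array (four mics rather than 32, with made-up integer-sample delays), then steers the array toward the source by undoing each channel's delay so the wanted signal adds up coherently while noise averages away.

```python
import numpy as np

def delay_and_sum(signals, delays):
    """Align each channel by its (integer-sample) arrival delay, then average.
    A signal from the look direction adds coherently; uncorrelated noise
    and sound from other directions partially cancel."""
    out = np.zeros(signals.shape[1])
    for sig, d in zip(signals, delays):
        out += np.roll(sig, -d)  # undo that channel's propagation delay
    return out / len(signals)

# Toy scene: the same tone reaches each of 4 mics a few samples later
# than the previous one (hypothetical delays, for illustration only),
# plus independent noise at each mic.
rng = np.random.default_rng(0)
tone = np.sin(2 * np.pi * 0.05 * np.arange(512))
true_delays = [0, 3, 6, 9]
mics = np.array([np.roll(tone, d) + 0.3 * rng.standard_normal(512)
                 for d in true_delays])

steered = delay_and_sum(mics, true_delays)    # beam aimed at the source
unsteered = delay_and_sum(mics, [0, 0, 0, 0])  # no steering: channels clash

# The steered beam recovers the tone far more faithfully.
corr_steered = np.corrcoef(steered, tone)[0, 1]
corr_unsteered = np.corrcoef(unsteered, tone)[0, 1]
```

In a real array the delays are not guessed: they are estimated from the cross-correlation between channels (the "compare them" step above), and steering to a new direction is just a new set of delays, which is what makes the audio zoom re-aimable in software.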

‘Being able to pick out particular conversations or voices in a crowd is a real challenge for everyday devices like phones and hearing aids,’ explains Dr Naylor.

‘Humans have an extraordinary ability to tune in to particular sounds, picking voices out of a noisy environment. Complex processing in the brain learns to ignore unwanted sounds like traffic or other chatter in order to pick out the important sounds we want to hear. Artificial intelligence isn’t as smart as that. At the moment robots, phones and other devices using speech recognition don’t work well when you’re not close to the microphone or in a quiet space because there’s just too much noise. Until now microphones haven’t been able to separate one sound or conversation from another in 3D.’

Audio-zoom microphones could change the way we interact with artificial intelligence in real life, making robots and other AI better listeners. As well as powering robotic assistants in noisy spaces, this audio-zoom tech could be used in mobile phones to track your voice around the room when on speaker, and to improve hearing aids so they can focus on conversations in loud environments.

‘Being able to listen selectively, focusing on one person, is vital for human communication. Until artificial intelligence can listen to different parts of the soundscape going on around it and pick out important conversations, AI will never properly be able to interact and converse with humans in noisy real-world situations,’ says Dr Naylor. ‘It’s a big challenge to be able to take away all the extra noises completely, but this is a first step towards that.’

Sounds constantly bounce off surfaces, reverberating around a space. This reverberation is one of the biggest challenges facing scientists who are trying to design interfaces that can listen to particular conversations. The microphone is part of a larger research field hoping to ‘dereverberate’ sound to help robots, mobile phones and other artificial intelligence to not just hear what’s going on around them but to actually listen to a particular person.
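Reverberation can be pictured as the room convolving the dry sound with its impulse response: a direct-path spike followed by a decaying tail of reflections. The toy example below (a synthetic impulse response, not a measured room) shows how that convolution smears a short burst out in time, which is exactly the effect dereverberation research tries to undo.

```python
import numpy as np

rng = np.random.default_rng(1)

# A short dry "utterance": a 100-sample burst followed by silence.
dry = np.zeros(1000)
dry[:100] = rng.standard_normal(100)

# A toy room impulse response (assumed shape, for illustration):
# a direct-path spike plus an exponentially decaying cloud of echoes.
rir = np.zeros(400)
rir[0] = 1.0
rir[1:] = 0.5 * rng.standard_normal(399) * np.exp(-np.arange(1, 400) / 80)

# Reverberation = convolution of the dry sound with the room response.
wet = np.convolve(dry, rir)

# After the dry burst ends, the reverberant signal keeps ringing;
# that lingering tail is what masks other talkers and confuses
# speech recognisers.
tail_energy_dry = float(np.sum(dry[100:300] ** 2))
tail_energy_wet = float(np.sum(wet[100:300] ** 2))
```

Because every surface and position changes the impulse response, "dereverberating" means estimating and inverting an unknown, constantly varying filter, which is why it remains such a hard problem for listening machines.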

Visitors can try out the zoom mic, zooming in to sounds around the room, at the Royal Society’s Summer Science Exhibition, where Dr Patrick Naylor and colleagues are on hand to explain the science behind the mic. Visitors can also meet a robot and give it spoken instructions. The team are also demonstrating how sound reverberates in different spaces, illustrating the challenge of separating noisy echoes from their source.