Case study: Tom Mitchell
Lecturer in Computer Music and Associate Professor in the Department of Computer Science and Creative Technologies, University of the West of England; Royal Society APEX Award Winner 2018
“I focused on science and technology throughout my education but always had a passion for music. Now I’m able to combine these domains, creating the technology that connects these worlds.”
I’m a computer scientist specialising in human-computer interaction and audio. Humans are multi-sensory beings: we see, hear and feel our way around the world.
My work involves developing ways of interacting with technology that engage a wide range of senses, and in particular hearing. One aspect of this work is music performance using gestures rather than traditional instruments. I created the MI.MU gloves along with Imogen Heap, which enable musicians to make music with hand gestures using motion tracking and AI techniques. Another aspect of my research is ‘sonification’: the use of non-speech audio to enhance visual representations of data. This helps scientists extract information they might otherwise miss with their eyes alone. I am currently helping molecular scientists design drugs by creating immersive sonification algorithms for VR on a project called ISOMORPH.
My science and engineering training has always fed my creative interests. As a teenager I wanted to be a musician, but with an aptitude for technology I spent most of my time programming tools and algorithms to support music making rather than actually making music. I went to a comprehensive school in Essex, and although I loved art I gave it up because the syllabus was so traditional, with no connection to technology. I think learning to program in the context of art would really motivate students to code; I can’t understand why these subjects are always kept separate.
Over the years I’ve worked on lots of cross-disciplinary projects and learned a great deal from artists and musicians who take a practice-based approach to research. It’s a fun challenge getting to grips with other disciplines and working collaboratively to satisfy both creative and scientific aims. For example, the MI.MU project has involved overcoming complications relating to performance, music, textiles, choreography, engineering and software.
In a single meeting we are often discussing interconnected issues relating to e-textiles, machine learning and wireless connectivity. It can be stressful at times, but seeing musicians express their creative ideas so fluidly is enough to keep me motivated.
I studied Electronic Engineering at university, including a placement year at an audio engineering company, which made me realise I wanted to develop my software skills. After graduating, I worked briefly as an embedded software engineer before returning to university for a PhD, in which I developed AI techniques to evolve synthesisers that mimic instrumental sounds. I then took up a Lectureship at the University of the West of England, Bristol, where I’ve been ever since, progressing to Senior Lecturer and now Associate Professor.