Q&A: The sound catcher


Tom Mitchell uses engineering and computing to enable people to play and sample live music using gestures. With the latest version of his co-creation 'The Gloves' about to debut at TEDGlobal 2012 in Edinburgh, UK, he talks about adaptive musical interaction.

TEDGlobal 2012

Edinburgh International Conference Centre, UK.
25–29 June 2012.

How do 'The Gloves' work?

The team combined data gloves to measure finger flexion, inertial devices to measure orientation and optical sensors to measure position. Gestures enable a range of tasks. You can 'catch' your voice by singing into your hand, where a microphone records and loops phrases. Opening the hand allows a synthesizer to be played; percussive gestures trigger drums. Reaching for the sky adds reverb; the heavy-metal sign adds distortion.
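The mapping described above — each recognized gesture triggering a musical action — can be sketched as a simple dispatch table. The gesture names and handlers below are illustrative placeholders, not the actual software behind 'The Gloves':

```python
# A minimal sketch of gesture-to-action dispatch; all names here are
# hypothetical stand-ins for the real recognition and audio layers.
def start_loop_recording():
    return "recording vocal loop"

def play_synth():
    return "synth enabled"

def trigger_drum():
    return "drum hit"

def add_reverb():
    return "reverb on"

def add_distortion():
    return "distortion on"

GESTURE_ACTIONS = {
    "hand_to_mouth": start_loop_recording,  # 'catch' the voice
    "open_hand": play_synth,
    "percussive_strike": trigger_drum,
    "reach_up": add_reverb,                 # reaching for the sky
    "metal_sign": add_distortion,           # heavy-metal sign
}

def handle(gesture):
    """Look up a recognized gesture and run its audio action."""
    action = GESTURE_ACTIONS.get(gesture)
    return action() if action else "unmapped gesture"
```

In a real system the keys would come from a classifier running on the glove, inertial and optical sensor streams, but the dispatch structure would look much the same.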

How did you start to work with Imogen Heap?

I met Imogen at a wedding, and started helping her to use technology in performances. She would manipulate audio using keyboards, drum pads and controllers. Then she met Elena Jessop at the Massachusetts Institute of Technology in Cambridge, who had invented a glove to freeze the timbre of a singing voice. Imogen wanted the audience to relate her actions to the production of music, and asked for a live music system reliant on gestures. The result, she says, lets her perform more fluidly; it is also faster and more intuitive than using a keyboard or computer to compose.


Imogen Heap produces music with gestures, thanks to Tom Mitchell's glove technology.

What is your 'evolutionary' approach to audio synthesis?

I wanted to build an algorithm to reproduce a target sound, such as a cello, voice or sound effect. I used an optimization method modelled on Darwinian natural selection. My program synthesizes sounds, selects those that most closely match the target, then recombines and mutates their 'genes' to engender the next generation, advancing towards an optimal solution.
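The select–recombine–mutate loop described here is a standard evolutionary algorithm. A toy sketch, under the simplifying assumption that each candidate is a small vector of synthesis parameters and fitness is just distance to a numeric target (in the real system, fitness would be a spectral comparison between a synthesized sound and the target recording):

```python
import random

def evolve(target, pop_size=40, generations=200, mutation=0.1):
    """Evolve parameter vectors toward a target via selection,
    recombination and mutation -- a toy stand-in for matching
    synthesizer parameters to a target sound."""
    n = len(target)
    # Random initial population of parameter vectors in [0, 1].
    pop = [[random.uniform(0, 1) for _ in range(n)] for _ in range(pop_size)]

    def fitness(ind):
        # Higher is better: negative squared distance to the target.
        return -sum((a - b) ** 2 for a, b in zip(ind, target))

    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 4]          # selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)        # one-point recombination
            child = a[:cut] + b[cut:]
            for i in range(n):                  # per-gene mutation
                if random.random() < mutation:
                    child[i] += random.gauss(0, 0.05)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve([0.2, 0.8, 0.5])
```

Because the fittest parents survive unchanged each generation, the best solution never regresses, and the population steadily closes in on the target.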

Does that come into your gestural work?

I have plans for experiments that will allow users to describe a timbre using gestures, which can be matched using evolutionary synthesis, allowing 'sound sculpting'.

How did you become interested in electronics and music?

I rejected classical music training but was drawn to machines. In the 1980s, I wrote accompaniments for my sister, a pianist, using a Spectrum computer. I borrowed equipment from electronic-music artist Tom Jenkinson (later recording as Squarepusher), who exploited his equipment's idiosyncrasies by using a hacked cassette machine to simulate a multitrack recorder, or a mixing console to create feedback loops. I built Musical Instrument Digital Interface (MIDI) programs to control synthesizers. Pursuing electronic engineering at university, I studied audio sampling and synthesis, and completed a doctorate combining music and computer science.

What other controllers would you like to see?

We've come a long way since the theremin, the first electronic gestural music controller. Examples from recent years include the Eigenharp and the Karlax, whose sensor-driven interfaces resemble traditional instruments. I would like to see a move away from garments and sensors that constrain movement; camera-based devices have pushed things forward. I can't wait to get my hands on the Leap by Leap Motion: a device that allows you to control a computer with fine hand and finger movements. It would be nice for people to be able to program their own systems using any gestures, and I look forward to interfaces as adaptive and versatile as touch screens.

Could this technology be put to other uses?


Tom Mitchell

Yes. The gestural-interface system that we are working on could have applications in animation, gaming and robotics. At the Bristol Robotics Laboratory, based at the University of the West of England, where I also work, a similar arrangement is used to design systems that let a robot understand human gestures meaning 'pass me that spanner' and so on.

What is next for you?

I have been helping to expand the capabilities of 'The Gloves' to allow Imogen to write and perform a song from scratch. The system made its debut this year with the release of the song MeTheMachine, part of Imogen's Heapsongs series. I am working on machine-learning techniques for synthesis matching and gesture identification, and looking at developing the gloves commercially. I have also started working on the sonification of chemical reactions and quantum-mechanics simulations, and am collaborating with x-io Technologies in Bristol to build a commercial interface for multimedia interaction.

Is there a future for old-fashioned musical instruments?

Certainly. They are being augmented with sensors that exploit the musicians' 'spare bandwidth' — communicative gestures that don't produce sound. Dan Overholt's Overtone Violin uses sonar and glove-mounted accelerometers, among other sensors. To paraphrase David Wessel, a professor of music at the University of California, Berkeley, instruments should reward quickly, but allow scope for virtuosity.
