The natural world has always been a rich source of inspiration for music. Vivaldi's Four Seasons and Beethoven's Pastoral Symphony immediately come to mind as two examples of classical music inspired by nature. And of course there is Holst's The Planets, perhaps the most celebrated example of music inspired by the Solar System. The ancient Greek philosophical maxim that astronomy is for the eyes what music is for the ears still inspires composers today. Indeed, a plethora of approaches to composing music inspired by natural science has emerged in the century since The Planets was composed, and there are many works inspired by physics in particular.
The emergence of powerful computing technology enabling the manipulation of large volumes of data, combined with the development of sophisticated modelling and simulation technology, has allowed composers to develop approaches to musical composition that are objectively informed by science rather than merely inspired by it. The Computer Music community has been very active in developing algorithms to render data into sound for both scientific and artistic purposes. For instance, in my group at Plymouth University's Interdisciplinary Centre for Computer Music Research (ICCMR), composer Nuria Bonet Filella developed algorithms to synthesize sounds from simulations of dark matter to create the installation Sonification of Dark Matter, premiered at the Peninsula Arts Contemporary Music Festival 2016 in Plymouth, UK. And in 2013 Alexis Kirke stormed the California Academy of Sciences in San Francisco with Cloud Chamber, a violin piece that goes well beyond taking inspiration from particle physics: the violinist interacts with cosmic ray particles on stage in real time. A camera above the cloud chamber records the particle tracks, which are synthesized into sound. The system developed at ICCMR enables a violin to create an electrical field that directly affects the charged particles in the chamber, enabling real-time mutual musical interaction between the violinist and the particles.
But the work that I would like to highlight here is a real-time sonification platform for ATLAS experimental data, developed by the MIT Media Lab in collaboration with CERN. A prototype of this fascinating work was demonstrated at a workshop held in July 2015 at the Montreux Jazz Festival, in Switzerland, and it has been evolving ever since. In a nutshell, they developed a system that sonifies real-time particle collision data from CERN's ATLAS detector.
Sonification is generally defined as the use of non-speech audio to convey information from data: an auditory equivalent of data visualization. Sonification is often used to complement visualization in situations where a multitude of data need to be monitored simultaneously. For instance, the auditory rendering of a heartbeat in an operating theatre relays this information to the surgeons' ears while their eyes concentrate on the job at hand.
ATLAS is one of the Large Hadron Collider (LHC) detectors used to search for new physics such as the Higgs boson or dark matter. Beams of extremely high-energy particles collide and form new particles that are scattered in all directions. The ATLAS detector collects collision data such as the trajectory and energy of the charged collision debris. The sonification system takes a small subset of the collision data and relays them through algorithms designed to render them as sequences of musical notes. What is interesting here is that the rendering is done online; that is, sounds are generated as CERN's experiments take place. ATLAS data are relayed through the Internet and composers have the option to either use the sonification methods designed by the developers or use the data to drive their own custom-built music software.
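To make the idea of parametric rendering concrete, here is a minimal sketch of how a mapping algorithm of the kind described above might turn collision data into note parameters. The field names (energy in GeV, azimuthal angle phi) and the value ranges are my own illustrative assumptions, not the actual ATLAS data format or the developers' mapping, which are certainly richer than this.

```python
def map_range(value, lo, hi, out_lo, out_hi):
    """Linearly rescale value from the interval [lo, hi] to [out_lo, out_hi]."""
    t = (value - lo) / (hi - lo)
    return out_lo + t * (out_hi - out_lo)

def sonify_event(event):
    """Map one hypothetical collision event (a dict of floats) to a note.

    Returns (midi_pitch, duration_seconds).
    """
    # Transverse energy in GeV -> MIDI pitch 36..96 (C2 to C7):
    # the more energetic the debris, the higher the note.
    pitch = round(map_range(event["energy_gev"], 0.0, 500.0, 36, 96))
    # Azimuthal angle in radians -> note duration between 0.1 s and 1.0 s.
    duration = round(map_range(event["phi"], 0.0, 6.2832, 0.1, 1.0), 2)
    return pitch, duration

if __name__ == "__main__":
    # Two made-up events, for illustration only.
    events = [
        {"energy_gev": 125.0, "phi": 3.14},
        {"energy_gev": 450.0, "phi": 0.5},
    ]
    for ev in events:
        print(sonify_event(ev))  # prints (51, 0.55) then (90, 0.17)
```

In a streaming setting, the same function would simply be applied to each event as it arrives over the network; a composer using custom-built software would replace `sonify_event` with their own mapping.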
LHC collisions produce an extremely large amount of data. The design of a sonification system able to convey these data in meaningful ways is not a trivial task. Hence, using a small subset of the ATLAS data to begin with is a sensible strategy to get musicians thinking about how one could listen to the fiendishly complicated phenomena occurring inside the collider. One can also take a more creative approach: the option for users to drive their own custom-built software is a particularly welcome feature of this platform. It allows for a practice that I refer to as 'musification', giving composers the opportunity to fully engage with the data in creative ways. My pet theory is that a musification approach to rendering big data sonically might reveal properties and behaviours that would probably not be revealed by parametric sonification. But of course, one needs to understand the phenomena in question in order to engage with the data meaningfully; otherwise one might as well use a random number generator to produce musical notes.
The developers have recently launched a website (http://quantizer.media.mit.edu) to introduce the project, with a few examples of music generated on the fly. The platform will be more broadly available in the near future, but at present composers interested in having a go should contact the developers to discuss how they might proceed.
I must confess that I found the examples on the project's website a little tedious. But this is just my humble opinion. Make no mistake, this project is a fantastic initiative to increase public understanding and foster engagement with high-energy physics through music. And I feel inspired to use it to compose a tribute to Holst: The Particles. May I have access please?
Reviewed by Eduardo Reck Miranda
Eduardo Reck Miranda is a composer and Professor of Computer Music at Plymouth University, UK. e-mail: firstname.lastname@example.org