Sonification - Finding Music in Science

June 21, 2015

What is the Music of the Spheres? Most of us probably remember Pythagoras from geometry class and a rule about right triangles. But there was much more to him than that. For him, the principles of triangles were just a single note in the grand musical work that was the cosmos, in which ratios, harmony, and the heavenly bodies were united voices of the cosmic song.
Contemporary cosmologists seem to agree. Barnard College cosmologist Janna Levin describes gravitational waves from large-scale cosmic events as "playing space like a drum." Birmingham astrophysicist Yvonne Elsworth likens the behavior of distant stars to a musical instrument whose resonances give evidence of exoplanets. Nobel laureate George Smoot compares the cosmic microwave background radiation to the vibrations of a percussion instrument, which begin as a chaotic transient and over time settle into a more regular pattern of vibration. Even without a formal, definitive study of the matter, an interested science reader might easily get the impression that it is not too far afield to consider the universe as operating something like a musical instrument.
Every grade school student now studying under the Common Core [1, 2] knows what visualization and infographics are: large sets of data are presented as graphs and charts to make vast quantities of numbers easier to understand. As the theory of multiple intelligences posits, people learn in various modes [3]. So why not auralize information, i.e. do the same thing, but for the ear?
Psychologists such as Albert Bregman and Diana Deutsch describe the degree to which the auditory system informs us about our environment, in some cases better than the eyes can [4, 5]. Since the 1980s, government agencies such as NASA and the Navy have researched ways of leveraging the hearing capacities of pilots to free up their eyes for other tasks. 
Auditory display is the study of how auditory cues can effectively convey information [6]. The type that interests us here is an intersection of music and informatics called sonification: mapping data to sound characteristics such as pitch, timbre, or stereo location.
Sometimes sonification is done out of necessity. Software has been developed that enables blind researchers to create sonified versions of astronomical graphs so that they can study them. But sighted colleagues use it as well, finding that some patterns are easier to hear than to see. Adding sound can help us to see things better. Our intuitive understanding of information is strengthened when both images and sounds are associated with it. Adding sound makes it more “alive.” To paraphrase George Lucas, “Sound is half the picture” [7].
So how does one go about creating sonifications? Some approaches are literal, others more symbolic. Some datasets, particularly long ones consisting of a single stream of numbers, can simply be transformed into audio files, sometimes with transposition and filtering added. The sonic results tend to resemble various flavors of filtered noise, something like listening to the ocean in a shell, with occasional patterns and anomalies that indicate identifiable phenomena.
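This direct approach, often called audification, can be sketched in a few lines of Python with NumPy. Everything here is illustrative: the random array stands in for a real single-stream dataset, and the sample rate and output filename are arbitrary choices.

```python
import wave

import numpy as np

# Illustrative audification: treat each data point as one audio sample,
# "playing" the dataset directly. The random array stands in for real data.
SR = 44100
rng = np.random.default_rng(0)
data = rng.normal(size=SR * 2)              # stand-in for 2 s worth of data

# Center and normalize to the 16-bit sample range, with a little headroom.
samples = data - data.mean()
samples /= np.abs(samples).max()
pcm = (samples * 0.9 * 32767).astype(np.int16)

with wave.open("audified.wav", "wb") as f:  # mono, 16-bit, 44.1 kHz
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(SR)
    f.writeframes(pcm.tobytes())
```

Transposition here would amount to choosing a different playback rate, and filtering could be applied to the samples before writing; both are common refinements.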
Symbolic renderings create other perspectives. If the data points are treated symbolically, for example as pitches, we are sometimes better able to "magnify" what we are listening to. The contours of a visual graph become a melody, and we can stretch its range and adjust its tempo and duration to suit our needs. Thus, an illustrative sonification of a helioseismology graph might sound like this.
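A minimal parameter-mapping sketch in Python (all names and values here are hypothetical): each data point is mapped linearly onto a pitch range, and the resulting frequencies are rendered as short sine tones, so the contour of the data becomes a melody.

```python
import numpy as np

def data_to_frequencies(values, low_midi=48, high_midi=84):
    """Map data points linearly onto a MIDI pitch range, then to Hz."""
    v = np.asarray(values, dtype=float)
    norm = (v - v.min()) / (v.max() - v.min())      # rescale to 0..1
    midi = low_midi + norm * (high_midi - low_midi)
    return 440.0 * 2.0 ** ((midi - 69) / 12)        # MIDI note -> Hz

# Hypothetical graph contour; its ups and downs become the melody.
freqs = data_to_frequencies([0.1, 0.5, 0.3, 0.9, 0.2])

# Render each point as a short sine tone. Note duration and pitch range
# are free choices -- the "magnification" knobs mentioned above.
SR, NOTE_DUR = 44100, 0.25
t = np.arange(int(SR * NOTE_DUR)) / SR
melody = np.concatenate([np.sin(2 * np.pi * f * t) for f in freqs])
```

Widening `low_midi`/`high_midi` stretches the melodic range, and shortening `NOTE_DUR` speeds the tempo, without touching the data itself.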
A more analytical example is a resynthesized pulsar signal. Pulsar datasets describe amplitude changes of various electromagnetic frequencies. By transposing these light frequencies to proportionally related sound frequencies and applying the amplitude changes in the data to them, we get a sound that is rhythmic, along with a signature “chord” that is present in the data.
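A rough Python sketch of that resynthesis idea, with made-up channel frequencies and pulse period rather than real pulsar data: each electromagnetic frequency is shifted down by whole octaves into an audible band, and all the channels are driven by a shared pulsing amplitude envelope.

```python
import numpy as np

SR = 44100
t = np.arange(SR * 2) / SR                       # two seconds of time

def to_audible(freq_hz, lo=100.0, hi=1000.0):
    """Octave-shift a frequency until it lands in an audible band."""
    while freq_hz >= hi:
        freq_hz /= 2.0
    while freq_hz < lo:
        freq_hz *= 2.0
    return freq_hz

# Made-up radio-band channels and pulse period (not real pulsar data).
em_freqs = [1.4e9, 4.85e9, 8.4e9]                # Hz
pulse_period = 0.714                             # seconds per rotation

# One decaying burst per rotation supplies the rhythm ...
env = np.exp(-5.0 * (t % pulse_period) / pulse_period)

# ... and the transposed channels sound together as a signature chord.
chord = sum(env * np.sin(2 * np.pi * to_audible(f) * t) for f in em_freqs)
chord /= np.abs(chord).max()
```

Because octave shifts preserve pitch relationships, the chord keeps a trace of how the original channels relate to one another.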
Finally, the underlying rumble of everything, cosmic microwave background radiation, is fascinating to hear as a literal radio wave. This was how it was discovered. However, the radiation is typically studied in a spectral format. By transforming the spectrum to a sonification in which intensity values are remapped as pitches, we get a melody that unfolds in the same pattern that the spectral plot has when read from left to right.
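That left-to-right scan can be sketched the same way. The "spectrum" below is a toy Gaussian bump, not real CMB data: each intensity value sets the pitch of a short tone, so the melody rises and falls in the same pattern as the plot.

```python
import numpy as np

SR = 44100
ell = np.arange(2, 1200)                        # stand-in spectral axis
power = np.exp(-((ell - 220) / 120.0) ** 2)     # toy peak, not real data

# Remap intensity to pitch: more intense bins become higher notes.
freqs = 200.0 + power * 1600.0                  # Hz, spanning 200..1800

BIN_DUR = 0.01                                  # 10 ms per spectral bin
t = np.arange(int(SR * BIN_DUR)) / SR
melody = np.concatenate([np.sin(2 * np.pi * f * t) for f in freqs])
```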
Humans are wired to respond to music. Different people may like different music, but you won’t find anyone who doesn’t like some music. A number of writers, such as evolutionary biologist E. O. Wilson and neuroscientist Daniel J. Levitin, have speculated on the evolutionary role of music as a survival adaptation [8, 9]. We respond to it so strongly, so instinctually, that it seems likely to be related to some survival mechanism that can be traced back tens of thousands of years. Given the fundamental role music plays in human life, the sciences can only gain from harnessing its power (if it’s done right), both for engagement and for further insight.
Adapted from a Huffington Post piece by Mark Ballora & George Smoot
Mark Ballora’s TEDxPSU talk 
Janna Levin’s TED talk 
Honor Harger’s TED talk 
Robert Alexander’s work 
[1] Common Core
[2] Dona M. Wong, The Wall Street Journal Guide to Information Graphics: The Dos and Don'ts of Presenting Data, Facts, and Figures. W. W. Norton & Company, 2013.
[3] Howard Gardner, Frames of Mind: The Theory of Multiple Intelligences. Basic Books, 1983.
[4] Albert S. Bregman, Auditory Scene Analysis. MIT Press, 1994.
[5] Diana Deutsch, The Psychology of Music, Third Edition. Academic Press, 2012.
[6] Durand R. Begault, "Head-Up Auditory Displays for Traffic Collision Avoidance System Advisories: A Preliminary Investigation." San Jose State University Foundation, NASA-Ames Research Center, Moffett Field, California.
[8] Daniel J. Levitin, This Is Your Brain on Music: The Science of a Human Obsession. Plume/Penguin, 2007.
[9] E. O. Wilson, The Social Conquest of Earth. Liveright, 2013.