“If music is inherent in all things and places, then why not let music play itself? The composer, in the traditional sense, might no longer be necessary. Let the planets and spheres spin.” – David Byrne
As science becomes further entrenched in the era of Big Data, scientists are facing the increasing challenges of how to organize large data sets generated by computation and automation in useful ways. Visualization is the standard method of representing scientific information and often yields visually striking images that can help reveal patterns in the data.
Recently, however, scientists have been tinkering with a different way to experience scientific data: hearing it. Scientists, often in collaboration with artists, have used these large data sets to generate music. Here’s a recent sampling of science, musically sonified:
“The English Channel project involves sequencing DNA found in the seawater and trying to piece together a sense of how some of these microbial systems work. How do the various organisms interact with one another? How do they respond to changing conditions like temperature, nutrients, acidity? The research generates terabytes upon terabytes of data.
To turn some of it into music, Larsen mapped environmental conditions–daylight, temperature, phosphorus level–to specific chords. When the conditions change, the chords change. Then he took the microbial concentrations at each of those environmental conditions—how much of a certain type of microbe exists at a certain temperature, say—and mapped each one to a scale. The chords play in a particular scale, depending on how the environmental conditions affect the size of the microbe communities.” (read more)
‘Vicinanza led the Higgs sonification project collaborating with Mariapaola Sorrentino of ASTRA Project (Cambridge), who contributed to the sonification process, and Giuseppe La Rocca (INFN Catania), who was in charge of the computing framework.
“Sonification worked by attaching a musical note to each data. So, when you hear the resulting melody you really are hearing the data,” Vicinanza said.
The researchers mapped intervals between values in the original data set to intervals between notes in the melody. The same numerical value was associated with the same note. As the values increased or decreased, the pitch of the notes rose or fell accordingly.’ (read more)
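The interval-preserving scheme described above can be sketched as a simple affine map from data values to MIDI pitches. This is an illustrative sketch of the general technique, not the ASTRA project's actual code; the base pitch and scaling factor are assumptions chosen for the example.

```python
def sonify(values, base_pitch=60, semitones_per_unit=1):
    """Map a data series to MIDI pitches so that equal data values
    produce equal notes and differences between values become
    proportional intervals between notes.

    Illustrative sketch; base_pitch and semitones_per_unit are
    arbitrary choices, not parameters from the Higgs sonification.
    """
    reference = values[0]  # anchor the first value to the base pitch
    return [base_pitch + round((v - reference) * semitones_per_unit)
            for v in values]

# Rising data raises the pitch, falling data lowers it,
# and repeated values land on the same note:
melody = sonify([25.0, 25.5, 26.0, 25.5, 25.0], semitones_per_unit=2)
print(melody)  # [60, 61, 62, 61, 60]
```

Because the map is linear, the melodic contour is a faithful trace of the data: listening to the pitch rise and fall really is "hearing the data," as Vicinanza puts it.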
“Published Thursday on the Fermi blog, this cosmic concert came from GRB-080916C, an especially strong gamma-ray burst that NASA recorded in September 2008. To make it audible, Fermi researchers converted the GRB’s energy signature into musical notes played on a harp, a cello and a piano. They also made an animation of its photon frequency, and then paired these sights and sounds.” (read more)