
Creative and Performing Arts Grant Report
Project Title:
Music, Physics and Sonification
Principal Investigator:
Reginald Bain, Professor
Composition and Theory
School of Music
Grant Period:
May 16, 2011 - May 15, 2013
First and foremost, I would like to thank the Provost’s Office and the faculty selection committee
for their support of my work. This grant allowed me to explore a new avenue of creative
research: data-driven musical sonification. With the generous assistance of Dr. Milind
Purohit of USC’s High Energy Physics Group, I was able to obtain access to data from
the ATLAS detector of CERN’s Large Hadron Collider.1 The focus of this project was
the creation of an original musical composition from the ATLAS detector data. I created
a five-movement, computer-generated, algorithmic composition titled Sonification
Studies (2012-13) that was premiered at USC on April 2, 2013. In addition to the world
premiere of Sonification Studies, the outcomes of the project thus far include a public
grant talk at USC on April 5, 2013, and a journal article (Bain 2012) on related work in
the area of musical sonification made possible by the equipment purchased with the
grant. This is just the beginning of my work in this area. Sonification Studies, and my
other physics-themed algorithmic composition Retreat from Quiescence, will serve as the
cornerstones of my next electronic music CD project. My compositional experiments
with the data continue as I attempt to expand the software I created for Sonification
Studies to work with new data sets and with 5.1 surround sound projection. I am also in
the process of writing up my results for future publication. What is more, I expect this
area of research to be of great interest to the undergraduate and graduate computer music
students who work under my supervision. Most of the software purchased for the project
has application beyond sonification, so this grant has also made it possible for me to offer
a wider variety of research opportunities to my undergraduate and graduate computer
music students. Again, thank you for your support of our work.
Enclosed you will find: (1) a concert program for the world premiere of Sonification
Studies; (2) excerpts from the handout I prepared for my public talk including a selected
bibliography for the project; (3) a technical note describing my compositional and
computational approach in the first movement of Sonification Studies, “When Particles
Collide,” that includes a link where the reader can listen to the first movement online.
Bain, Reginald. 2012. “Risset's Arpeggio: Composing Sound using Spectral Scans.”
Csound Journal (Vol. 17). Available online at:
CERN is located near Geneva, Switzerland. For more information visit
School of Music
The Experimental Music Studio (xMUSE)
USC Computer Music Concert
Reginald Bain, director
Games Composers Play
Tuesday, April 2, 2013
7:30 PM • Recital Hall
Matthew Fink
Endless September (2013)
for alto saxophone and electronic sound
Andrew J. Allen, alto saxophone
David Moody
Immunity (2012)
for electronic media
Ari Lindenbaum
Phone Call (2012)
for electronic media
Hornshift (2013)
Luke Nemitz
for horn and electronic sound
Rachel Romero, horn
ChipQuest (2013)
Zachary Cotton
electronic media emulating classic game soundchips
I. Fresh Batteries
II. World Maps Not To Scale
III. Final Form
One Shift Two Shift, Red Shift Blue Shift (2013)
for electronic media
David "Clay" Mettens
Apophenia (2013)
Chris Johnson
a real-time interactive improvisation for Wiimote ensemble
Zachary Cotton, Chris Johnson and Ari Lindenbaum, performers
Reginald Bain
Sonification Studies (2012-2013)
for electronic media
I. When Particles Collide
II. Quantum Noise
III. String Theory
IV. Time Warp
V. Radiant Energy
Go Forth (2013)
George Fetner
for soprano and electronic sound
I. You left me speechless
II. I dreamt
III. But now I have awoken
Rebecca Elizabeth Wood, soprano
Special thanks to Jeff Francis and his audio engineering students
for the concert’s sound design
USC School of Music
Composition Seminar
Friday, April 5, 2013
2:30 - 4 p.m.
Music, Physics and Sonification
Reginald Bain, Professor
Composition and Theory
University of South Carolina
School of Music
When it comes to atoms, language can be used only as in poetry.
Niels Bohr
What we observe as material bodies and forces
are nothing but shapes and variations in the structure of space.
Erwin Schrödinger
The most beautiful thing we can experience is the mysterious.
It is the source of all true art and science.
Albert Einstein
Sonification Studies (2012-13)
for digital-signal processing computer
Reginald Bain
When Particles Collide (0:49)
Quantum Noise (3:16)
String Theory (2:17)
Time Warp (3:08)
Radiant Energy (0:43)
Music, Physics and Sonification handout, Page 1 of 4
Program Note
In the auditory display community, sonification is defined as the “use of non-speech audio to
convey information.” Data is “sonified” by mapping the numbers in a data stream to sonic
parameters so that variations in the data may be perceived. I use similar techniques in my
algorithmic music in order to create metaphysical marriages between scientific and musical
ideas. I use the computer to interactively explore the data, mapping number sequences to musical
parameters (such as pitch, intensity, duration, timbre, spatialization, etc.) in real time. I call this
compositional process musical sonification. Sonification Studies (2012-13) presents five brief
musical sonifications of high-energy particle physics data. Each movement is built from scratch,
that is to say, the sound itself is composed from the micro-level up using the following
elementary synthesis techniques:
Karplus-Strong physical modeling (of a plucked string),
granular synthesis, and
frequency modulation synthesis.
The data is from the ATLAS detector at CERN’s Large Hadron Collider (CREDIT:
Copyright CERN – ATLAS Collaboration). Special thanks to Dr. Milind Purohit of USC’s High
Energy Physics Group for making the data available, and to the Provost’s Creative and
Performing Arts Grant Program that supported the creation of this work.
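The first of the techniques listed above, Karplus-Strong synthesis, is simple enough to sketch in a few lines. The following Python sketch is a textbook version of the algorithm (a noise-seeded delay line with an averaging low-pass filter), not the composer’s Max/MSP implementation; the function name and parameters are illustrative:

```python
import random

def karplus_strong(frequency, duration, sample_rate=44100):
    # Delay-line length determines the pitch: sample_rate / frequency samples.
    n = int(sample_rate / frequency)
    # Seed ("pluck") the string with a burst of white noise.
    buf = [random.uniform(-1.0, 1.0) for _ in range(n)]
    out = []
    for _ in range(int(duration * sample_rate)):
        first = buf.pop(0)
        # Average adjacent samples: a simple low-pass filter that
        # progressively damps the high frequencies, as a real string does.
        buf.append(0.5 * (first + buf[0]))
        out.append(first)
    return out

samples = karplus_strong(220.0, 1.0)  # one second of a plucked A3
```

The averaging step is what gives the algorithm its characteristic plucked-string decay; Jaffe and Smith (1983, in the bibliography below) describe extensions that refine the tuning and damping.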
Music, Physics and Sonification handout, Page 2 of 4
Selected Bibliography
Bain, Reginald. 1990. “Algorithmic Composition: Quantum Mechanics & the Musical Domain.” Proceedings of the
1990 International Computer Music Conference. San Francisco: International Computer Music Association:
____________. 2012. “Risset's Arpeggio: Composing Sound using Spectral Scans.” Csound Journal (Vol. 17).
Available online at:
Battier, Marc and Jean-Marc Jot, eds. 2006. Spatialisateur Reference Manual. Paris: IRCAM Forumnet. Available
online at: <>.
Barrass, Stephen, Mitchell Whitelaw and Freya Bailes. 2006. “Listening to the Mind Listening: An Analysis of
Sonification Reviews, Designs and Correspondences.” Leonardo Music Journal Vol. 16: 13-19.
Ben-Tal, O. and J. Berger. 2004. “Creative Aspects of Sonification.” Leonardo 37(3): 229–232.
Bregman, A. S. 1994. Auditory Scene Analysis: The Perceptual Organization of Sound. Cambridge, MA: MIT Press.
Burk, Phil, Larry Polansky, Douglas Repetto, Mary Roberts and Dan Rockmore. 2011. Music and Computers: A
Theoretical and Historical Approach, Archival Version. Hanover, NH. Available online at:
Chowning, J. M. 1973. “The Synthesis of Complex Audio Spectra by Means of Frequency Modulation.” Journal of
the Audio Engineering Society, 21/7.
_____________. 1977. “The Simulation of Moving Sound Sources.” Computer Music Journal, 1/3 (June 1977): 48-52.
Cipriani, Alessandro and Maurizio Giri. 2010. Electronic Music and Sound Design: Theory and Practice with
Max/MSP. ConTempoNet.
Cowell, Henry. 1996. New Musical Resources. New York: Cambridge University Press.
Dodge, Charles and Thomas A. Jerse. 1997. Computer Music: Synthesis, Composition and Performance, 2nd ed.
New York: Cengage.
Fischman, Rajmil. “Clouds, Pyramids, and Diamonds: Applying Schrödinger's Equation to Granular Synthesis and
Compositional Structure.” Computer Music Journal, Vol. 27, No. 2 (Summer, 2003): 47-69.
Gabor, Dennis. 1947. “Acoustical Quanta and the Theory of Hearing.” Nature 159 (4044): 591-594.
Hermann, T., A. Hunt and J. G. Neuhoff, eds. 2011. The Sonification Handbook. Berlin: Logos Publishing House.
Also available online at: <>.
Holman, Tomlinson. 2000. 5.1 Surround Sound: Up and Running. Burlington, MA: Focal Press.
Jaffe, D. A. and J. O. Smith. 1983. “Extensions of the Karplus-Strong Plucked String Algorithm.” Computer Music
Journal 7/2 (MIT Press), 56–69.
Jedrzejewski, F. 2006. Mathematical Theory of Music. Paris: Ircam-Centre Pompidou.
Karplus, K. and A. Strong. 1983. “Digital Synthesis of Plucked String and Drum Timbres.” Computer Music
Journal 7/2 (MIT Press), 43–55.
Kramer, Gregory, ed. 1994. Auditory Display: Sonification, Audification, and Auditory Interfaces. Santa Fe
Institute Studies in the Sciences of Complexity. Proceedings Volume XVIII. Reading, MA: Addison-Wesley.
Nierhaus, Gerhard. 2010. Algorithmic Composition: Paradigms of Automated Music Generation. New York:
Springer.
Risset, J.C. Computer Music: Why? Available online at: <>.
Roads, C. 1988. “Introduction to Granular Synthesis.” Computer Music Journal, Vol. 12, No. 2 (Summer, 1988):
___________. 1996. The Computer Music Tutorial. Cambridge, MA: MIT Press.
____________. 2002. Microsound. Cambridge, MA: MIT Press.
Sturm, Bob L. 2000. “Sonification of Particle Systems via de Broglie’s Hypothesis.” Proceedings of the ICAD 2000
Conference. Available online at:
Whittle, Mark. 2004. “Big Bang Acoustics: Sound in the Early Universe.” The Newsletter of The Acoustical Society
of America. Volume 14, Number 4 (Fall 2004). Available online at:
Winkler, Todd. 2001. Composing Interactive Music: Techniques and Ideas Using Max. Cambridge, MA: MIT Press.
Wishart, Trevor. 1994. Audible Design. London: Orpheus The Pantomime Ltd.
Wolek, Nathan. 2003. “Granular Toolkit v1.0.” Journal SEAMUS (Volume XVI:2): 34-46.
Music, Physics and Sonification handout, Page 3 of 4
Selected Bibliography (cont.)
Xenakis, Iannis. 2001. Formalized Music: Thought and Mathematics in Composition, revised English edition.
Hillsdale, NY: Pendragon Press.
Bain, Reginald. 2010. Sounding Number. Centaur Records (CRC 3809).
Dodge, Charles. 1970. Earth’s Magnetic Field. Nonesuch (LP H-71250).
Nancarrow, Conlon. 1991. Studies for Player Piano. Wergo (WER 69072).
LHC - Large Hadron Collider -
ATLAS detector -
WMAP CMB: 9-year Microwave Sky -
Mark Whittle’s website -
LHC Sound’s The Sounds of Science -
Burk et al. 2011, “Sonification & Charles Dodge’s Earth’s Magnetic Field (1970)”
Learn more about basic particle physics
The Particle Adventure -
Contemporary Physics Education Project -
Learn more about sonification
Hermann, Hunt and Neuhoff 2011, The Sonification Handbook -
Sonification Research
AlloSphere (UCSB, USA) -
Composers Desktop Project (CDP, U.K) -
Institut de Recherche et Coordination Acoustique/Musique (IRCAM, France) -
International Community for Auditory Display (ICAD) -
International Computer Music Conference (ICMC) -
Software
Apple, Logic Pro -
Avid, ProTools -
Avid, Sibelius -
Avid, TL Space -
Composers Desktop Project, CDP -
Cycling ’74, Max/MSP -
DSP Quattro, DSP Quattro Pro -
IRCAM Forumnet -
Steinberg, Wavelab -
Trueman and DuBois, PeRcolate -
Wolek, Granular Toolkit -
Wolfram, Mathematica -
Music, Physics and Sonification handout, Page 4 of 4
Technical Note
I. When Particles Collide, from Sonification Studies (2012-13)
Reginald Bain
Professor of Composition and Theory
USC School of Music
Sonification Studies is a five-movement algorithmic composition, that is to say, the electronic
music is encoded in software as a set of algorithmic processes. The work is a set of data-driven
sonifications that use traditional digital synthesis techniques (in this case additive synthesis) to
create the music from the data in real time. An mp3 recording of the first movement “When
Particles Collide” (0:49) is available to the reader on the composer’s website at:
The composer used Cycling ’74’s Max/MSP (Cycling ’74 2013)1 to realize the work. The top level of the composer’s Max/MSP program is shown in Fig. 1.
Fig. 1. The Max/MSP program that generates the first movement of Sonification Studies
Max/MSP is a high-level, graphical, object-oriented, multimedia and music programming language that generates high-quality
digital audio in real time.
“When Particles Collide” Tech. Note: Page 1 of 2
The very brief opening movement is a two-voice canon. A canon is a musical composition in
which one voice exactly imitates another after a certain time and pitch interval. The first voice is
called the leader. Subsequent voices are called followers. Sine waves serve as the “instruments”
in this movement because, from the point of view of Fourier analysis, sinusoids are metaphorical
“atoms of sound.” The first 15 events of the Supersymmetry (SUSY) working group file, which
was provided to the composer by Dr. Milind Purohit of USC High Energy Physics Group, are
read into the Max/MSP program at a rate of 10 lines per second in order to create the canon’s
leader. Electron Et values, the data-loaded variable (Ben-Tal and Berger 2004) in this particular
file, are mapped to the frequency values (pitches) presented by the leader. In order to put the
leader’s frequency values in an optimal tessitura for pitch, intensity and timbral perception, the
values are scaled (i.e., transposed) down four octaves. That is, the frequency of the canon’s
leader (f_L) at any given time is the electron Et value divided by 2^4 = 16. The canon’s follower echoes
each note of the leader after 200 milliseconds (ms) using a slightly higher frequency: f_F = (16/15) f_L.
This particular interval of imitation was chosen, not only because of its just-intoned quality, but
also because it lies within the critical band. This guarantees that an interference pattern, or
“beating” effect, that varies with frequency will always result, a metaphorical collision of
tones, if you will. A trained musician might describe the first movement succinctly as a two-voice
sine wave canon at the just (5-limit) minor second after 200 ms.
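The mapping described above can be summarized in a short Python sketch. This is an illustrative reconstruction of the arithmetic in the text, not the composer’s Max/MSP patch, and the sample Et value below is hypothetical:

```python
def canon_frequencies(electron_et):
    """Map an electron Et value to the canon's two sine-wave frequencies.

    The leader is the Et value transposed down four octaves (divided by
    2**4 = 16); the follower, delayed 200 ms, imitates at the just
    (5-limit) minor second, a 16:15 frequency ratio.
    """
    f_leader = electron_et / 16.0           # four octaves down
    f_follower = (16.0 / 15.0) * f_leader   # just minor second above
    beat_rate = f_follower - f_leader       # beating rate = f_leader / 15
    return f_leader, f_follower, beat_rate

# A hypothetical Et value of 4800 yields a 300 Hz leader, a 320 Hz
# follower, and a 20 Hz beating pattern between them.
fL, fF, beats = canon_frequencies(4800.0)
```

Because the beat rate is always f_leader/15, the “beating” effect scales with the data itself: higher-energy events collide faster.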
The movement is also spatially conceived. IRCAM’s Spatialisateur, a virtual acoustics
processor, keeps the perceptual quality of the two sinusoids in a constant state of flux within the
stereo field so that the impression is suggestive of high-energy particle motions in an accelerator.
(This may not be as apparent in the compressed mp3 demo file available online.) Finally, the raw
Max/MSP stereo audio output is post-processed using Avid’s ProTools and TL Space.
The form of the first movement is created by cycling through the first 15 events (100 lines) of
data. A superimposed 5-second fade-in at the beginning of the movement, and a 15-second fade-out
at the end, attempt to give the impression that the collisions have been going on long before
the listener appears on the scene and will continue well beyond the movement’s end.
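The envelope logic can be sketched as follows. This is a minimal illustration that assumes simple linear fades; the actual fade shapes used in the piece are not specified:

```python
def fade_gain(t, total, fade_in=5.0, fade_out=15.0):
    """Amplitude gain at time t (seconds) for a movement of length `total`,
    with a linear fade-in and fade-out at the ends (illustrative only)."""
    gain = 1.0
    if t < fade_in:
        gain = min(gain, t / fade_in)          # ramp up over the fade-in
    if t > total - fade_out:
        gain = min(gain, (total - t) / fade_out)  # ramp down over the fade-out
    return max(0.0, gain)

# For the 49-second first movement: silent at t = 0, full level by t = 5 s,
# and ramping back to silence over the final 15 seconds.
```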
Data Credit
Copyright CERN – ATLAS Collaboration
Avid, ProTools -
Avid, TL Space -
Cycling ’74, Max/MSP -
IRCAM, Spatialisateur (Spat~) -
Battier, Marc and Jean-Marc Jot, eds. 2006. Spatialisateur Reference Manual. Paris: IRCAM Forumnet.
Ben-Tal, O. and J. Berger. 2004. “Creative Aspects of Sonification.” Leonardo 37(3): 229–232.
Cycling ’74. 2013. Max 6 Help and Documentation. Available online at:
“When Particles Collide” Tech. Note: Page 2 of 2