Using a video camera and microphones, our custom computer software converts the activity of a space into a continually changing soundscape whose form and content reflect the visual and aural phenomena observed in the area. Every detected change in the physical space impacts the piece in some way: sometimes generating audio material and sometimes creating structural variation. The software translates trends of color, density, and motion from the video camera into the timing, pitch, and spatial characteristics of an audio environment. In some versions of the installation, microphones allow ambient sounds to become part of the material used by the instrument. The result is broadcast through four speakers in the space.
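The mapping described above, from per-frame video statistics to audio parameters, might be sketched as follows. This is not the installation's actual code; the feature names, value ranges, and specific mapping formulas are illustrative assumptions.

```python
# A minimal sketch, assuming normalized per-frame features in 0..1.
# The specific ranges (two octaves of pitch, 0.2-2.0 s event timing,
# stereo/quad pan) are illustrative, not the installation's real values.

def map_frame_to_audio(hue, density, motion):
    """Map normalized video features (each in 0..1) to audio parameters.

    hue     -> pitch: spread across two octaves above MIDI note 48
    density -> timing: denser scenes trigger events more rapidly
    motion  -> spatial position: pan between left (-1) and right (+1)
    """
    pitch = 48 + hue * 24            # MIDI note number, 48..72
    interval = 2.0 - 1.8 * density   # seconds between events, 2.0..0.2
    pan = motion * 2.0 - 1.0         # spatial position, -1..+1
    return {"pitch": pitch, "interval_s": interval, "pan": pan}

# Example: a mid-hue, busy, mostly right-moving scene
params = map_frame_to_audio(hue=0.5, density=0.9, motion=0.8)
```

In a real-time system, a mapping like this would run once per analyzed frame, with the returned parameters smoothed over time to avoid audible jumps.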
The Observational Soundscape I
Memorial Union, University of Maine
The first version was installed in May 2005. The atrium of the Memorial Union bloomed with sound as people passed through the installation area while a computer screen showed a visualization of the movement sensed by the piece. We are grateful for the support of the Maine Arts Commission in the construction of this piece.
The Observational Soundscape II
Cooper Union School of Art, NYC – September 2005
The second version was installed in a sunlit hallway at the Cooper Union School of Art. This version was much more sensitive to the colors of moving objects, and it featured a different visual display showing the cumulative color trails of all of the people who walked through the space. These trails were constantly analyzed to create long-term variation in the soundscape itself. This version was sensitive enough to react to the occasional rays of sun projected on the floor, responding with quiet, unexpected compositions.
The Observational Soundscape III
Olin Arts Center, Bates College
The third version of the installation was part of an exhibition called Activator at the Olin Arts Center, Bates College, Maine. We rebuilt the synthesis engine to incorporate the aural history of the space (in much the same way that the color trails of the previous version recorded visual history). Whenever nearby sounds reached a certain threshold, they were recorded into a digital archive, where they stayed for up to seven days. When the camera saw movement, the software picked a sound from the archive and subjected it to processing and spatialization based on the color and density of the motion. The color of the current object was compared to the historical average to determine which sound was picked. Colors that broke the continuity of their surroundings triggered sounds from farther back in time, mirroring visual discontinuities with temporal ones.
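The selection logic of this version can be sketched in outline: sounds live in a time-stamped archive for up to seven days, and the distance between the current color and the historical average decides how far back in the archive to reach. This is a hedged reconstruction, not the exhibited code; the scalar color representation and the specific distance-to-age mapping are assumptions for illustration.

```python
# Sketch of version III's archive logic, under the assumptions above:
# color is a single 0..255 value, and discontinuity maps linearly onto
# archive age (0 = newest sound, 1 = oldest surviving sound).

ARCHIVE_SECONDS = 7 * 24 * 3600  # sounds expire after seven days

def prune(archive, now):
    """Drop archived (timestamp, sound) pairs older than seven days."""
    return [(t, s) for (t, s) in archive if now - t <= ARCHIVE_SECONDS]

def pick_sound(archive, current_color, historical_avg, now):
    """Choose an archived sound: the more the current color breaks with
    the historical average, the older the sound returned."""
    archive = prune(archive, now)
    if not archive:
        return None
    # Normalized color discontinuity: 0 (matches history) .. 1 (maximal break)
    distance = min(abs(current_color - historical_avg) / 255.0, 1.0)
    # Sort newest first; larger discontinuity reaches further back in time
    by_age = sorted(archive, key=lambda entry: now - entry[0])
    index = int(distance * (len(by_age) - 1))
    return by_age[index][1]
```

With this structure, a color matching the historical average returns the most recent recording, while a color that sharply breaks continuity returns the oldest one still in the seven-day window.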
The Observational Soundscape IV
Indiana University – Perform.Media Festival – September 2006