What if you could control the landscape in front of you by making music either beautiful or menacing? This is the inspiration behind Arboration, a project by ITP students Mike Allison, Michelle Cortese and FangYu Yang.
From their site:
The idea behind this project stems from the desire to combine musical improvisation with dynamic narrative control. The physical action of playing the piano-like, touch-sensitive keyboard is translated via harmonic analysis into data that controls a 3D environment projected onto a screen. By analyzing the intervals between the notes being played by the performer, we can determine whether what is being played is harmonically consonant or dissonant, and that determination is sent through the program to drive the visual output. This process allows music theory to be the core of the control structure; however, musical form is not a factor, allowing anyone, not just musicians, to have the full experience. Music theory is the control, play is the vehicle, and visual/emotional response is the feedback system.
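The team hasn't published their analysis code, but the interval test itself is standard music theory: take the distance between two notes in semitones, reduce it modulo the octave, and check it against the set of consonant intervals. A minimal sketch of that step in C++, with MIDI note numbers standing in for whatever note representation the project actually uses:

```cpp
#include <cstdlib>
#include <cstdio>

// Classify the interval between two MIDI note numbers as consonant or
// dissonant, using the conventional division: unison, thirds, perfect
// fourth/fifth, sixths, and the octave are consonant; seconds, the
// tritone, and sevenths are dissonant.
bool isConsonant(int noteA, int noteB) {
    int semitones = std::abs(noteA - noteB) % 12;  // reduce to within one octave
    switch (semitones) {
        case 0:   // unison / octave
        case 3:   // minor third
        case 4:   // major third
        case 5:   // perfect fourth (treated as dissonant in strict counterpoint)
        case 7:   // perfect fifth
        case 8:   // minor sixth
        case 9:   // major sixth
            return true;
        default:  // seconds (1, 2), tritone (6), sevenths (10, 11)
            return false;
    }
}

int main() {
    // C4 + G4 is a perfect fifth; C4 + F#4 is a tritone.
    std::printf("C4+G4:  %s\n", isConsonant(60, 67) ? "consonant" : "dissonant");
    std::printf("C4+F#4: %s\n", isConsonant(60, 66) ? "consonant" : "dissonant");
}
```

Because the test works on intervals rather than on melodies or chords by name, any two keys pressed together produce a usable result, which is what lets non-musicians drive the piece.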
The interface is made of laser-etched plywood, with custom-cut copper panels arranged like a keyboard and used as capacitive touch sensors. As consonant intervals are played, trees spring up on the screen, the sun shines, and a rainbow appears. If the user’s melody becomes more dissonant, the trees shrivel, and the forest is set aflame.
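The write-up doesn't say how the copper pads are polled, but copper panels as capacitive keys on an Arduino are commonly read with Paul Badger's CapacitiveSensor library, one receive pin per pad wired through a high-value resistor. A hedged sketch of that setup follows; the pin numbers, threshold, and serial format are assumptions, not the team's code:

```cpp
#include <CapacitiveSensor.h>

// One sensor per copper key: pin 2 is the shared send pin, connected to
// each receive pin (4-6) through a ~1 megohm resistor, with the copper
// pad hanging off the receive side.
CapacitiveSensor keys[] = {
    CapacitiveSensor(2, 4),
    CapacitiveSensor(2, 5),
    CapacitiveSensor(2, 6),
};
const int NUM_KEYS = sizeof(keys) / sizeof(keys[0]);
const long TOUCH_THRESHOLD = 200;  // tune for pad size and wiring

void setup() {
    Serial.begin(9600);
}

void loop() {
    // Report which pads are touched as a comma-separated line of 0s and
    // 1s, e.g. "1,0,1", for a host program to parse over serial.
    for (int i = 0; i < NUM_KEYS; i++) {
        long reading = keys[i].capacitiveSensor(30);  // 30 samples per read
        Serial.print(reading > TOUCH_THRESHOLD ? 1 : 0);
        if (i < NUM_KEYS - 1) Serial.print(',');
    }
    Serial.println();
    delay(20);
}
```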
The capacitive data is read by an Arduino and passed to Processing, which relays it over OSC to Max/MSP, Ableton Live, and finally Unity3D, where the 3D environment is rendered.
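The post doesn't show what actually travels over OSC between those stages. For illustration only, here is what one hop might look like, written with the oscpack C++ library rather than the Processing code the team presumably used; the address pattern, port, and 0-to-1 consonance scale are all invented:

```cpp
#include "osc/OscOutboundPacketStream.h"
#include "ip/UdpSocket.h"

#define OUTPUT_BUFFER_SIZE 1024

int main() {
    // Send a single consonance value over OSC/UDP to a listener on the
    // local machine (port and address pattern are hypothetical).
    UdpTransmitSocket socket(IpEndpointName("127.0.0.1", 7400));

    char buffer[OUTPUT_BUFFER_SIZE];
    osc::OutboundPacketStream p(buffer, OUTPUT_BUFFER_SIZE);

    // 1.0f = fully consonant, 0.0f = fully dissonant (made-up scale)
    p << osc::BeginMessage("/arboration/consonance")
      << 1.0f
      << osc::EndMessage;

    socket.Send(p.Data(), p.Size());
}
```

Chaining the stages over OSC like this is a common pattern in installations, since each tool (audio in Max/MSP and Ableton Live, visuals in Unity3D) can be developed and restarted independently.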