What if you could control the landscape in front of you by making music either beautiful or menacing? This is the inspiration behind Arboration, a project by ITP students Mike Allison, Michelle Cortese and FangYu Yang.

From their site:

The idea behind this project stems from the desire to combine musical improvisation with dynamic narrative control. The physical action of playing the piano-like, touch-sensitive keyboard is translated via harmonic analysis into data that controls a 3D environment projected onto a screen. By analyzing the intervals between the notes being played by the performer we can determine if what is being played is harmonically consonant or dissonant which is sent through the programming to determine the visual output. This process allows music theory to be the core of the control structure, however musical form is not a factor allowing anyone to have the full experience, not just musicians. Music theory is the control, play is the vehicle, and visual/emotional response is the feedback system.
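The team's harmonic-analysis code isn't published, but the interval test they describe can be illustrated with a short sketch. This is a hypothetical reconstruction, not their actual implementation: it classifies intervals by their size in semitones, using the conventional split where unisons, thirds, fourths, fifths, and sixths count as consonant.

```python
# Hypothetical sketch of interval-based consonance classification.
# Interval classes (semitones, mod 12) conventionally treated as consonant:
# unison/octave (0), thirds (3, 4), perfect fourth (5), perfect fifth (7),
# and sixths (8, 9). Seconds, the tritone, and sevenths are dissonant.
CONSONANT_CLASSES = {0, 3, 4, 5, 7, 8, 9}

def is_consonant(note_a: int, note_b: int) -> bool:
    """True if the interval between two MIDI note numbers is consonant."""
    return abs(note_a - note_b) % 12 in CONSONANT_CLASSES

def consonance_score(notes: list[int]) -> float:
    """Fraction of note pairs currently sounding that are consonant.

    A value near 1.0 could drive the 'beautiful' visuals, a value near
    0.0 the 'menacing' ones. (The name and mapping are assumptions.)
    """
    pairs = [(a, b) for i, a in enumerate(notes) for b in notes[i + 1:]]
    if not pairs:
        return 1.0
    return sum(is_consonant(a, b) for a, b in pairs) / len(pairs)
```

For example, a C major triad (MIDI 60, 64, 67) scores 1.0, while C against F# (60, 66), a tritone, scores 0.0.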


The interface is made of laser-etched plywood, with custom-cut copper panels arranged like a keyboard and used as capacitive touch sensors. As consonant intervals are played, trees spring up on the screen, the sun shines, and a rainbow appears. If the user’s melody becomes more dissonant, the trees shrivel, and the forest is set aflame.

The capacitive readings are captured by an Arduino and passed through a chain of Processing, OSC, Max/MSP, and Ableton Live, finally driving the 3D scene in Unity3D.
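The exact patching between those stages isn't documented, but the OSC messages that glue such a chain together are simple to construct. As an illustration (the address `/consonance` and the value are made up, not from the project), here is a minimal single-float OSC encoder using only the Python standard library:

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, per the OSC 1.0 spec."""
    return data + b"\x00" * (4 - len(data) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Encode an OSC message carrying one float argument."""
    return (osc_pad(address.encode("ascii"))  # e.g. "/consonance" (illustrative)
            + osc_pad(b",f")                  # type tag string: one float
            + struct.pack(">f", value))       # big-endian 32-bit float

packet = osc_message("/consonance", 1.0)
# The packet can be sent to any OSC listener over plain UDP, e.g.:
#   import socket
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(packet, ("127.0.0.1", 8000))
```

In practice a library such as Max/MSP's built-in `udpreceive`/`udpsend` objects handles this framing for you; the point here is only that each hop in the chain is exchanging small, self-describing packets like this one.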

Michael Colombo

In addition to being an online editor for MAKE Magazine, Michael Colombo works in fabrication, electronics, sound design, music production, and performance (Yes. All that.) In the past he has also been a children's educator and entertainer, and he holds a master's degree from NYU's Interactive Telecommunications Program.

