Mike Rotondo created an instrument using an Arduino, Max/MSP and some sensors for a Physical Interaction Design workshop at Stanford’s CCRMA. As he puts it: “When the guitar is too sexy and the piano makes too much sense, it’s time to stick a bunch of sensors to your body with electrical tape and plug it all into your computer!”
How it works:
There’s a small electret microphone attached to my foot, and two bend sensors on my arm: one at my elbow and one on my finger. The signals from each of these are routed through an Arduino microcontroller into a Max/MSP patch.
The microphone output is routed through a percussion follower, and impulses (like stomps) trigger the instrument’s tone generation. The sound of the instrument is created by a plucked string model and some ADSR’d harmonized sine waves. If the impulse picked up by the mic is heavy in high frequencies, the sound is captured and “granulated”: replayed in short fragments at random intervals.