My friend Atau Tanaka, experimental musician, media artist, and director of the Culture Lab at Newcastle University, has tweaked some iPhones to transform them into gestural musical instruments. Here he performs with Adam Parkinson:
In a duo, with an iPhone in each hand, they create a kind of four-hands iPhone chamber music. The accelerometers that typically serve as tilt sensors for rotating photos instead capture the performers' free-space gestures with high precision. The multitouch screen, otherwise used for scrolling and pinch-zooming text, becomes a reconfigurable graphical user interface akin to the JazzMutant Lemur, with programmable faders, buttons, and 2D controllers that adjust synthesis parameters in real time. All of this drives open source Pure Data (Pd) patches running inside the free RjDj iPhone app. A single advanced granular synthesis patch becomes the process by which a battery of sounds from the natural world is stretched, frozen, scattered, and restitched. The fact that all system components (sensor input, signal processing and sound synthesis, and audio output) are embodied in a single device makes it very different from the typical controller-plus-laptop model for digital music performance.
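For readers curious what "stretched, frozen, scattered, and restitched" means in granular terms, here is a minimal sketch of time-stretching granular synthesis in Python with NumPy. It is not Tanaka's Pd patch, just an illustration of the general technique: short windowed grains are read from the source at a slowed rate (stretching the sound) and scattered slightly in time before being overlapped back together. All names and parameter values are invented for the example.

```python
import numpy as np

def granulate(signal, sr=44100, grain_ms=50, stretch=4.0, jitter_ms=10, seed=0):
    """Time-stretch a signal by overlap-adding short windowed grains.

    stretch > 1 reads the source more slowly than it writes the output,
    lengthening the sound; jitter scatters each grain's read position,
    loosely mirroring the stretch/freeze/scatter operations of a
    granular synthesis patch. Illustrative sketch, not a real-time DSP.
    """
    rng = np.random.default_rng(seed)
    grain = int(sr * grain_ms / 1000)         # grain length in samples
    hop_out = grain // 2                      # 50% overlap on output
    hop_in = max(1, int(hop_out / stretch))   # slower read -> longer output
    jitter = int(sr * jitter_ms / 1000)
    window = np.hanning(grain)                # fade each grain in and out

    n_grains = (len(signal) - grain) // hop_in
    out = np.zeros(n_grains * hop_out + grain)
    for i in range(n_grains):
        # read position: advance slowly through the source, plus random scatter
        src = i * hop_in + rng.integers(-jitter, jitter + 1)
        src = int(np.clip(src, 0, len(signal) - grain))
        dst = i * hop_out
        out[dst:dst + grain] += signal[src:src + grain] * window
    return out

# One second of a 440 Hz tone stretched to roughly four times its length:
sr = 8000
t = np.arange(sr) / sr
source = np.sin(2 * np.pi * 440 * t)
stretched = granulate(source, sr=sr, stretch=4.0)
```

Setting `stretch` very high while shrinking `hop_in` toward zero approaches the "frozen" effect, where the patch hovers over a single instant of the source sound.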