Making gestural music with the iPhone

Gareth Branwyn

Gareth Branwyn is a freelance writer and the former Editorial Director of Maker Media. He is the author or editor of over a dozen books on technology, DIY, and geek culture. He is currently a contributor to Boing Boing, Wink Books, and Wink Fun. And he has a new best-of writing collection and "lazy person's memoir," called Borg Like Me.

My friend Atau Tanaka, experimental musician, media artist, and director of the Culture Lab at Newcastle University, has tweaked some iPhones to transform them into gestural musical devices. Here he performs with Adam Parkinson:

In a duo, with one iPhone in each hand, they create a kind of four-hands iPhone chamber music. The accelerometers that typically serve as tilt sensors to rotate photos in fact allow high-precision capture of the performers' free-space gestures. The multitouch screen, otherwise used for scrolling and pinch-zooming text, becomes a reconfigurable graphical user interface akin to the JazzMutant Lemur, with programmable faders, buttons, and 2D controllers that adjust synthesis parameters in real time. All of this drives open source Pure Data (Pd) patches running inside the free RjDj iPhone app. A single advanced granular synthesis patch becomes the process by which a battery of sounds from the natural world is stretched, frozen, scattered, and restitched. The fact that all system components (sensor input, signal processing and sound synthesis, and audio output) are embodied in a single device makes it very different from the typical controller-plus-laptop model for digital music performance.
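If you want to play with the basic idea yourself, here's a minimal sketch of mapping accelerometer tilt onto a sound-processing parameter on the iPhone. To be clear, this is not Atau's setup: his rig runs Pure Data patches inside RjDj, whereas this sketch uses Apple's CoreMotion and AVAudioEngine APIs, and the GestureSynth class and its tilt-to-rate mapping are invented purely for illustration.

```swift
import CoreMotion
import AVFoundation

// Hypothetical illustration of a gesture-to-synthesis mapping:
// device tilt (accelerometer x-axis) controls the playback rate of a sound file.
// Atau's actual performance system uses Pd patches in RjDj, not this code.
final class GestureSynth {
    private let motion = CMMotionManager()
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private let varispeed = AVAudioUnitVarispeed()

    func start(with file: AVAudioFile) throws {
        // Wire up: player -> varispeed (rate control) -> main mixer.
        engine.attach(player)
        engine.attach(varispeed)
        engine.connect(player, to: varispeed, format: file.processingFormat)
        engine.connect(varispeed, to: engine.mainMixerNode, format: file.processingFormat)
        try engine.start()

        player.scheduleFile(file, at: nil)
        player.play()

        // Poll the accelerometer ~60 times a second and map x-axis tilt
        // (roughly -1...1 g) onto playback rate, from half speed to double speed.
        motion.accelerometerUpdateInterval = 1.0 / 60.0
        motion.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let x = data?.acceleration.x else { return }
            let tilt = max(-1.0, min(1.0, x))
            self?.varispeed.rate = Float(pow(2.0, tilt)) // 0.5x ... 2.0x
        }
    }
}
```

The exponential mapping (pow(2, tilt)) keeps the neutral, flat-on-the-table position at normal speed and gives symmetrical half-speed/double-speed extremes at full tilt; a real granular patch like the one described above would expose many more parameters (grain size, scatter, freeze) to the same kind of gesture data.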

Atau and Adam