Making gestural music with the iPhone

Computers & Mobile

My friend Atau Tanaka, an experimental musician, media artist, and director of the Culture Lab at Newcastle University, has tweaked some iPhones to transform them into gestural musical devices. Here he performs with Adam Parkinson:

Performing as a duo, with an iPhone in each hand, they create a kind of chamber music: four-hands iPhone. The accelerometers that typically serve as tilt sensors to rotate photos in fact allow high-precision capture of the performers' free-space gestures. The multitouch screen, otherwise used for scrolling and pinch-zooming text, becomes a reconfigurable graphical user interface akin to the JazzMutant Lemur, with programmable faders, buttons, and 2D controllers that adjust synthesis parameters in real time. All of this drives open source Pure Data (Pd) patches running inside the free RjDj iPhone app. A single advanced granular synthesis patch becomes the process by which a battery of sounds from the natural world is stretched, frozen, scattered, and restitched. Because all of the system's components (sensor input, signal processing and sound synthesis, and audio output) are embodied in a single device, the setup is very different from the typical controller-plus-laptop model for digital music performance.
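The actual performance runs Pd patches inside RjDj, which aren't shown here. But the core idea of granular synthesis, chopping a source sound into short windowed grains and scattering them back into an output buffer, with a gesture parameter steering grain size and scatter, can be sketched in a few lines of Python. The `tilt` parameter below is a hypothetical stand-in for one accelerometer axis; the mappings (tilt shrinks grains and widens the scatter) are illustrative assumptions, not Tanaka's patch.

```python
import math
import random

def hann(n):
    """Hann window: tapers each grain to avoid clicks at its edges."""
    return [0.5 - 0.5 * math.cos(2 * math.pi * i / (n - 1)) for i in range(n)]

def granulate(source, tilt, out_len=8000, seed=0):
    """Overlap-add windowed grains of `source` into an output buffer.

    `tilt` (0.0-1.0) stands in for an accelerometer axis:
    more tilt means shorter grains and wider random scatter,
    turning a steady sound into a stretched, frozen, or
    scattered texture. Fixed seed keeps the output repeatable.
    """
    rng = random.Random(seed)
    grain_len = int(2000 - 1500 * tilt)   # assumed mapping: tilt -> shorter grains
    spread = int(len(source) * tilt)      # assumed mapping: tilt -> wider scatter
    window = hann(grain_len)
    hop = grain_len // 2                  # 50% overlap between grains
    out = [0.0] * out_len
    pos = 0                               # write position in output
    read = 0                              # nominal read position in source
    while pos + grain_len <= out_len:
        # Pick a grain start near `read`, jittered by up to +/- spread samples.
        start = max(0, min(len(source) - grain_len,
                           read + rng.randint(-spread, spread)))
        for i in range(grain_len):
            out[pos + i] += source[start + i] * window[i]
        pos += hop
        read = (read + hop) % (len(source) - grain_len)
    return out
```

With `tilt` near 0 the grains are long and read positions advance steadily, so the source plays back nearly intact; pushing `tilt` toward 1 shatters it into short fragments drawn from random positions, which is the "frozen and scattered" effect described above. A real-time version would stream grains to an audio callback rather than fill a buffer.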

Atau and Adam


Gareth Branwyn is a freelance writer and the former Editorial Director of Maker Media. He is the author or editor of over a dozen books on technology, DIY, and geek culture. He is currently a contributor to Boing Boing, Wink Books, and Wink Fun. His free weekly-ish maker tips newsletter can be found at garstipsandtools.com.

