Skinput: projecting a UI onto your own body


It’s hard to give Carnegie Mellon PhD student Chris Harrison’s Skinput a fair shake without automatically assuming it’s a variant of Pranav Mistry’s Sixth Sense project. Nevertheless, it employs different tech: Mistry’s Sixth Sense is optical, while Skinput uses a “novel, non-invasive, wearable bio-acoustic sensor” to track your gestures.

We present Skinput, a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as a finger input surface. In particular, we resolve the location of finger taps on the arm and hand by analyzing mechanical vibrations that propagate through the body. We collect these signals using a novel array of sensors worn as an armband. This approach provides an always-available, naturally-portable, and on-body interactive surface. To illustrate the potential of our approach, we developed several proof-of-concept applications on top of our sensing and classification system.
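The paper describes sensing vibrations from finger taps and feeding them into a classification system that resolves which body location was tapped. As a rough illustration of that idea (this is a hypothetical sketch, not the authors' code — the feature choice and classifier here are stand-ins), one could reduce each tap to per-sensor signal energies and label it by the nearest centroid learned from training taps:

```python
# Hypothetical sketch of Skinput-style tap classification (not the authors'
# actual pipeline). Each armband sensor yields a vibration waveform for a tap;
# we reduce the tap to a feature vector (mean signal energy per sensor channel)
# and classify by nearest centroid among per-location averages learned from
# labeled training taps.
import math

def features(waveforms):
    """One mean-energy value per sensor channel."""
    return [sum(s * s for s in w) / len(w) for w in waveforms]

def train(examples):
    """examples: list of (location_label, waveforms). Returns label -> centroid."""
    sums, counts = {}, {}
    for label, waveforms in examples:
        f = features(waveforms)
        acc = sums.setdefault(label, [0.0] * len(f))
        for i, v in enumerate(f):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def classify(centroids, waveforms):
    """Return the location label whose centroid is closest in feature space."""
    f = features(waveforms)
    return min(centroids, key=lambda label: math.dist(f, centroids[label]))
```

The real system presumably uses richer acoustic features and a more capable classifier, but the shape of the problem — labeled tap signals in, body location out — is the same.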

Created with Desney Tan and Dan Morris of Microsoft Research. Harrison will formally present the project at CHI 2010 this April.

[via Core77]


My interests include writing, electronics, RPGs, scifi, hackers & hackerspaces, 3D printing, building sets & toys. @johnbaichtal

View more articles by John Baichtal