Skinput: projecting a UI onto your own body

Wearables

It’s hard to give Carnegie Mellon PhD student Chris Harrison’s Skinput a fair shake without automatically assuming it’s a variant of Pranav Mistry’s Sixth Sense project. The two employ very different tech, though: Mistry’s Sixth Sense is optical, while Skinput uses a “novel, non-invasive, wearable bio-acoustic sensor” to track your gestures. From the project abstract:

We present Skinput, a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as a finger input surface. In particular, we resolve the location of finger taps on the arm and hand by analyzing mechanical vibrations that propagate through the body. We collect these signals using a novel array of sensors worn as an armband. This approach provides an always-available, naturally-portable, and on-body interactive surface. To illustrate the potential of our approach, we developed several proof-of-concept applications on top of our sensing and classification system.
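
For the curious, here’s a rough idea of what a “sensing and classification” pipeline like the one the abstract describes might look like. This Python sketch is purely illustrative and is not Harrison, Tan, and Morris’s code: the sample rate, channel count, features (per-channel peak amplitude and energy), and nearest-centroid classifier are all assumptions made up for this example.

```python
# Hypothetical sketch of a tap-location classifier for an armband of
# vibration sensors. Not the actual Skinput implementation.
import numpy as np

FS = 5500          # assumed samples per second from each sensor channel
N_CHANNELS = 10    # assumed number of sensing elements in the armband


def extract_features(window: np.ndarray) -> np.ndarray:
    """Reduce a (N_CHANNELS, samples) tap window to a small feature vector:
    per-channel peak amplitude and per-channel signal energy."""
    peaks = np.abs(window).max(axis=1)
    energies = (window ** 2).sum(axis=1)
    return np.concatenate([peaks, energies])


class NearestCentroidTapClassifier:
    """Assigns a tap to the location whose average (centroid) feature
    vector is closest to the new tap's features."""

    def fit(self, features: np.ndarray, labels: np.ndarray):
        self.classes_ = np.unique(labels)
        self.centroids_ = np.stack(
            [features[labels == c].mean(axis=0) for c in self.classes_]
        )
        return self

    def predict(self, feature_vec: np.ndarray) -> str:
        dists = np.linalg.norm(self.centroids_ - feature_vec, axis=1)
        return self.classes_[dists.argmin()]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    locations = ["thumb", "index", "pinky"]

    # Fake calibration data: 20 simulated tap windows per location.
    train_feats, train_labels = [], []
    for i, loc in enumerate(locations):
        for _ in range(20):
            window = rng.normal(scale=1.0 + 0.5 * i, size=(N_CHANNELS, FS // 10))
            train_feats.append(extract_features(window))
            train_labels.append(loc)

    clf = NearestCentroidTapClassifier().fit(
        np.array(train_feats), np.array(train_labels)
    )

    # Classify one new simulated tap.
    new_tap = rng.normal(scale=1.5, size=(N_CHANNELS, FS // 10))
    print("Predicted tap location:", clf.predict(extract_features(new_tap)))
```

The real system would obviously train on taps recorded from the armband’s sensors rather than on random noise, but the shape of the problem is the same: turn each tap’s vibration window into a feature vector, then match it against the locations learned during a short calibration phase.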

Harrison created Skinput with Desney Tan and Dan Morris of Microsoft Research, and he will formally present the project at CHI 2010 this April.

[via Core77]
