Hacking on the Frontier of Gestural Input

When the Microsoft Kinect game controller came onto the scene in 2010, the maker community immediately saw its potential as an input device for 3D scanning, gestural input, and spatial recognition. Initially, Microsoft didn’t like the idea of people hacking their product, but that didn’t stop the most eager among us from successfully cracking the device’s protocol so that it could be used in homebrew projects. Microsoft eventually backpedaled on their position and even signaled their acceptance of the idea by releasing their own SDK. Now it’s a frequently used tool in the maker arsenal and we still see creative and inventive projects that use the Kinect.

If you’re familiar with the specifics of the Kinect, you know that it uses some sophisticated hardware and software to create depth maps of a space, and then optionally takes that information to track human bodies within that space. It allows you to use your entire body to control whatever you want, be it a robot, a mouse pointer, a digital puppet, and so on. And while it is possible to use the Kinect to track the position of each of your fingers, it’s much better suited to whole-body tracking. “We’ve seen very few 3D sensing tools that focus on the details of one’s hands,” said Robbie Tilton, a user interface prototyper. “Transitioning from thinking about larger gestural movements to smaller and potentially more intricate hand movements definitely comes with a new set of ideas.”

Robbie and other developers are just now getting access to a new product that brings hand tracking to a whole new level. The Leap Motion Controller was first announced in a video last May and it immediately went viral.

They want “people to go far and wide and do whatever they can with it.”

Since then, they’ve been allowing third-party developers—including hackers and hobbyists—to try out the hardware and work with their SDK, taking a decidedly different tack than Microsoft initially took with the Kinect. At CES, I was able to take a closer look at the device and hear what the company thought of makers working with their product. Michael Zagorsek, the company’s Vice President of Product Marketing, said that they want “people to go far and wide and do whatever they can with it.”

The company offered me a unit to try out myself and gave me access to their developer portal. The SDK download includes the software drivers, libraries, a test application, and sample code for a few different programming languages including Java, C++, and Python. After connecting the controller via USB and launching the test application, it took a few adjustments before I had it working reliably. I didn’t realize how wide the field of view of the Leap Motion is, so my chin was being picked up by the sensor. According to my developer contact at Leap Motion, it’s likely that a differentiation can be made between fingers and chins within the software in the future. Right now, for instance, it can differentiate between fingers and tools like pens.

It was fairly easy to get the Python sample code running and get values for the number of hands and fingers in the view of the Leap Motion Controller. It can also get the angle of the hands and the position of the fingers in 3D space. I haven’t taken things much further than the example code, but I’m eager to continue working with the technology.
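To give a sense of what that sample code looks like, here’s a minimal sketch in the spirit of the SDK’s Python example. It polls a single frame and prints hand and finger data; the class and attribute names (Leap.Controller, frame.hands, finger.tip_position, and so on) are from the SDK version I was given and may change in later releases.

    import time
    import Leap  # Python bindings bundled with the Leap Motion SDK

    def main():
        controller = Leap.Controller()
        time.sleep(1)  # give the controller a moment to connect over USB

        frame = controller.frame()  # the most recent tracking frame
        print("Hands: {}, Fingers: {}".format(len(frame.hands), len(frame.fingers)))

        for hand in frame.hands:
            # pitch and roll come back in radians, so convert for readability
            print("Hand pitch: {:.1f} deg, roll: {:.1f} deg".format(
                hand.direction.pitch * Leap.RAD_TO_DEG,
                hand.palm_normal.roll * Leap.RAD_TO_DEG))
            for finger in hand.fingers:
                tip = finger.tip_position  # millimeters, relative to the device
                print("Fingertip at ({:.0f}, {:.0f}, {:.0f})".format(tip.x, tip.y, tip.z))

    if __name__ == "__main__":
        main()

Run with a hand over the device, this prints a line per fingertip each time you invoke it; the SDK’s own sample instead registers a listener and reacts to frames as they arrive, which is the better approach for anything interactive.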

[Video: Stephane Bersot demonstrates his Leap Motion-controlled digital instruments]

Despite the fact that only a limited group of people have been given access to this powerful new tool, there are still quite a few impressive examples of what’s being done with the Leap Motion. Earlier this month, we posted about Stephane Bersot, who used his controller to alter the sound of his digital musical instruments, as demonstrated in the video above. A Leap Motion orchestra formed during San Francisco Music Hack Day 2013, which concluded with a live performance of the instruments that were created within 24 hours. Of course, there’s more to the device than just making instruments. For example, a group of students from the University of Pennsylvania started working on a project that uses the controller to translate sign language into text.

“Think, for example, of a small device comprised of a Leap and a Raspberry Pi allowing users to control objects in physical space with no visible electronics. It would be magical!”

I contacted Scott Garner, a creative technologist who has been experimenting with the Leap Motion for his own work. One of his projects is a web-based marionette that the user can control with his or her fingers. “Because of the nature of the device and the way it has been presented, most of my initial ideas were for screen-based interactions,” he said. “I think there are other opportunities, though, for headless operations in which gestures cause results in the real world. Think, for example, of a small device comprised of a Leap and a Raspberry Pi allowing users to control objects in physical space with no visible electronics. It would be magical!”
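Just to make the idea concrete, here’s a hypothetical sketch of that kind of headless setup: it watches the Leap’s frame data and flips a GPIO pin on the Pi whenever a hand is over the sensor. It assumes the Leap’s Python bindings and the RPi.GPIO library could run side by side on the same board, which isn’t something Leap Motion has announced; the pin number and polling loop are purely illustrative.

    import time
    import Leap              # Leap Motion SDK Python bindings (assumed present)
    import RPi.GPIO as GPIO  # standard Raspberry Pi GPIO library

    LAMP_PIN = 17  # hypothetical pin driving a relay or LED

    def main():
        GPIO.setmode(GPIO.BCM)
        GPIO.setup(LAMP_PIN, GPIO.OUT)
        controller = Leap.Controller()

        try:
            while True:
                frame = controller.frame()
                # Lamp on whenever at least one hand is in view, off otherwise
                GPIO.output(LAMP_PIN, len(frame.hands) > 0)
                time.sleep(0.05)
        finally:
            GPIO.cleanup()

    if __name__ == "__main__":
        main()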

With all the excitement around hand tracking, it’s no surprise that Leap Motion isn’t the only company getting into the game. Code Laboratories in Las Vegas revealed a product called Duo, which is similar to the Leap Motion Controller. One main difference is their plan to offer the device as a DIY kit. It’s not yet available, but they’re currently in the process of launching a Kickstarter. According to their site, “a successful Kickstarter project will allow us to provide the DUO as a DIY kit that you can build on your own and customize it in any way you choose.”

And while the Leap Motion isn’t yet available on store shelves, the company is accepting developer applications; an accepted application gets you access to the SDK and a developer unit to experiment with. Their developer program isn’t limited to professionals, so if you have an idea for how you’d use the Leap Motion, it’s worth filling out an application. The company has also been hinting at an upcoming announcement regarding a ship date for consumers who are interested in buying the product. Whether you already have a Leap Motion or are eager to experiment with one, we’d like to know how you’d use hand tracking in your projects. Leave a comment below with your ideas.

20 thoughts on “Hacking on the Frontier of Gestural Input”

  1. Paul Meeks says:

    So many possibilities here. Maybe interconnected with Dragon ???

  2. Brent Hannah says:

    Wet Blanket here. With all the ridiculous patents out there for stuff like a rectangle shape for a tablet etc. Are the patent trolls just waiting to pounce on the first successful hand gesture device?

  3. Mageca Labs (@MagecaLabs) says:

    Very good article! Well done

  4. Mike M says:

    Paul, when you say “Dragon”, you mean voice recognition? You do know that a gesture plus voice recognition camera has been introduced by Intel/Creative Labs called the Perceptual Computing Camera? It is like a mini-Kinect (full 3D), so (for example) the sign language gesturing on it is quite a bit better than Leap’s: http://www.youtube.com/watch?v=T9Fp1PanPXs

    1. Paul says:

      :) No! I did not know that, but I do now… Thanks man!

  5. Omek Interactive says:

    Interesting article on recent developments in gesture!

    Omek Interactive is another player in the emerging gesture recognition field (founded in 2006). Our Grasp SDK is specifically designed for close-range hand & finger gesture recognition and motion sensing. We are gearing up to release a beta version of our solution, which offers a full skeleton model of the hand. In the meantime, we’re writing on our blog about how to create compelling user experiences for gesture-based interfaces. You can check it out here: http://omekinteractive.com/blog

  6. flared0ne says:

    Oddly enough, the “Duo” is highly unlikely to get “Kickstarter” to accept their proposal, since Kickstarter has restrictions against violating patents and copyrights, which the people behind the “Duo” have done and are doing. In particular, by basically copying wholesale the original Leap Motion website, up to and including the demo video. Their attempt to duplicate the Leap Motion technology, however, shows no sign of including “the secret sauce”, the intellectual property which is the reason Leap Motion has received millions in investments already. What Code Laboratories is offering to hackers and DIY’ers and open-source advocates is “hope”; they do apparently understand social engineering.

    1. N. says:

      flaredOne: Do you really think making serious but unsubstantiated accusations as a blog comment is smart?

      Are you associated with Leap? If not, how do you know what copyrights of Leap the Duo people have or haven’t infringed? And if you are associated with Leap, does your employer know that you are making potentially libelous statements? Do *you* know that you are making potentially libelous statements?

      Your comment about “secret sauce” is also interesting. If Duo doesn’t have the “secret sauce” that’s key to all this, then what’s the problem? Unless the Leap secret sauce had a “Best By” date and they kept it in the cupboard until it, apparently, spoiled?

      I have no axe to grind in this situation (except that I’m happy that innovative alternative input methods are becoming available) but your statements here seem, at best, misguided, and at worst idiotic.

  7. flared0ne says:

    All cliché “handwaving” aside, THE applications which are going to nail down the viability of gestural interfaces are the ones which insurance companies are going to start endorsing as “covered by your insurance plan”, as “medically necessary” for people suffering from repetitive stress injuries, carpal tunnel syndrome, etc. These are going to include the ability to fully control a personal computer without lifting off the wrist-rests — changing finger extension without moving the wrists, lifting the finger tips above the keyboard into an easily recognizable position triggering evaluation of “mouse control inputs”. The need to reach away from the keyboard to grab a mouse is about to join the passenger pigeon and the dodo.

  8. Fred P says:

    Btw, the DUO has officially launched their Kickstarter campaign (http://www.kickstarter.com/projects/codelabs/duo-the-worlds-first-diy-3d-sensor/dashboard) today and they already raised 13% of their goal… Way to go DUO team!!

Comments are closed.


Matt Richardson is a San Francisco-based creative technologist and Contributing Editor at MAKE. He’s the co-author of Getting Started with Raspberry Pi and the author of Getting Started with BeagleBone.

