Hacking on the Frontier of Gestural Input

When the Microsoft Kinect game controller came onto the scene in 2010, the maker community immediately saw its potential as an input device for 3D scanning, gestural input, and spatial recognition. Initially, Microsoft didn’t like the idea of people hacking its product, but that didn’t stop the most eager among us from successfully cracking the device’s protocol so that it could be used in homebrew projects. Microsoft eventually backpedaled and even signaled its acceptance of the idea by releasing its own SDK. Now the Kinect is a frequently used tool in the maker arsenal, and we still see creative and inventive projects that use it.

If you’re familiar with the specifics of the Kinect, you know that it uses sophisticated hardware and software to create depth maps of a space, and can then use that information to track human bodies within that space. It lets you use your entire body to control whatever you want, be it a robot, a mouse pointer, a digital puppet, and so on. And while it is possible to use the Kinect to track the position of each of your fingers, it’s much better suited to whole-body tracking. “We’ve seen very few 3D sensing tools that focus on the details of one’s hands,” said Robbie Tilton, a user interface prototyper. “Transitioning from thinking about larger gestural movements to smaller and potentially more intricate hand movements definitely comes with a new set of ideas.”

Robbie and other developers are just now getting access to a new product that brings hand tracking to a whole new level. The Leap Motion Controller was first announced in a video last May and it immediately went viral.

They want “people to
go far and wide and
do whatever they
can with it.”

Since then, they’ve been allowing third-party developers—including hackers and hobbyists—to try out the hardware and work with their SDK, taking a decidedly different tack than Microsoft initially took with the Kinect. At CES, I was able to take a closer look at the device and hear what the company thought of makers working with their product. Michael Zagorsek, the company’s Vice President of Product Marketing, said that they want “people to go far and wide and do whatever they can with it.”

The company offered me a unit to try out myself and gave me access to their developer portal. The SDK download includes the software drivers, libraries, a test application, and sample code for a few different programming languages, including Java, C++, and Python. After connecting the controller via USB and launching the test application, it took a few adjustments before I had it working reliably. I didn’t realize how wide the Leap Motion’s field of view is, so my chin was being picked up by the sensor. According to my developer contact at Leap Motion, it’s likely that the software will be able to differentiate between fingers and chins in the future. Right now, for instance, it can differentiate between fingers and tools like pens.

It was fairly easy to get the Python sample code running and read out the number of hands and fingers in the view of the Leap Motion Controller. It can also report the angle of the hands and the positions of the fingers in 3D space. I haven’t taken things much further than the example code, but I’m eager to keep working with the technology.
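To give a feel for the kind of data the SDK’s Python sample exposes, here’s a rough, hardware-free sketch. The Frame-like structures below are illustrative stand-ins, not the actual classes from the Leap module, since running the real sample requires the device and drivers:

```python
# Hardware-free sketch of the data the Leap Motion Python sample reports:
# hands, fingers, and 3D tip positions (millimeters, in the sensor's frame).
# Hand and Finger here are mock stand-ins for the SDK's own classes.
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class Finger:
    tip_position: Tuple[float, float, float]  # (x, y, z)


@dataclass
class Hand:
    fingers: List[Finger]


def summarize(hands: List[Hand]):
    """Return (hand count, finger count, average fingertip position)."""
    tips = [f.tip_position for hand in hands for f in hand.fingers]
    if not tips:
        return len(hands), 0, None
    # Average each axis across all visible fingertips.
    avg: Optional[Tuple[float, ...]] = tuple(
        sum(t[i] for t in tips) / len(tips) for i in range(3)
    )
    return len(hands), len(tips), avg


hand = Hand(fingers=[Finger((0.0, 100.0, 0.0)), Finger((20.0, 110.0, -10.0))])
print(summarize([hand]))  # → (1, 2, (10.0, 105.0, -5.0))
```

In the real SDK you would pull a frame from the controller each update and walk its hands and fingers the same way; the summary above is just one example of turning that stream into something a project can act on.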

[Embedded video: Stephane Bersot demonstrates controlling his digital instruments with the Leap Motion.]

Despite the fact that only a limited group of people has been given access to this powerful new tool, there are still quite a few impressive examples of what’s being done with the Leap Motion. Earlier this month, we posted about Stephane Bersot, who used his controller to alter the sound of his digital musical instruments, as demonstrated in the video above. A Leap Motion orchestra formed during San Francisco Music Hack Day 2013, which concluded with a live performance of the instruments created within 24 hours. Of course, there’s more to the device than making instruments. For example, a group of students from the University of Pennsylvania started working on a project that uses the controller to translate sign language into text.

“Think, for example, of a small
device comprised of a Leap and
a Raspberry Pi allowing users to
control objects in physical space
with no visible electronics.
It would be magical!”

I contacted Scott Garner, a creative technologist who has been experimenting with the Leap Motion for his own work. One of his projects is a web-based marionette that the user can control with his or her fingers. “Because of the nature of the device and the way it has been presented, most of my initial ideas were for screen-based interactions,” he said. “I think there are other opportunities, though, for headless operations in which gestures cause results in the real world. Think, for example, of a small device comprised of a Leap and a Raspberry Pi allowing users to control objects in physical space with no visible electronics. It would be magical!”

With all the excitement around hand tracking, it’s no surprise that Leap Motion isn’t the only company getting into the game. Code Laboratories in Las Vegas revealed a product called Duo, which is similar to the Leap Motion Controller. One main difference is their plan to offer the device as a DIY kit. It’s not yet available, but they’re currently in the process of launching a Kickstarter. According to their site, “a successful Kickstarter project will allow us to provide the DUO as a DIY kit that you can build on your own and customize it in any way you choose.”

And while the Leap Motion isn’t yet available on store shelves, the company is accepting developer applications, which get you access to the SDK and a developer unit to experiment with. The developer program is not limited to professionals, so if you have an idea for how you’d use the Leap Motion, it’s worth filling out an application. The company has also been hinting at an upcoming announcement of a ship date for consumers interested in buying the product. Whether you already have a Leap Motion or are eager to experiment with one, we’d like to know how you’d use hand tracking in your projects. Leave a comment below with your ideas.


Matt Richardson is a San Francisco-based creative technologist and Contributing Editor at MAKE. He’s the co-author of Getting Started with Raspberry Pi and the author of Getting Started with BeagleBone.
