
The SixthSense project from the MIT Media Lab aims to seamlessly integrate digital information with our everyday physical world.
The hardware components are housed in a pendant-like wearable device; both the projector and the camera are connected to a mobile computing device in the user's pocket. The projector projects visual information onto surfaces, walls, and physical objects around the user, turning them into interfaces, while the camera recognizes and tracks the user's hand gestures and physical objects using computer-vision techniques.
The software processes the video stream captured by the camera and tracks the locations of colored markers (visual tracking fiducials) on the tips of the user's fingers using simple computer-vision techniques. The movements and arrangements of these fiducials are interpreted as gestures that act as interaction instructions for the projected application interfaces. The maximum number of tracked fingers is constrained only by the number of unique fiducials, so SixthSense also supports multi-touch and multi-user interaction.
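Since the software side boils down to tracking colored fingertip markers in a video stream, here's a rough sketch of how that stage could look with OpenCV in Python. To be clear, this isn't the project's own code: the marker colors, HSV thresholds, and the webcam standing in for the pendant camera are all assumptions, and a real gesture recognizer would sit on top of the returned positions.

```python
# Color-marker fingertip tracking in the spirit of SixthSense's fiducials.
# HSV ranges are placeholders and would need tuning for real markers/lighting.
import cv2
import numpy as np

MARKER_RANGES = {
    "red":  ((0, 120, 70), (10, 255, 255)),
    "blue": ((100, 150, 50), (130, 255, 255)),
}

def track_markers(frame):
    """Return {marker_name: (x, y)} centroids for each detected fiducial."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    positions = {}
    for name, (lo, hi) in MARKER_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            continue
        largest = max(contours, key=cv2.contourArea)
        m = cv2.moments(largest)
        if m["m00"] > 0:
            positions[name] = (int(m["m10"] / m["m00"]),
                               int(m["m01"] / m["m00"]))
    return positions

cap = cv2.VideoCapture(0)  # webcam stands in for the pendant camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # A real system would feed these positions into a gesture recognizer;
    # here we just draw them for inspection.
    for name, (x, y) in track_markers(frame).items():
        cv2.circle(frame, (x, y), 8, (0, 255, 0), 2)
        cv2.putText(frame, name, (x + 10, y), cv2.FONT_HERSHEY_SIMPLEX,
                    0.5, (0, 255, 0), 1)
    cv2.imshow("fiducial tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

Tracking one blob per color like this caps you at a handful of fingers; the project's point about unique fiducials is exactly that adding more distinguishable marker colors is what scales it to multi-touch and multi-user input.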
Though still very much in development, the device seems quite effective, using relatively little interface hardware: a camera, a projector, and gestural markers. The number of potential applications is a bit overwhelming. Imagine having the datasheet for the chip you're working with automatically displayed in front of you, all without putting down your soldering iron ;)
8 thoughts on “SixthSense wearable data interface”
That’s one of the coolest things I have seen in a long time.
Instead of wearing things on your fingers and around your neck, better to wear a small watch-like device that reads your hand gestures at the source and can interface w/ your computer, entertainment center or gaming console.
how to make my sixth sense perfect
I’d love to make one of these and play with it. How could I do it? Where can we get info on where to get the components?