
Jeff Han gave an amazing talk and demo at ETech – pictured here, an LCARS-style multi-touch interface!
While touch sensing is commonplace for single points of contact, multi-touch systems enable a user to interact with a system with more than one finger at a time, allowing for the use of both hands along with chording gestures. These kinds of interactions hold tremendous potential for advances in efficiency, usability, and intuitiveness. Multi-touch systems are also inherently able to accommodate multiple users simultaneously, which is especially useful for collaborative scenarios such as interactive walls and tabletops.
- Video of multi-touch – Link.
- Photo gallery of the multi-touch gear – Link.
- Transcript of Jeff Han’s talk – Link.
- I shot some video – I was far away, but here it is – Link.
Bonus: I talked to Jeff after his session and I think we’ll have a DIY version on MAKE soon!
50 thoughts on “The Future of Interfaces Is Multi-Touch”
I’m already in the middle of building my own multi-touch screen – IR LEDs arrived today :) Will be awesome to see a Make version!
If you have the appropriate ACM subscription or are prepared to pay $10 to buy it, you can download the paper (4-page PDF) which has more information about the construction here:
http://portal.acm.org/citation.cfm?id=1095034.1095054
I cannot wait to see the Maker version. I want one!
This is something many Mac laptop users are already quite fond of, albeit in a simpler form. New Mac laptop trackpads allow “two finger scrolling”. When using one finger on the trackpad you control the mouse. If you chord on the trackpad with two fingers you can scroll horizontally and vertically.
I too look forward to the Make version, the interfaces seen in Minority Report are getting much closer to a reality!
Screw the Mac trackpad – I’m thinking this is a way to homebrew one of these:
http://www.jazzmutant.com/lemur_overview.php
I’ve downloaded the ACM paper and the tech seems really friggin’ simple. I presume plans of some sort will be forthcoming?
Can you imagine using Photoshop with this? :-)
If this is rear projection, the way it works is simple:
Coloured light (IR?) is shone into the edges of the display and stays trapped inside by total internal reflection. When you put your finger on the surface, it frustrates that reflection (your skin changes the refractive-index boundary at the surface) and an image of your fingertip scatters out the back of the panel. They then have a camera behind the panel to pick up where it’s being touched.
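To put a rough number on that: total internal reflection only holds for light hitting the surface at more than the critical angle, which for acrylic against air works out to about 42 degrees from the normal. A quick back-of-the-envelope check (the refractive indices here are assumed textbook values, not measurements from the actual build):

```python
import math

# FTIR relies on total internal reflection inside the acrylic sheet.
# Light stays trapped as long as it hits the surface beyond the
# critical angle; a fingertip pressing on the surface frustrates the
# reflection and scatters light toward the camera behind the panel.
n_acrylic = 1.49  # assumed typical value for acrylic/plexiglass
n_air = 1.00

critical_angle = math.degrees(math.asin(n_air / n_acrylic))
print(f"Critical angle for acrylic/air: {critical_angle:.1f} degrees")
```

Anything shallower than that bounces along inside the sheet, which is why the LEDs are injected into the edges rather than shone at the face.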
However, what would be really interesting is how they make it thin, like the Jazzmutant.
Bring on the guide!
ch424
Jeff Han’s display isn’t the first or the only way to do multi-touch – the interesting thing is that it’s so simple, scalable, and cheap to build with readily-available materials. The downside is it needs the extra space at the back with a projector and camera. The FTIR technique is essentially what you describe (it’s in the already linked FTIRtouch/FTIRsense pages and the ACM paper) with a few improvements.
Jazzmutant’s Lemur is based on very different tech – a standard LCD screen with a commercially produced touch surface overlaid onto it – since it doesn’t use the FTIR technique it can be as thin as the screen is.
OK, now the only question I have is: once you build one, what software do you use? I’m sure there isn’t a Windows, Linux, insert-your-favourite-OS-here driver out there for this. That’s the one thing stopping me from starting this build this weekend.
Anyone out there feel like taking a stab at writing driver software for this? Or at least modifying some code to take advantage of this kick-ass interface.
I’ve been planning software for a week or two now; first I have to get a simple DirectShow framegrabber routine working (my current camera is an IR-hacked webcam :) then I’m off and racing :)
…who knows, some of the FTIRtouch software might be made available for the Make howto? (hint, hint :)
Wow. The possibilities for making games with this are dazzling. Someone has been thinking about it.
Pelrun, thanks for the ACM link. (As the AmEx commercial says, “membership has its privileges”. For anyone thinking of building one that actually functions, the paper will save you a lot of time.) I’m curious: how did you couple the LEDs to the glass/plexi? I was going to file and polish the ends to be smooth and then fix them with index-matching UV glue. Also, did you source a band-pass IR filter?
Someone above asks about software to “drive” it. I’d opine that that’s what this research project is all about. The author makes the point that the screen is dead simple. *All* of the work is in the software. I will be using DirectShow for the image capture, OpenCV for the computer vision/tracking, and Ogre3D for graphics. While I’ll be working in Visual Studio on XP, Mac wouldn’t be a problem: OpenCV builds on any POSIX system.
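For anyone wondering what that tracking step actually involves, here’s a minimal sketch in plain Python: threshold the IR camera frame, flood-fill each bright connected region, and report its centroid as a touch point. The function name, thresholds, and synthetic test frame are all my own illustration, not Han’s code; a real build would run this per-frame on the live camera image (e.g. via OpenCV, as mentioned above).

```python
def find_touches(frame, thresh=200, min_area=3):
    """frame: 2-D list of brightness values (0-255).
    Returns (x, y) centroids of bright connected blobs (fingertips)."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    touches = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= thresh and not seen[y][x]:
                # Flood-fill this bright region, collecting its pixels.
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy+1, cx), (cy-1, cx), (cy, cx+1), (cy, cx-1)):
                        if 0 <= ny < h and 0 <= nx < w \
                           and frame[ny][nx] >= thresh and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(pixels) >= min_area:  # ignore single-pixel noise
                    cy = sum(p[0] for p in pixels) / len(pixels)
                    cx = sum(p[1] for p in pixels) / len(pixels)
                    touches.append((cx, cy))
    return touches

# Tiny synthetic frame with two bright 2x2 "fingertips".
frame = [[0] * 8 for _ in range(8)]
for y, x in [(1, 1), (1, 2), (2, 1), (2, 2), (5, 5), (5, 6), (6, 5), (6, 6)]:
    frame[y][x] = 255
print(find_touches(frame))  # → [(1.5, 1.5), (5.5, 5.5)]
```

From there, correlating blobs between frames gives you drag gestures, and the blob list maps naturally onto a multi-cursor event stream for whatever toolkit you’re driving.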
I’d be happy to help with software if someone builds me one ;) I’m also experienced with Ogre3D. Is the paper available anywhere that doesn’t require registration? I’m too cheap to spend the $10. What kind of material do they project the screen onto? I assume it’s something that scatters visible light but lets IR pass through.
Pelrun, any luck with your screen? I just got a rough prototype up and “running” and the sensitivity is extremely low. Here’s what I’ve done.
For the moment I’m using my Logitech QuickCam with the lights off ;-) With the gain of my camera jacked up ridiculously high, I’m seeing only very faint glow spots, and even then only within about 2″ (straight out) from an LED. Perhaps I’m not dumping enough light in near the critical angle? And I wonder if elliptical LEDs might be a good idea to get more coverage in the plane?
Thoughts anyone?
Anyone know if these would be useful?:
ebay
Probably you just need more LEDs. Also, maybe the angle is a bit much. I’ve decided to have a go at making a screen too. So far I’ve hacked my webcam and I’m gonna pick up some acrylic tomorrow.
I’ve written up our experiments to date along with a few images. Anyone else having better success?
It struck me tonight that instead (or as well as) boosting the IR LED power, I should really look at the detection end – the webcam. A quick search revealed that webcams typically (universally?) have an IR filter in place! I’m looking forward to removing this tomorrow in the hope that I can more clearly observe the FTIR. dwallin33, is this what you meant by “hacking” your webcam?
Yeah, absolutely – I don’t think you’d be able to see any IR with the filter in there. It’s fairly easy to remove once you get the lens part open (I have a Creative webcam I’m using right now). If you haven’t, check out this page: http://www.hoagieshouse.com/IR/
Why don’t you email me at ruiner33 [_at_] hotmail . com maybe we can collaborate on the software or share some tips at least.
dwallin33, yes, I discovered that page last night. Removing the filter, I’m now getting beautiful FTIR-induced tracking spots. I also noticed what Han reports in his paper – the effect is greatly diminished if the user has dry skin.
Cool, glad to hear it! I just got the projector I ordered off eBay and my LEDs should be arriving any day now. I’ll be posting about my progress on my blog too: http://www.whitenoiseaudio.com/blog/
BTW, I have some ideas about how the software detection could work, if you’re interested.
For all those excited enough about this to look into building their own table (which I am also doing): the best place to go is http://www.nuigroup.com/forums. They call themselves the Natural User Interface Group and they have a lot of information about building and coding for multi-touch interfaces. There are libraries out there with some basic functionality.