Microsoft Research has shared an incredible video of a couple of fascinating prototypes for haptic feedback in VR. Let's just start off by saying that we're all aware these are bulky and imperfect. I'm not sharing this as "the next big thing in VR." Rather, these prototypes are incredible examples of the kind of research being done and how DIY it really can be.

In this video we see two prototypes, NormalTouch and TextureTouch. For the details, you can find their entire research paper here.


This system, NormalTouch, mounts an itty-bitty Stewart platform on the front of a controller. I'm guessing from the name that it uses the surface normals of the polygons your finger touches to determine what you should feel. It physically tilts and pushes against your finger so you can tell whether an item is rigid, soft, moving, etc. It also responds to the force you apply, as you can see when they're moving the ball around.
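The paper describes the actual control scheme; as a rough sketch of the idea as I understand it, the surface normal under the fingertip could be turned into a tilt command for the platform like this (the function name and rest-pose convention are my own assumptions, not from the research):

```python
import math

def normal_to_tilt(nx, ny, nz):
    """Convert a unit surface normal into pitch/roll angles (radians)
    for a small Stewart-style platform under the fingertip.

    Assumption (not from the paper): the platform's rest pose faces
    straight up (+z), so a flat surface yields zero tilt.
    """
    # Pitch tilts the platform along y, roll along x; each is the
    # angle between the normal's projection and the +z axis.
    pitch = math.atan2(ny, nz)
    roll = math.atan2(nx, nz)
    return pitch, roll

# A 45-degree slope facing the user tilts the platform 45 degrees:
pitch, roll = normal_to_tilt(0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
```

A real implementation would then solve the platform's inverse kinematics to get the six actuator lengths, and fold in the measured fingertip force to simulate stiffness.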


This demo controller, TextureTouch, houses a tiny grid of pins that are raised and lowered individually by servos. Together they act as a low-resolution texture display for your fingertip. As you trace a surface, this should let you feel minute differences.
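The video doesn't show the control code, but the core idea of a pin-array texture display can be sketched as sampling a patch of a heightmap around the contact point and mapping it to per-pin extensions. Everything here (the normalized heightmap, the 4x4 grid size, the travel range) is an assumption for illustration:

```python
def sample_pin_heights(heightmap, u, v, grid=4, max_height_mm=2.0):
    """Map a grid x grid patch of a normalized heightmap (values 0..1)
    around the fingertip contact point (u, v, each in 0..1) to per-pin
    extensions in millimeters.

    Assumptions (not from the paper): a 4x4 pin array and 2 mm of
    pin travel; edge samples are clamped to the heightmap bounds.
    """
    rows, cols = len(heightmap), len(heightmap[0])
    cx = int(u * (cols - 1))  # contact point in texel coordinates
    cy = int(v * (rows - 1))
    half = grid // 2
    pins = []
    for dy in range(-half, grid - half):
        for dx in range(-half, grid - half):
            y = min(max(cy + dy, 0), rows - 1)  # clamp to the map
            x = min(max(cx + dx, 0), cols - 1)
            pins.append(heightmap[y][x] * max_height_mm)
    return pins

# A flat mid-height surface raises every pin to the same extension:
flat = [[0.5] * 8 for _ in range(8)]
pins = sample_pin_heights(flat, 0.5, 0.5)
```

As the finger traces across the surface, re-sampling at the new (u, v) each frame is what would let the pins render the minute differences described above.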

But what about…

Of course, neither of these is perfect. They both suffer from a common problem in virtual reality: your perception can be fooled up to a point, but current technology can't fool everything. For every item you simulate, there is another item further up your kinematic chain that breaks the immersion. For example, even if we were to perfect the NormalTouch system, simply moving your hand beyond the bounds of the little platform breaks immersion. Even if we go further and simulate the entire hand, then the arm can break it. This will always be a problem until we can fully replace every sensory input to our brains.

With the frustrating futility of sensory simulation looming above us, it would be easy to just give up. I'm glad people aren't giving up, though; these prototypes are fascinating and I'd love to play with them!