Empathy VR Telepresence Rig Uses Oculus Rift to Connect Distant Friends

Robotics
From Japan to the U.S., Empathy VR lets you immerse yourself in the remote room.

This exhibit will be appearing at the 10th annual Maker Faire Bay Area. Don't have tickets yet? Get them here!

Tatsuki Adaniya started his virtual reality (VR) project as a way to keep in touch with his long-distance girlfriend. He wanted something more personal than texting or even Skype — something that would let him feel like he was in the same room, sharing the same view as her. So he developed a telepresence application using Oculus Rift and a remote-controlled robot.

Adaniya made the first prototype at a hackathon at Stanford University. Earlier this year, he met Shotaro Uchida at a meetup event that Uchida was hosting. The prototype caught Uchida's interest, and he eventually joined the project. Working late into the night nearly every day, fueled by ramen noodles, the pair developed the Empathy VR telepresence application to interface a pair of Oculus Rift VR goggles with a remotely controlled camera robot.

Adaniya is a VR UI/UX developer based in Tokyo.

The robot’s camera is controlled by two servo motors so it can pan and tilt. Using the Oculus Rift goggles, you can see whatever the camera is pointed at, and the camera’s motion is synchronized with the goggles. When you look right, the camera moves with you. If you nod, the camera nods too. They call this experience “tele-portation.”
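The pan-and-tilt synchronization described above boils down to mapping the headset's yaw and pitch onto two servo angles. Here is a minimal Python sketch of that idea; the servo range, center position, and function names are illustrative assumptions, not the actual Empathy VR code:

```python
# Hedged sketch: map headset orientation (yaw/pitch, in degrees,
# where 0 means looking straight ahead) to pan/tilt servo angles.
# SERVO_MIN/SERVO_MAX and CENTER are assumed values for a typical
# 180-degree hobby servo, not the rig's real firmware constants.

SERVO_MIN, SERVO_MAX = 0, 180  # assumed servo travel limits
CENTER = 90                    # assumed servo angle for "straight ahead"

def clamp(value, lo, hi):
    """Keep a servo command inside its mechanical travel."""
    return max(lo, min(hi, value))

def head_to_servo(yaw_deg, pitch_deg):
    """Convert head yaw/pitch into (pan, tilt) servo angles."""
    pan = clamp(CENTER + yaw_deg, SERVO_MIN, SERVO_MAX)
    tilt = clamp(CENTER + pitch_deg, SERVO_MIN, SERVO_MAX)
    return pan, tilt
```

So looking straight ahead commands both servos to center, turning your head right swings the pan servo right by the same amount, and head motion beyond the servos' travel is simply clamped at the limit. In the real rig these angles would then be streamed wirelessly to the battery-powered robot.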

The robot is battery powered and completely wireless, so it can be placed anywhere in the room or even carried around. The user wearing the Oculus Rift goggles, however, must be connected to a PC.

Adaniya is responsible for the user interface design and the desktop application software. He has been developing VR applications with the Oculus Rift since the release of its early developer kit, and has a year of experience with Unity software development. Last year, he demonstrated his VR application "Toy Story" at Pixar Inc.

Uchida is an embedded Java/ZigBee Engineer, metal head, and beer geek.

Uchida joined the Empathy VR project three months ago. He is responsible for the hardware design and the embedded system software development, and has more than seven years of experience in embedded systems engineering.

Adaniya and Uchida want to set up several remote robots all over the world, so visitors can control those robots from wherever they exhibit their project. They also want to set up some robots at Maker Faire Bay Area so users can communicate with people around the event.

Check out the video below for a taste of the Empathy VR experience, or better yet, come to Maker Faire Bay Area and experience it for yourself.

 


Andrew Terranova is an electrical engineer, writer and author of How Things Are Made: From Automobiles to Zippers. Andrew is also an electronics and robotics enthusiast and has created and curated robotics exhibits for the Children's Museum of Somerset County, NJ and taught robotics classes for the Kaleidoscope Enrichment in Blairstown, NJ and for a public primary school. Andrew is always looking for ways to engage makers and educators.
