MIT’s Reality Editor Controls IoT Devices via Augmented Reality

The Reality Editor exposes the hidden capabilities and interfaces of devices. Photo courtesy of Fluid Interfaces Group/MIT

Augmented reality is one of those technologies that never really seemed to have its day. While the arrival of the smartphone led to a surge of popularity, walking the streets while holding your phone at arm’s length was never really going to take off. However, it’s possible that the arrival of the Internet of Things, which is still frantically searching for a workable interface paradigm, will prove to be augmented reality’s killer application. If so, the Fluid Interfaces group at MIT might well be on to something with their latest project, the Reality Editor.

The Reality Editor picks up on real-world visual markers, using their positions to overlay digital content onto an augmented reality interface. This lets you use virtual controls to tap into the capabilities of the Internet of Things. If something like this were picked up by manufacturers, it could provide the compatibility layer, and the links between devices, that the Internet of Things has lacked until now.

One of the first augmented reality applications to appear in Apple’s App Store was the “Nearest Tube” application. Photo courtesy of acrossair

One of the first augmented reality applications to appear in Apple’s then-new App Store was Nearest Tube. It typified location-aware augmented reality, where objects are injected into the real-world view based on their position relative to your own, as opposed to marker-based augmented reality like the Reality Editor, where the real-world view is interpreted in real or near-real time and objects are placed in the view based on markers or other characteristics of the image.

Location-aware augmented reality applications proliferated on smartphones because they made use of the phones’ then-new GPS capabilities, and marker-based augmented reality now offers a similar way to leverage the new capabilities of the Internet of Things.

As our computing diffuses into our environment, the interfaces to our computing will have to change. Currently there is a huge debate over which types of interfaces will work. There are those who feel it will be more of the same, more screens with more buttons, and those who feel, like David Rose, who wrote a book called Enchanted Objects, that the objects themselves will have to become the interface. The Reality Editor seems to be a workable compromise between the two warring camps.


Reality Editor. Courtesy of Fluid Interfaces/MIT

The Reality Editor allows you to point the camera of your smartphone at an object to expose and edit its capabilities. It allows you to drag a line from one object to another to create a new relationship between these objects — for instance, connecting a single switch to a light, or a group of lights. Effectively it lets you edit your reality and manipulate the way objects control and interact with one another. Interestingly, the same objects could theoretically have different relationships for other people. Everyone’s reality isn’t necessarily the same.
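
To make the idea concrete, here is a purely illustrative sketch, in plain C++, of what such an editable relationship might look like under the hood: a link is nothing more than a record that routes a value from one object’s output to another object’s input. The names and structure here are invented for the sake of the example and are not taken from the Open Hybrid data model.

#include <iostream>
#include <string>
#include <vector>

// A hypothetical "link" drawn in the Reality Editor: route the value
// published by one object's output to another object's input.
struct Link {
    std::string sourceObject;   // e.g. "wallSwitch"
    std::string sourceOutput;   // e.g. "state"
    std::string targetObject;   // e.g. "deskLamp"
    std::string targetInput;    // e.g. "power"
};

int main() {
    // One switch linked to two lights: the sort of relationship you
    // would create by dragging lines between objects in the camera view.
    std::vector<Link> links = {
        {"wallSwitch", "state", "deskLamp",    "power"},
        {"wallSwitch", "state", "ceilingLamp", "power"},
    };

    bool switchState = true;  // the switch has just been flipped on
    for (const auto& link : links) {
        std::cout << link.sourceObject << "." << link.sourceOutput
                  << " -> " << link.targetObject << "." << link.targetInput
                  << " = " << (switchState ? "on" : "off") << "\n";
    }
    return 0;
}

Because the links live in data rather than in the devices themselves, a different person, or a different phone, could carry a different set of links for exactly the same objects.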

This approach hides the interfaces and relationships of smart objects away from our day-to-day world. This works because these relationships, from switch to light, aren’t needed every time you interact with the object. We don’t necessarily need to see all the capabilities and options every time we flip a switch to turn on a light. Not every choice needs to be presented to the user all the time.


The Arduino Yún “Hello World” example. Courtesy of Fluid Interfaces/MIT

However, what’s most interesting about MIT’s new Reality Editor is that we can play with it. You can download the Reality Editor from the App Store and use it alongside MIT’s Open Hybrid framework to build your own “enchanted objects” using the Arduino Yún and the Raspberry Pi.
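
As a taste of what the hardware side involves, the following is a minimal, stand-alone Arduino Yún sketch, using only the standard Bridge library, that exposes a light as a simple network-controllable object. It is not the Open Hybrid example itself (that framework adds its own hardware library and a web interface the Reality Editor can discover), just a sketch of the kind of network-exposed control an “enchanted object” is built on, and it assumes the Yún’s REST API access is set to “open” in its configuration panel.

#include <Bridge.h>
#include <YunServer.h>
#include <YunClient.h>

YunServer server;
const int lightPin = 13;   // the on-board LED standing in for a lamp

void setup() {
  pinMode(lightPin, OUTPUT);
  Bridge.begin();               // start the link between the AVR and the Linux side
  server.listenOnLocalhost();   // only accept requests forwarded by the Yún's web server
  server.begin();
}

void loop() {
  YunClient client = server.accept();
  if (client) {
    // With the Yún's default REST routing, a request to
    // http://<yourYun>.local/arduino/on (or /off) arrives here
    // as the string "on" or "off".
    String command = client.readString();
    command.trim();
    if (command == "on")  digitalWrite(lightPin, HIGH);
    if (command == "off") digitalWrite(lightPin, LOW);
    client.stop();
  }
  delay(50);
}

Once something like this is on the network, a framework such as Open Hybrid can wrap it in the markers and virtual controls that the Reality Editor then overlays on the physical object.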

Beyond this, especially if technologies like Magic Leap materialise as advertised and we can dispense with the clunky phone interface, you can see this sort of technology driving more mainstream adoption of the Internet of Things, which until now has been going down the wrong road when it comes to how it interacts with its users.

Perhaps then, with the Internet of Things, augmented reality has finally found its killer app?


Alasdair Allan is a scientist, author, hacker and tinkerer, who is spending a lot of his time thinking about the Internet of Things. In the past he has mesh networked the Moscone Center, caused a U.S. Senate hearing, and contributed to the detection of what was—at the time—the most distant object yet discovered.
