Robotic dream playback

[Image: Dreaming_Robot.jpg]

“Sleep Waking” by Fernando Orellana and Brendan Burns presents a new way to look back on one’s dreams. EEG, EKG, eye movement, and other physiological data are logged during the subject’s sleep and later used as a script to direct robotic action:

The eye position data we simply apply to the direction the robot’s head is looking. So if my eye was looking left, the robot looks left.
The use of the EEG data is a bit more complex. Running it through a machine learning algorithm, we identified several patterns from a sample of the data set (both REM and non-REM events). We then associated preprogrammed robot behaviors with these patterns. Using the patterns like filters, we process the entire data set, letting the robot act out each behavior as each pattern surfaces in the signal. Periods of high activity (REM) were associated with dynamic behaviors (flying, scared, etc.) and low activity with more subtle ones (gesturing, looking around, etc.). The “behaviors” the robot demonstrates are some of the actions I might do (along with everyone else) in a dream.
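The post doesn’t include code, but the pipeline described above is straightforward to sketch. Below is a minimal, hypothetical Python version: eye-position samples steer the head directly, while each window of EEG is matched against a handful of learned patterns and mapped to a canned behavior. K-means clustering stands in for whatever unnamed machine learning algorithm the artists actually used, and the `Robot` class, `BEHAVIORS` table, feature sizes, and function names are all invented for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical cluster-to-behavior table: high-activity (REM-like) patterns
# map to dynamic behaviors, low-activity patterns to subtle ones, per the
# artists' description above.
BEHAVIORS = {0: "fly", 1: "act_scared", 2: "gesture", 3: "look_around"}

class Robot:
    """Stand-in for the real robot's servo/behavior interface."""
    def look(self, eye_x: float) -> None:
        print(f"head position -> {eye_x:+.2f}")

    def perform(self, behavior: str) -> None:
        print(f"behavior: {behavior}")

def train_patterns(eeg_windows: np.ndarray, n_patterns: int = 4) -> KMeans:
    """Learn pattern templates from an unlabeled sample of windowed EEG
    (both REM and non-REM events); k-means is a stand-in for the
    algorithm the artists actually used."""
    return KMeans(n_clusters=n_patterns, n_init=10, random_state=0).fit(eeg_windows)

def play_back(model: KMeans, eeg_windows: np.ndarray,
              eye_positions: np.ndarray, robot: Robot) -> None:
    """Replay the night: steer the head from the eye data and trigger the
    behavior tied to whichever pattern surfaces in each EEG window."""
    for window, eye_x in zip(eeg_windows, eye_positions):
        robot.look(eye_x)  # eye looked left -> robot looks left
        pattern = int(model.predict(window[None, :])[0])
        robot.perform(BEHAVIORS[pattern])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    windows = rng.normal(size=(200, 64))      # 200 windows x 64 EEG features
    eyes = rng.uniform(-1.0, 1.0, size=200)   # normalized eye x-position
    play_back(train_patterns(windows), windows, eyes, Robot())
```

Treating the learned patterns as filters over the whole recording, as the quote describes, reduces here to classifying each window against the trained centroids and firing the associated behavior.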

What? No electric sheep? – Link
