Robotic dream playback



“Sleep Waking” by Fernando Orellana and Brendan Burns presents a new way to look back on one’s dreams. EEG, EKG, eye movement, and various other physiological data are logged during the subject’s sleep and then later used as the script to direct robotic action:

The eye position data we simply apply to the direction the robot’s head is looking. So if my eye was looking left, the robot looks left.
The use of the EEG data is a bit more complex. Running it through a machine learning algorithm, we identified several patterns from a sample of the data set (both REM and non-REM events). We then associated preprogrammed robot behaviors with these patterns. Using the patterns like filters, we process the entire data set, letting the robot act out each behavior as each pattern surfaces in the signal. Periods of high activity (REM) were associated with dynamic behaviors (flying, scared, etc.) and low activity with more subtle ones (gesturing, looking around, etc.). The “behaviors” the robot demonstrates are some of the actions I might do (along with everyone else) in a dream.
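The pipeline the artists describe — classify windows of EEG signal into activity patterns, then map each pattern to a preprogrammed behavior — can be sketched roughly as follows. This is a minimal illustration, not the project’s actual code: the behavior names come from the quote above, but the windowing, the variance-based activity measure, and all thresholds are assumptions standing in for their learned pattern matcher.

```python
# Hypothetical sketch of an EEG-to-behavior playback pipeline.
# Thresholds and the variance-based classifier are illustrative
# assumptions, not the method used in "Sleep Waking".

# Preprogrammed robot behaviors, keyed by detected activity pattern.
BEHAVIORS = {
    "high": ["flying", "scared"],            # dynamic behaviors for REM-like activity
    "low": ["gesturing", "looking_around"],  # subtle behaviors for quiet sleep
}

def classify_window(window):
    """Label one window of EEG samples as high or low activity.

    The real system matched patterns found by a machine learning
    algorithm; here plain signal variance stands in as the measure.
    """
    mean = sum(window) / len(window)
    variance = sum((x - mean) ** 2 for x in window) / len(window)
    return "high" if variance > 1.0 else "low"

def playback(eeg, window_size=4):
    """Scan the whole recording and emit one behavior per window,
    acting like a filter that fires as each pattern surfaces."""
    script = []
    for i in range(0, len(eeg) - window_size + 1, window_size):
        label = classify_window(eeg[i:i + window_size])
        # Trigger the first behavior associated with this pattern.
        script.append(BEHAVIORS[label][0])
    return script

# Example: quiet sleep followed by a burst of REM-like activity.
eeg = [0.1, 0.2, 0.1, 0.15, 3.0, -2.5, 2.8, -3.1]
print(playback(eeg))  # → ['gesturing', 'flying']
```

A real system would also fold in the EKG and eye-position channels, and could pick among the behaviors for a pattern rather than always playing the first.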

What? No electric sheep? – Link


