Robotics self modeling – Stargate replicators?

Starfish Walking
Researchers at Cornell University are developing little bots that build a model of their own bodies from interaction with their environment and, when damaged, figure out another way to get around…

“We demonstrate, both computationally and experimentally, how a legged robot automatically synthesizes a predictive model of its own topology (where and how its body parts are connected) through limited yet self-directed interaction with its environment, and then uses this model to synthesize successful new locomotive behavior before and after damage. The legged robot learned how to move forward based on only 16 brief self-directed interactions with its environment. These interactions were unrelated to the task of locomotion, driven only by the objective of disambiguating competing internal models.” [via] – Link & photos/movies.
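If you're curious how the "disambiguating competing internal models" part works, here's a rough, hypothetical sketch in Python of that estimation-exploration idea: keep several candidate self-models, pick each test action where the models disagree most, run it on the real hardware, and keep the models that best explain the observed result. The one-dimensional "gain" model, the 16-trial count, and all the names below are illustrative stand-ins, not the Cornell team's actual code or robot model.

```python
# Hypothetical sketch of an estimation-exploration loop: competing internal
# models are refined using actions chosen to make them disagree.
import random

random.seed(0)

TRUE_GAIN = 0.7          # the robot's real (unknown) response to an action
N_MODELS = 8             # number of competing internal models
N_TESTS = 16             # brief self-directed interactions, as in the paper


def predict(model_gain, action):
    """One candidate model's prediction of the sensed outcome of an action."""
    return model_gain * action


def disagreement(models, action):
    """Spread of predictions across models -- what exploration maximizes."""
    preds = [predict(m, action) for m in models]
    return max(preds) - min(preds)


def execute(action):
    """Stand-in for a physical trial: the real robot's noisy sensed outcome."""
    return TRUE_GAIN * action + random.gauss(0, 0.02)


models = [random.uniform(0.0, 2.0) for _ in range(N_MODELS)]

for trial in range(N_TESTS):
    # Exploration: pick the action the competing models disagree on most.
    candidates = [random.uniform(-1.0, 1.0) for _ in range(50)]
    action = max(candidates, key=lambda a: disagreement(models, a))

    outcome = execute(action)

    # Estimation: keep the models that best explain the new observation,
    # and refill the population with perturbed copies of the survivors.
    models.sort(key=lambda m: abs(predict(m, action) - outcome))
    survivors = models[: N_MODELS // 2]
    models = survivors + [m + random.gauss(0, 0.1) for m in survivors]

best = min(models, key=lambda m: abs(m - TRUE_GAIN))
print(f"best self-model gain after {N_TESTS} trials: {best:.2f} (true {TRUE_GAIN})")
```

Note that the test actions themselves never have to accomplish the task; they only have to tell the candidate models apart, which is why so few trials can be enough before the robot goes on to plan a gait against its best model.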

Related:

  • Robotics @ MAKE – Link.

View more articles by Phillip Torrone