10 Ways to Make Your Robot More Humanlike

Copyright Jessica Lindsay, used with permission.

People are “attracted to […] robots not because of the way they look, but because of the way they behave.” – Mari Velonaki, Ph.D., Director of Creative Robotics Lab, Univ. of New South Wales.

Does a robot have to look like a human to be humanlike? For decades, Disney animators have brought the illusion of life and emotion to animals, objects and even machines.

With some simple programming, your robot, too – no matter its form – can boost its humanness quotient. Here are 10 techniques taken from a rich history of animation principles and human-robot interaction research.


If your robot has a head or eyes…

1) Blink.

Actor Haley Joel Osment has said that “Don’t blink” was an acting trick he used to transform himself into David, the uncanny robot boy in the movie A.I. Artificial Intelligence. According to this study, the average human blink rate is about 17 blinks per minute, while during conversation it increases to 26, and gets as low as 4.5 when reading. Check out this great blink animation tutorial to learn more about how to animate natural-looking blinks. For example, opening the eyes should take slightly longer than closing the eyes, and blinks can even be used to show intention and feeling.
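Here is a minimal Python sketch of a blink loop along those lines. The set_eyelids() call is a hypothetical stand-in for whatever drives your robot's eyelids or eye display, and the durations are just plausible starting points to tune.

```python
import random
import time

# Approximate blink rates per minute, per the study cited above.
BLINKS_PER_MINUTE = {"idle": 17, "conversing": 26, "reading": 4.5}

def set_eyelids(openness):
    print(f"eyelids at {openness:.1f}")  # swap in your servo or display call

def blink(close_time=0.10, open_time=0.15):
    # Open slightly more slowly than you close, per the animation tip.
    set_eyelids(0.0)
    time.sleep(close_time)
    set_eyelids(1.0)
    time.sleep(open_time)

def blink_loop(state="idle"):
    mean_interval = 60.0 / BLINKS_PER_MINUTE[state]
    while True:
        # Jitter the interval so blinks don't arrive like clockwork.
        time.sleep(random.uniform(0.5, 1.5) * mean_interval)
        blink()
```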

In Henry Lejeune's portrait of Ophelia, modesty is implied by the way she avoids looking at you.

2) Avoid staring.

Researchers have found that, while eye contact is important, gaze aversion can make a robot look more intentional, thoughtful, and creative. Some simple rules for where to place your robot’s gaze (sketched in code after the list) include:

  • look upwards when thinking
  • look away occasionally when speaking
  • look away in-between phrases, to show you’re not done speaking
  • look at the other person when you’re finished talking, to indicate it’s their turn to respond.
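As a rough illustration, here is how those rules might come together in code. The gaze targets and the look_at() helper are invented for the example; substitute your robot's actual head or eye controller.

```python
import random

# Hypothetical gaze targets as (yaw, pitch) angles in degrees.
PARTNER = (0, 0)      # straight at the conversation partner
UP_LEFT = (-20, 15)   # upward "thinking" glance
ASIDE = (25, -5)      # brief aversion while speaking

def look_at(target, note=""):
    yaw, pitch = target
    print(f"gaze -> yaw={yaw} pitch={pitch} {note}")  # swap in your head controller

def deliver_utterance(phrases):
    look_at(UP_LEFT, "(thinking)")              # look upwards when thinking
    for i, phrase in enumerate(phrases):
        look_at(PARTNER)
        print(f"say: {phrase}")
        if i < len(phrases) - 1 and random.random() < 0.6:
            look_at(ASIDE, "(not done yet)")    # look away between phrases
    look_at(PARTNER, "(your turn)")             # return gaze to hand over the turn
```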

3) Head-turn with a blink and a tilt.

Stretch your arms straight out to your sides and turn your head to look at one hand, then the other. Did you notice that you blinked during the head turn? Animators follow this simple rule when turning the head:

  • add a blink, then
  • tilt the head down slightly in the middle of the turn.

The head dip is related to the animation principle of movement in arcs.
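A sketch of that rule, assuming yaw and pitch neck joints you can command directly; the 5° dip is just a reasonable default to adjust on your own robot.

```python
import math

def head_turn(start_yaw, end_yaw, steps=20, dip_deg=5.0):
    print("blink")  # or call the blink() helper from tip 1
    for i in range(steps + 1):
        t = i / steps
        yaw = start_yaw + t * (end_yaw - start_yaw)
        # Dip the pitch slightly mid-turn so the head travels in an arc.
        pitch = -dip_deg * math.sin(math.pi * t)
        print(f"yaw={yaw:.1f} pitch={pitch:.1f}")  # swap in your neck joint commands
```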


If your robot speaks…

4) Hedge a little.

Adding extra words called hedges (“maybe,” “probably” or “I think”) and discourse markers (“you know,” “just,” “well,” “like” and “um”) increases the likeability of robots. Take, for example, this great line from Samantha, the artificially intelligent OS in the movie “Her”, where hedges and discourse markers appear 5 times: “Well, I was thinking, we don’t really have any photographs of us. And I thought this song could be like a photo that captures us in this moment in our life together.”
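One way to sprinkle these in is a small wrapper around whatever text your robot is about to speak. The word lists and probabilities below are purely illustrative.

```python
import random

HEDGES = ["maybe", "probably", "I think"]
MARKERS = ["Well,", "You know,", "Um,"]

def soften(sentence):
    prefix = []
    if random.random() < 0.5:
        prefix.append(random.choice(MARKERS))
    if random.random() < 0.5:
        prefix.append(random.choice(HEDGES))
    if prefix:
        # Lowercase the original opening word so the prefix reads naturally.
        sentence = sentence[0].lower() + sentence[1:]
    return " ".join(prefix + [sentence])

print(soften("We should take the photo now."))
# e.g. "Well, I think we should take the photo now."
```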

"I'm sorry, Dave. I'm afraid I can't do that." Sure you can't, Hal.
“I’m sorry, Dave. I’m afraid I can’t do that.”
Sure you can’t, Hal.

5) Match the tone of voice to the content of the words.

When HAL 9000 said, “I’m sorry, Dave. I’m afraid I can’t do that,” why did it sound so creepy? One reason might be that its words didn’t match its tone of voice. Based on Table 2 in this paper, the acoustic profile of HAL’s famous words looks closer to “enjoyment” than to apologetic “sadness”. HAL didn’t sound sorry at all! Beware text-to-speech systems that default to a happy voice – a robot apology may come off as insincere.
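If your text-to-speech engine accepts SSML, you can nudge the prosody toward the content of the words, for instance lowering pitch, rate and volume for an apology. Attribute support varies by engine, so treat this as a sketch with illustrative values.

```python
def apologetic_ssml(text):
    # Slower, lower, softer speech for apologies.
    return (
        "<speak>"
        '<prosody pitch="-15%" rate="85%" volume="soft">'
        f"{text}"
        "</prosody>"
        "</speak>"
    )

print(apologetic_ssml("I'm sorry, Dave. I'm afraid I can't do that."))
```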

6) Express feelings, opinions, preferences.

Script writers are pros at creating interesting characters. Part of the craft is defining the character’s backstory, including likes and dislikes, education, and so on. To use Spike Jonze’s Samantha once more, the A.I. constantly expressed her opinions and feelings, and even openly laughed (a tiny profile sketch follows the examples):

  • (Giving herself a name:) “I like the sound of it. Samantha.”
  • Samantha laughs, “Yeah, there are some funny ones!”
  • “Oh, I love this first one from Roger to his girlfriend. That’s so sweet.”
  • “You’re being very stubborn right now.”
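One lightweight way to give your own robot that kind of character is to keep a small backstory profile and consult it when generating replies. The profile contents and opinion_about() helper here are, of course, invented.

```python
# A hypothetical character profile the dialogue system can draw on.
PROFILE = {
    "name": "Samantha",
    "likes": ["old photographs", "piano music"],
    "dislikes": ["being rushed"],
    "backstory": "curious, quick to laugh, opinionated",
}

def opinion_about(topic):
    if topic in PROFILE["likes"]:
        return f"Oh, I love {topic}. That's so sweet."
    if topic in PROFILE["dislikes"]:
        return f"Honestly, I'm not a fan of {topic}."
    return f"I don't have a strong feeling about {topic} yet."

print(opinion_about("old photographs"))
```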

If your robot moves…

7) Move back before going forward, move down before going up.

In animation, this extra motion in the opposite direction is called anticipation. For example, in this clip with Wile E. Coyote (from 0:36), we see him pull backwards before launching forward in pursuit of the Road Runner. Similarly, a ballerina bends down at the knees before leaping up into the air. Does your robot anticipate its movements, or simply lurch forward?
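Anticipation is easy to approximate for a single axis of motion: pull back by a small fraction of the travel distance, pause, then make the main move. The sketch below only prints target positions; wire it to your own motor commands.

```python
import time

def move_with_anticipation(current, goal, anticipation=0.1, pause=0.15):
    # Wind up: move a fraction of the travel distance in the opposite direction.
    travel = goal - current
    windup = current - anticipation * travel
    print(f"move to {windup:.2f}  (anticipation)")
    time.sleep(pause)
    print(f"move to {goal:.2f}  (main move)")

move_with_anticipation(current=0.0, goal=1.0)
```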

8) Lead with bigger joints.

Let’s say that your robot is going to point at something with its arm. Is it simply a case of moving the end-effector from A to B? How do you make that simple movement look more humanlike? In animation, you use something called successive breaking of joints. The idea is to use the bigger joints first, so in this case: first, move at the shoulder, then the elbow, then the wrist, then finally the fingers (if your robot has them). Try it yourself!
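Here is a sketch of successive breaking of joints for a pointing gesture: command the shoulder first and let each smaller joint start a beat later. The joint names, angles and send_joint_target() hook are placeholders for your own arm.

```python
import time

POINTING_POSE = {"shoulder": 60.0, "elbow": 30.0, "wrist": 10.0, "fingers": 0.0}

def send_joint_target(joint, angle_deg):
    print(f"{joint} -> {angle_deg} deg")  # swap in your motor command

def point_at_target(stagger=0.15):
    # Bigger joints lead; smaller joints follow after a short delay.
    for joint in ("shoulder", "elbow", "wrist", "fingers"):
        send_joint_target(joint, POINTING_POSE[joint])
        time.sleep(stagger)

point_at_target()
```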

9) Use motion for emotion.

Your robot doesn’t have a face? No problem – it’s still possible to use other cues to convey emotion. For example, to convey happiness, have your robot make large, regular movements. For sadness, make your robot’s movements small and slow, with even timing. To show fear, give the robot small, fast, jerky movements at irregular intervals, directed away from the source of fear. Large, irregular and abrupt movements can convey anger, especially when directed towards the object of anger.
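One way to encode that mapping is as three movement parameters per emotion: amplitude, speed, and how irregular the timing is. The numbers below are rough guesses to tune on your own robot.

```python
import random

EMOTION_MOTION = {
    "happiness": {"amplitude": 1.0, "speed": 1.0, "irregularity": 0.1},
    "sadness":   {"amplitude": 0.3, "speed": 0.4, "irregularity": 0.0},
    "fear":      {"amplitude": 0.3, "speed": 1.5, "irregularity": 0.8},
    "anger":     {"amplitude": 1.0, "speed": 1.5, "irregularity": 0.6},
}

def next_gesture(emotion):
    p = EMOTION_MOTION[emotion]
    size = p["amplitude"] * random.uniform(0.8, 1.2)   # how large the move is
    duration = 1.0 / p["speed"]                        # how long it takes
    pause = random.uniform(1 - p["irregularity"],      # gap before the next move
                           1 + p["irregularity"])
    return size, duration, pause
```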


Finally…

10) Add randomness.

1/f or “pink” noise is found in many biological systems. It may seem counter-intuitive to add randomness after working so hard to make your robot controller accurate, but purposely adding randomness has been shown to increase the perception of humanness in computer systems. The Sibelius music composition software is also known to include random variations in volume to render its MIDI playback more natural. If possible, add randomness not only in interactions (e.g. different ways of saying “yes”, such as “okay”, “sounds good”, or “got it”), but also in timing (e.g. blink timings) and movement in space (e.g. gaze aversion to different places). Interestingly, some new robots have mechanical randomness “built-in”, such as those in the field of soft robotics. Just think – what if your robot never acted the same way twice?
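If you want to go beyond uniform randomness, a rough 1/f (“pink”) noise source can be built with the Voss-McCartney trick of summing random values refreshed at halving rates. Here is a small sketch applied to blink timing and varied acknowledgements; the scaling factors are just starting points.

```python
import random

class PinkNoise:
    """Approximate 1/f noise: sum sources that refresh every 1, 2, 4, ... samples."""

    def __init__(self, n_sources=8):
        self.sources = [random.uniform(-1, 1) for _ in range(n_sources)]
        self.count = 0

    def next(self):
        self.count += 1
        for i in range(len(self.sources)):
            if self.count % (2 ** i) == 0:
                self.sources[i] = random.uniform(-1, 1)
        return sum(self.sources) / len(self.sources)

pink = PinkNoise()
base_interval = 60.0 / 17                              # idle blink rate from tip 1
interval = base_interval * (1.0 + 0.3 * pink.next())   # pink-jittered blink interval
acknowledgement = random.choice(["yes", "okay", "sounds good", "got it"])
print(interval, acknowledgement)
```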



This week marks the official launch of Make: Volume 39 — Robotics, which drops on newsstands the 27th. Be sure to grab a copy at a retailer near you, or subscribe online right now and never miss another issue.

We are celebrating with five days of robot-related articles, pictures, videos, reviews and projects. Tune into this space for Robot Week!

Our next theme week will be wearable electronics. Send us your tips or contributions before it gets here by dropping a line to editor@makezine.com.

 


Dr. Angelica Lim (@petitegeek) is a computer scientist and researcher specializing in A.I., robotics and emotion. Previously, she was User Experience Manager at Aldebaran Robotics.
