30 March 2012

Designing Human-Like Robots

Researchers at the University of Wisconsin-Madison are developing computer algorithms based on how people communicate without words. These algorithms are then used to program devices such as robots to look and act more human-like, helping to bridge the gap between man and machine. Their research shows, for example, that when you finish saying something in a conversation and your gaze is directed at one particular person, that person is likely to take the next turn speaking. Such nonverbal cues tell people where our attention is focused and what we mean when we direct a question or comment at someone. When people really mean what they're saying, they tend to open their eyes wider, look directly at the person they're talking to, and reinforce their message through facial and other cues.
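
That turn-taking finding can be expressed as a simple rule. The sketch below is a minimal illustration under assumed names and logic (the Utterance class, the predict_next_speaker function, and the fallback choice are all invented for this example), not the researchers' actual model:

# Minimal sketch of gaze-based turn-taking prediction.
# All names and the rule itself are illustrative assumptions.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Utterance:
    speaker: str                        # who is talking
    gaze_target_at_end: Optional[str]   # whom the speaker looks at when finishing

def predict_next_speaker(utterance: Utterance, listeners: List[str]) -> str:
    """Predict who takes the next turn from the speaker's end-of-turn gaze."""
    target = utterance.gaze_target_at_end
    # Rule from the article: the person the speaker's gaze is directed at
    # when finishing is likely to take the next turn.
    if target is not None and target in listeners:
        return target
    # No clear gaze cue: fall back to the first listener (arbitrary choice).
    return listeners[0]

# Example: Alice finishes a sentence while looking at Carol.
turn = Utterance(speaker="Alice", gaze_target_at_end="Carol")
print(predict_next_speaker(turn, ["Bob", "Carol"]))  # -> Carol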


To convert these subtle cues of human communication into data and instructions a robot can use, the researchers take a computational approach. They break each cue or gesture down into minute segments or sub-mechanisms – such as the direction of the eyes versus the direction of the head, or how the body is oriented – each of which can be modeled. Then temporal dimensions are added to the model, such as how long a target is gazed at and whether the gaze stays on the face or shifts elsewhere after a time. The research team has found that learning improves when a robot teacher uses these cues, compared with a robot that lacks them. Their goal is to identify the key mechanisms that help us communicate effectively, reproduce them in robots, and enable these systems to connect with us.
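
To make that decomposition concrete, here is a minimal sketch of how such a gaze model might be parameterized. The class, field names, default timings, and the set_joint_target callback are all assumptions made for illustration, not the team's published model or any real robot API. It separates the spatial sub-mechanisms the article mentions (eye direction versus head direction, body orientation) from the temporal dimensions (how long to hold a gaze, where to look afterward):

import time
from dataclasses import dataclass

@dataclass
class GazeCue:
    """One decomposed gaze behavior: spatial sub-mechanisms plus timing.

    All fields and values are illustrative, not published parameters.
    """
    eye_target: str        # where the eyes point (may differ from the head)
    head_target: str       # where the head points
    body_orientation: str  # how the torso is oriented
    hold_seconds: float    # how long to hold the gaze on the target
    avert_target: str      # where to shift the eyes after the hold expires

def run_gaze_cue(cue: GazeCue, set_joint_target) -> None:
    """Drive a robot through one gaze cue.

    set_joint_target(part, target) stands in for whatever motion API
    the robot exposes; it is an assumed callback, not a real library call.
    """
    # Spatial sub-mechanisms: eyes, head, and body are commanded separately,
    # since eye direction can differ from head direction.
    set_joint_target("eyes", cue.eye_target)
    set_joint_target("head", cue.head_target)
    set_joint_target("body", cue.body_orientation)

    # Temporal dimension: hold the gaze for a set time, then avert the eyes.
    time.sleep(cue.hold_seconds)
    set_joint_target("eyes", cue.avert_target)

# Example: look at the listener's face for about 2 seconds, then glance
# away, keeping the head and body oriented toward the listener.
cue = GazeCue(eye_target="listener_face", head_target="listener",
              body_orientation="listener", hold_seconds=2.0,
              avert_target="shared_workspace")
run_gaze_cue(cue, lambda part, target: print(f"{part} -> {target}"))

Separating the hold duration and the avert target as explicit parameters reflects the article's point that timing is modeled alongside direction, so the same spatial gesture can express different cues depending on how long it is held.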

More information:

http://blogs.voanews.com/science-world/2012/03/23/designing-human-like-robots/