27 February 2019

Gestures for Social Robots

Social robots are designed to communicate with human beings naturally, assisting them with a variety of tasks. The effective use of gestures could greatly enhance human-robot interactions, allowing robots to communicate both verbally and non-verbally. Researchers at Vrije Universiteit Brussel, in Belgium, have recently introduced a generic gesture method that can be used to study how different design aspects of a robot influence its gestures. The method devised by this team of researchers could overcome difficulties in transferring gestures to robots of different shapes and configurations. Users input a robot's morphological information, and the tool uses this data to calculate gestures for that particular robot. To ensure that their method would be applicable to different types of robots, the researchers drew inspiration from a human base model. This model consists of different chains and blocks, which are used to represent the various rotational possibilities of the human body.
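
To make the chain-and-block idea more concrete, the following Python sketch shows one hypothetical way a robot's morphology could be described as input to such a tool. The class names, joint blocks, rotation axes and angle limits here are illustrative assumptions, not the researchers' actual data format.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class JointBlock:
    """One block of a kinematic chain, e.g. shoulder, elbow or wrist.
    Axes and limits are illustrative; a real robot's values would come
    from its own specification."""
    name: str
    axes: List[str]                      # rotation axes available in this block
    limits: List[Tuple[float, float]]    # (min, max) angle per axis, in degrees

@dataclass
class Chain:
    """An ordered sequence of joint blocks, e.g. the right arm."""
    name: str
    blocks: List[JointBlock] = field(default_factory=list)

# Hypothetical morphology of a robot arm, expressed against the human base model.
right_arm = Chain("right_arm", [
    JointBlock("shoulder", ["roll", "pitch", "yaw"],
               [(-180, 60), (-120, 120), (-90, 90)]),
    JointBlock("elbow", ["pitch"], [(0, 135)]),
    JointBlock("wrist", ["roll"], [(-90, 90)]),
])

for block in right_arm.blocks:
    print(block.name, block.axes, block.limits)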

To construct the general framework behind their method, the researchers assigned a reference frame to each joint block of the human base model. Since different features matter for different kinds of gestures, the method is designed to work in two modes: the block mode and the end-effector mode. The block mode is used to calculate gestures, such as emotional expressions, in which the overall arm placement is crucial. The end-effector mode, on the other hand, calculates gestures in situations in which the position of the end-effector is important, such as during object manipulation or pointing. In their study, the researchers applied their method to the virtual model of a robot called Probo. They used this example to illustrate how their method could help to study how the configuration of different joints and their angle ranges influence the resulting gestures.
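
The sketch below illustrates the distinction between the two modes on a deliberately simplified two-link planar arm. The link lengths, joint limits and mapping logic are assumptions chosen for illustration only, not the researchers' actual implementation.

import math

# Simplified two-link planar arm used only to illustrate the two modes;
# link lengths and joint limits are assumptions, not Probo's real values.
L1, L2 = 0.3, 0.25                      # upper-arm and forearm lengths (m)
LIMITS = [(-180.0, 60.0), (0.0, 135.0)] # (min, max) per joint, in degrees

def clamp(angle, lo, hi):
    return max(lo, min(hi, angle))

def block_mode(reference_angles):
    """Block mode: reproduce the overall arm placement by mapping the
    human reference joint angles onto the robot, clamped to its limits."""
    return [clamp(a, lo, hi) for a, (lo, hi) in zip(reference_angles, LIMITS)]

def end_effector_mode(x, y):
    """End-effector mode: solve simple planar inverse kinematics so the
    hand reaches the target point, e.g. for pointing or manipulation."""
    d2 = x * x + y * y
    c2 = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    c2 = max(-1.0, min(1.0, c2))        # keep the target numerically in reach
    elbow = math.acos(c2)
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return [math.degrees(shoulder), math.degrees(elbow)]

print(block_mode([-200.0, 40.0]))       # emotional posture: arm placement matters
print(end_effector_mode(0.35, 0.20))    # pointing: end-effector position matters

In this toy version, the block mode simply reuses the reference joint angles so the arm posture is preserved, while the end-effector mode ignores posture and solves for whatever angles bring the hand to the target point.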
