Press Release – University of Canterbury
UC research finds people can control humanoid robots just by using body gestures
November 22, 2012
University of Canterbury (UC) research has found people can control a humanoid robot, interacting without holding a device, just by using body gestures.
UC postgraduate researcher Mohammad Obaid said the team had been researching how to let users define which body gestures they found most intuitive for navigating humanoid robots.
“We have defined a set of body gestures to control a humanoid robot based on users’ preferences. Integrating the defined gestures into the sensor programmes of humanoid robots may allow users to interact with a robot using full body gestures, which in turn will enhance the robot’s usability,’’ Dr Obaid said.
“Robots are already being used in several public domain areas, such as house cleaning, entertainment, elderly care and education. We recently made our math teaching robot app available online: http://hitlabnz.org/index.php/news/3-news/280-nao-math-teacher-app-released-in-robot-app-store
“At the UC HIT Lab our researchers are investigating the use of humanoid robots to teach children mathematics, and we anticipate that with this kind of functionality, and as humanoid robots become cheaper, we will see more robots being used by the public in the near future.’’
Humanoid robots are machines whose body shape and characteristics resemble those of humans. They are designed to move in a human-like fashion, for example walking and gazing.
These characteristics have allowed researchers at UC and the Human Centered Multimedia Lab in Augsburg, Germany, to explore new ways of interacting with and operating humanoid robots.
Generally, researchers in the field of human-robot interaction aim at controlling a humanoid robot in the most intuitive way to enhance the user’s natural experience and engagement with the robot.
“Our work focuses on allowing human users to express how they would operate the navigation of a humanoid robot in an intuitive and natural way using non-verbal communications (full body gestures).
“We allow the user to be involved in the process of defining how they like to navigate a humanoid robot. Therefore, we conducted a research study to define a set of gestures for the navigational control of a humanoid robot.
“We analysed data from 35 participants who performed 385 gestures for eleven navigational commands: forward, backward, turn right, turn left, move right, move left, speed up, slow down, stand up, sit down and stop. The analysis of the data revealed a set of gestural commands to control a humanoid robot.
“Most humanoid robots have sensors integrated in them, and with our work we can make robots more user friendly by incorporating our findings into the sensor programs of the robot, allowing them to sense human gestural commands.’’
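The kind of mapping the researchers describe, from a recognized full-body gesture to one of the eleven navigational commands, could be sketched as follows. This is a minimal illustrative example, not the study's actual implementation: the gesture labels, function names and command mapping are all hypothetical stand-ins for whatever a robot's body-tracking sensor would emit.

```python
# Hypothetical sketch: map gesture labels (as a body-tracking sensor
# might report them) to the eleven navigational commands from the study.
from enum import Enum
from typing import Optional


class NavCommand(Enum):
    """The eleven navigational commands studied."""
    FORWARD = "forward"
    BACKWARD = "backward"
    TURN_RIGHT = "turn right"
    TURN_LEFT = "turn left"
    MOVE_RIGHT = "move right"
    MOVE_LEFT = "move left"
    SPEED_UP = "speed up"
    SLOW_DOWN = "slow down"
    STAND_UP = "stand up"
    SIT_DOWN = "sit down"
    STOP = "stop"


# Illustrative gesture-to-command table; the real study derived the
# preferred gesture for each command from the 35 participants' data.
GESTURE_TO_COMMAND = {
    "lean_forward": NavCommand.FORWARD,
    "lean_backward": NavCommand.BACKWARD,
    "point_right": NavCommand.TURN_RIGHT,
    "point_left": NavCommand.TURN_LEFT,
    "arms_crossed": NavCommand.STOP,
}


def interpret_gesture(label: str) -> Optional[NavCommand]:
    """Return the navigation command for a recognized gesture label,
    or None if the gesture is not in the defined set."""
    return GESTURE_TO_COMMAND.get(label)
```

In a full system, the table lookup would sit between the sensor's gesture classifier and the robot's motion controller, so that an unrecognized gesture simply produces no command rather than an unintended movement.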
Dr Obaid was recently nominated for the best paper award at the International Conference on Social Robotics in Chengdu, China.