02/05/2019 / By Edsel Cook
In Japan, the newest iteration of a social robot has become much better at controlling the features of its robotic face. This upgraded android child can make facial expressions that are subtler and more human-like.
The developers improved the expressiveness of the android’s face to make it easier for people to approach and communicate with it. They are capitalizing on the Japanese fascination with robots of all shapes, sizes, and types, especially humanoid ones.
Increasingly large numbers of advanced robots are appearing in various sectors of Japan’s economy and society. They are becoming just as capable as human workers, without the same limitations, such as health problems or the need for pay.
However, one innately human capability that robots have not yet been able to fully replicate is facial expression. Even social robots with artificial intelligence are unable to move their facial features in a convincingly human way, much less an emotional one.
The human face can perform a wide variety of movements that are often asymmetrical. In comparison, mechanical systems cannot match the fine motor movements of organic muscle.
Furthermore, the material used to make android skin is much less flexible than organic skin. The stiffer material makes it even harder for robotic systems to move smoothly, much less in a human-like way. (Related: Will future psychologists be…robots?)
Osaka University researchers claimed they found a way to make it easier for androids to copy human facial expressions. Their method selects a facial movement for an android head to perform, evaluates how realistic the resulting expression is, and uses that evaluation to improve future movements.
They implemented their method in their Affetto social robot, a machine with the head of a child and a moving face. Affetto first debuted in 2011, and the newest iteration of the android is considered much more expressive than its predecessor.
The researchers believe that their new method will unlock even broader ranges of facial expressions for androids in the near future. If a social robot can express human-like emotions, they think humans will have a much easier time interacting with that robot.
“Surface deformations are a key issue in controlling android faces,” the study authors explained in their paper, which they published in Frontiers in Robotics and AI. “Movements of their soft facial skin create instability, and this is a big hardware problem we grapple with. We sought a better way to measure and control it.”
In their study, the researchers evaluated 116 facial points on the Affetto robot. They used these points to chart out the robot’s movement in three dimensions.
Every facial point had a deformation unit beneath the skin, a small mechanical system that changes the local shape of the face. For example, a deformation unit set below the lip or eyelid can move that part of the robotic face.
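To make the idea concrete, here is a minimal sketch, not the researchers’ actual implementation, of how a deformation unit might be modeled as an actuator that shifts the 3D positions of nearby facial points. The point indices, travel distance, and direction below are hypothetical stand-ins.

```python
import numpy as np

NUM_POINTS = 116  # facial points tracked on the robot, per the article

# Stand-in data: resting 3D positions of the facial points (in mm).
rest_positions = np.random.rand(NUM_POINTS, 3) * 100.0

class DeformationUnit:
    """Toy actuator: moves a subset of facial points along a fixed direction."""
    def __init__(self, point_indices, direction, max_travel_mm=5.0):
        self.point_indices = point_indices                  # points it affects
        self.direction = direction / np.linalg.norm(direction)
        self.max_travel_mm = max_travel_mm                  # assumed travel range

    def apply(self, positions, command):
        """Return deformed point positions for an actuation command in [0, 1]."""
        deformed = positions.copy()
        deformed[self.point_indices] += command * self.max_travel_mm * self.direction
        return deformed

# Example: a hypothetical unit under the eyelid that pulls three points downward.
eyelid_unit = DeformationUnit(point_indices=[10, 11, 12],
                              direction=np.array([0.0, -1.0, 0.0]))
deformed = eyelid_unit.apply(rest_positions, command=0.5)
print("eyelid point displacement:", deformed[10] - rest_positions[10])
```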
The researchers measured these minute movements. They entered the data into a mathematical model, which showed them how closely their robot matched the movements of a human. From there, they were able to refine the deformation.
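One common way to build such a model, offered here as a general illustration rather than the paper’s exact formulation, is to fit a linear mapping from actuation commands to the measured 3D displacements of the facial points, then score the fit against reference human data. All data below is randomly generated for demonstration.

```python
import numpy as np

# Hypothetical dataset: K actuation trials, U deformation-unit commands per trial,
# and the measured 3D displacement of each of the 116 facial points.
K, U, NUM_POINTS = 200, 8, 116
commands = np.random.rand(K, U)                      # actuation commands in [0, 1]
measured = np.random.rand(K, NUM_POINTS * 3) * 2.0   # flattened x/y/z displacements (mm)

# Least-squares linear model: displacement ≈ commands @ W
W, *_ = np.linalg.lstsq(commands, measured, rcond=None)

# Predict the deformation for a new command and compare it with a stand-in
# human reference displacement for the same expression.
new_command = np.random.rand(1, U)
predicted = new_command @ W
human_reference = np.random.rand(1, NUM_POINTS * 3) * 2.0
rmse = np.sqrt(np.mean((predicted - human_reference) ** 2))
print(f"RMSE vs. human reference: {rmse:.2f} mm")
```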
They ran into trouble with the artificial skin and with balancing the force applied by the deformation units. Even so, they improved Affetto’s fine motor control, allowing the child robot to make much more subtle facial expressions.
“Our precise findings will let us effectively control android facial movements to introduce more nuanced expressions, such as smiling and frowning,” the researchers concluded.