Fig 3 - uploaded by Kentaro Takemura
Citations
... The android's facial expression is controlled using 11 actuators. For a detailed explanation of the android and the experimental setup, please refer to [5]. The positions of the android's feature points are captured with a motion capture system to gather training data for the forward kinematics ANN in section 2.1 (see Fig. 1). ...
The ability of androids to display facial expressions is a key factor in achieving more natural human-robot interaction. However, controlling the facial expressions of robots with elastic facial skin is difficult due to the complexity of modeling the skin deformation. We propose a method that solves the inverse kinematics of android faces, controlling the android's facial expression from target feature points. In our method, an artificial neural network models the forward kinematics, and the inverse kinematics is solved by minimizing a weighted squared error function. We then implement an inverse kinematics solver and evaluate our method on an actual android.
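The pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the small `tanh` network below is a random stand-in for the trained forward-kinematics ANN, the counts (11 actuators, 6 feature points in 2-D) follow the actuator count given in the text plus assumed feature-point dimensions, and the solver uses finite-difference gradient descent with backtracking on the weighted squared error, whereas the paper's exact minimization scheme is not specified here.

```python
import numpy as np

# Stand-in for the trained forward-kinematics ANN: maps 11 actuator
# commands to the coordinates of 6 facial feature points in 2-D
# (12 outputs). The real model is trained on motion-capture data;
# this fixed random network is only for illustration.
rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(16, 11))
W2 = rng.normal(scale=0.5, size=(12, 16))

def forward(u):
    """Predict feature-point coordinates from actuator commands u, shape (11,)."""
    return W2 @ np.tanh(W1 @ u)

def ik_solve(target, weights, steps=300, lr=0.1, eps=1e-5):
    """Minimize the weighted squared error E(u) = sum_i w_i (f(u)_i - t_i)^2
    over actuator commands u, via finite-difference gradient descent."""
    u = np.zeros(11)

    def energy(v):
        r = forward(v) - target
        return float(weights @ r**2)

    for _ in range(steps):
        e0 = energy(u)
        # forward-difference estimate of dE/du
        grad = np.array(
            [(energy(u + eps * np.eye(11)[j]) - e0) / eps for j in range(11)]
        )
        # backtracking line search: shrink the step until the error decreases
        step = lr
        cand = np.clip(u - step * grad, -1.0, 1.0)  # clip to actuator range
        while energy(cand) > e0 and step > 1e-8:
            step *= 0.5
            cand = np.clip(u - step * grad, -1.0, 1.0)
        if energy(cand) >= e0:
            break  # no numerical descent direction left
        u = cand
    return u

# Pick a reachable target by running the forward model on known commands.
u_true = rng.uniform(-0.5, 0.5, size=11)
target = forward(u_true)
w = np.ones(12)                      # uniform feature-point weights
e_init = float(w @ target**2)        # error of the rest pose u = 0
u_hat = ik_solve(target, w)
err = float(w @ (forward(u_hat) - target) ** 2)
```

Because the target is generated by the forward model itself, it is exactly reachable; the solver then drives the weighted error monotonically down from the rest pose. In practice, the per-point weights `w` let the solver prioritize expressive regions (e.g. mouth corners) over less salient feature points.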