8th International Conference
Zakopane, Poland, June 18 – 21, 2013
Automatic classification of static and dynamic words in sign language
Tomasz Dziubich, Julian Szymanski
Gdańsk University of Technology
Faculty of Electronics, Telecommunications and Informatics
{tomasz.dziubich, julian.szymanski}@eti.pg.gda.pl
In this article we present an approach to automatic recognition of hand gestures. The research results have been used to create a system for detecting words of Polish Sign Language, which is widely used in communication by deaf people. The proposed approach opens additional ways of building multimodal Human-Machine Interaction by processing information acquired from monitoring body movements.
We present the detailed architecture of a constructed device called eGlove, built from a set of wireless sensors. Its main element is a printed circuit board with an ATmega128 microcontroller and a Bluetooth module (FLC-BTM403), mounted on the upper surface of a special fabric glove. The device precisely detects the current position of a hand by analysing signals from a magnetometer and tri-axis accelerometers. The sensor signals transferred to the computer have been used to construct a computational model of human hand movements. Analysis of the signals from eGlove allows us to build a classifier that detects particular patterns of hand movements, which we use for the recognition of words in sign language.
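The kind of orientation estimation described above, combining accelerometer and magnetometer readings, can be sketched as follows. This is a minimal illustration under our own assumptions (the function name, axis conventions, and the assumption that the accelerometer measures gravity only are ours, not taken from the paper):

```python
import math

def orientation_from_sensors(ax, ay, az, mx, my, mz):
    """Estimate hand orientation (roll, pitch, yaw in radians) from a
    tri-axis accelerometer (ax, ay, az) and a magnetometer (mx, my, mz).
    Assumes the accelerometer reading is dominated by gravity, i.e. the
    hand is momentarily at rest."""
    # Roll and pitch follow from the direction of the gravity vector.
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # Tilt-compensate the magnetometer to obtain a heading (yaw).
    mxc = mx * math.cos(pitch) + mz * math.sin(pitch)
    myc = (mx * math.sin(roll) * math.sin(pitch)
           + my * math.cos(roll)
           - mz * math.sin(roll) * math.cos(pitch))
    yaw = math.atan2(-myc, mxc)
    return roll, pitch, yaw
```

With the hand flat (gravity along the z axis) and the magnetic field along the x axis, all three angles come out as zero, which serves as a quick sanity check of the conventions.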
We give a brief description of current state-of-the-art methods for sign language detection, provide a detailed description of our system, and report the evaluation methodology used. We evaluate our approach in terms of classification quality and compare the achieved results to those reported in the literature.
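The distinction between static and dynamic words made in the title can be illustrated with a simple motion-energy test on the accelerometer stream. This is our own sketch, not the paper's classifier; the feature (total per-axis variance) and the threshold value are assumptions chosen for illustration:

```python
def is_dynamic(accel_frames, threshold=0.05):
    """Label a gesture as dynamic if the total per-axis variance of its
    accelerometer frames exceeds a threshold. A static hand shape yields
    nearly constant gravity readings, hence variance close to zero."""
    n = len(accel_frames)
    means = [sum(frame[i] for frame in accel_frames) / n for i in range(3)]
    variance = sum((frame[i] - means[i]) ** 2
                   for frame in accel_frames for i in range(3)) / n
    return variance > threshold

# Hypothetical example streams: a resting hand vs. a moving one.
static_word = [(0.0, 0.0, 1.0)] * 10
dynamic_word = [(0.0, 0.0, 1.0), (0.5, 0.2, 0.8)] * 5
```

In a full system a test like this would only gate which classifier handles the gesture; the recognition of the word itself still requires matching the movement pattern against learned templates.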
The results indicate that the usage of eGlove significantly improves hand gesture recognition quality. In the conclusion section we also propose an extension of our system: adding RGB cameras from Microsoft Kinect, which allows multimodal analysis of body language. The promising results of applying eGlove to hand gesture recognition open up other areas of usage, e.g. in the medical domain, where the device could be used for predicting epileptic seizures.