Abstract
This paper proposes a novel approach to continuous Arabic Sign Language recognition. The dataset used contains 40 sentences composed of 80 sign language words and is collected with sensor-based gloves. We propose a novel set of features for the sensor readings based on covariance, smoothness, entropy, and uniformity, together with a classification approach based on a modified polynomial classifier suited to sequential data. The classification scheme is further adapted to exploit the context of the feature vectors by filtering the predicted class labels with median and mode filters. The proposed solution is compared against a vision-based solution and is found to outperform it, yielding an improved sentence recognition rate of 85%.
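To make the abstract-level description concrete, the sketch below illustrates one plausible reading of the two ideas mentioned above: per-window statistical features (covariance, smoothness, entropy, uniformity) computed from glove sensor channels, and mode filtering of the predicted label sequence. The exact feature definitions are not given in the abstract, so the formulas follow standard texture-statistics conventions and the function names (`window_features`, `mode_filter_labels`) are hypothetical, not the authors' implementation.

```python
# Minimal sketch, assuming: a window is an array of shape (samples, channels) of glove
# sensor readings, and predicted labels are integers per frame. Formulas for smoothness,
# entropy, and uniformity follow common texture-statistics definitions; the paper's own
# definitions may differ.
from collections import Counter
import numpy as np

def window_features(window, bins=32):
    """Covariance, smoothness, entropy, and uniformity features for one window."""
    feats = []
    # Covariance between sensor channels (upper triangle, including the diagonal).
    cov = np.cov(window, rowvar=False)
    iu = np.triu_indices(cov.shape[0])
    feats.extend(cov[iu])
    for ch in window.T:
        hist, _ = np.histogram(ch, bins=bins)
        p = hist / max(hist.sum(), 1)
        feats.append(1.0 - 1.0 / (1.0 + ch.var()))             # smoothness
        feats.append(-np.sum(p[p > 0] * np.log2(p[p > 0])))    # entropy
        feats.append(np.sum(p ** 2))                            # uniformity
    return np.asarray(feats)

def mode_filter_labels(labels, k=5):
    """Smooth a per-frame predicted label sequence with a sliding mode filter
    (median filtering of integer labels behaves similarly)."""
    labels = np.asarray(labels)
    out = labels.copy()
    half = k // 2
    for i in range(len(labels)):
        seg = labels[max(0, i - half): i + half + 1]
        out[i] = Counter(seg.tolist()).most_common(1)[0][0]
    return out
```

In this reading, `window_features` would feed the polynomial classifier, and `mode_filter_labels` would post-process its frame-level outputs before sentence-level decoding; the window and filter lengths shown here are placeholders.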
Acknowledgement
The authors gratefully acknowledge the American University of Sharjah for supporting this research through grant FRG14-2-26.