Abstract
Sign language is a highly effective means of communicating with individuals who experience hearing loss. Despite extensive research, many learners find traditional approaches to learning sign language, such as web-based question-and-answer exercises, unengaging. This has motivated newer techniques, such as virtual reality (VR) and gamification, which have shown promising results. In this paper, we describe a gamified, immersive American Sign Language (ASL) learning environment that uses current VR technology to guide learners progressively from numeric to alphabetic ASL. Our hypothesis is that such an environment is more engaging than traditional web-based methods. An initial user study showed that our system scored highly on several dimensions, especially the hedonic factor of novelty, while leaving room for improvement in the pragmatic factor of dependability. Overall, our findings suggest that VR and gamification can improve engagement in ASL learning.
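The "hedonic factor of novelty" and "pragmatic factor of dependability" named above are scales of the User Experience Questionnaire (UEQ), which this evaluation appears to use. As a minimal sketch of how such scale scores are conventionally computed (the item groupings and response values below are hypothetical, not the study's data): each item is answered on a 7-point semantic differential, shifted to the range −3 to +3, and averaged per scale and then across participants.

```python
# Minimal sketch of UEQ-style scale scoring. The item-to-scale mapping and
# the response values are illustrative placeholders, not the paper's data.
from statistics import mean

# Raw answers on the 1..7 semantic-differential scale, one dict per participant,
# keyed by the scale each group of items belongs to.
responses = [
    {"novelty": [6, 7, 6, 5], "dependability": [4, 3, 5, 4]},
    {"novelty": [7, 6, 6, 6], "dependability": [3, 4, 4, 3]},
]

def to_ueq(value: int) -> int:
    """Shift a 1..7 answer to the UEQ's -3..+3 range."""
    return value - 4

def scale_score(participant: dict, scale: str) -> float:
    """Per-participant score for one scale: mean of its transformed items."""
    return mean(to_ueq(v) for v in participant[scale])

for scale in ("novelty", "dependability"):
    overall = mean(scale_score(p, scale) for p in responses)
    print(f"{scale}: {overall:+.2f}")
```

In the UEQ literature, scale means above roughly +0.8 are conventionally read as positive evaluations and those between −0.8 and +0.8 as neutral, which is consistent with the abstract's report of a high novelty score alongside a weaker dependability score.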
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Wang, J., Ivrissimtzis, I., Li, Z., Zhou, Y., Shi, L. (2023). Developing and Evaluating a Novel Gamified Virtual Learning Environment for ASL. In: Abdelnour Nocera, J., Kristín Lárusdóttir, M., Petrie, H., Piccinno, A., Winckler, M. (eds) Human-Computer Interaction – INTERACT 2023. INTERACT 2023. Lecture Notes in Computer Science, vol 14142. Springer, Cham. https://doi.org/10.1007/978-3-031-42280-5_29
DOI: https://doi.org/10.1007/978-3-031-42280-5_29
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-42279-9
Online ISBN: 978-3-031-42280-5
eBook Packages: Computer Science (R0)