A Complete System for the Specification and the Generation of Sign Language Gestures

  • Conference paper
  • Published in: Gesture-Based Communication in Human-Computer Interaction (GW 1999)
  • Part of the book series: Lecture Notes in Computer Science (LNAI, volume 1739)

Abstract

This paper describes GeSsyCa, a system that produces synthetic sign language gestures from a high-level specification. The specification language is based both on a discrete description of space and on a movement decomposition inspired by sign language gestures. Communication gestures are represented by symbolic commands which can be described with qualitative data and translated into spatio-temporal targets that drive a generation system. Such an approach is possible for the class of generation models controlled through key-point information. The generation model used in our approach is composed of a set of sensori-motor servo-loops. Each servo-loop solves its inversion in real time from the direct specification of location targets, while satisfying the psycho-motor laws of biological movement. The whole control system is applied to the synthesis of communication and sign language gestures, and a validation of the synthesized movements is presented.
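
To make the pipeline sketched in the abstract more concrete, the following Python sketch illustrates the general idea of translating symbolic, qualitative gesture commands into spatio-temporal targets and driving an effector toward them with a simple feedback loop. All names, coordinates, and the first-order control law below are illustrative assumptions, not the paper's actual specification language or sensori-motor model; in particular, the sketch omits the psycho-motor laws of biological movement that the real generation model enforces.

import numpy as np

# Hypothetical discretisation of the signing space: each symbolic location
# name maps to a 3-D point in front of the signer (coordinates are invented
# for illustration only).
SIGNING_SPACE = {
    "chest_center":   np.array([0.00, 0.30, 0.25]),
    "shoulder_right": np.array([0.20, 0.45, 0.15]),
    "head_front":     np.array([0.00, 0.60, 0.30]),
}

def targets_from_command(command):
    """Translate a symbolic command into spatio-temporal targets.

    `command` is a list of (location_name, duration_in_seconds) pairs,
    a simplified stand-in for the qualitative gesture description."""
    return [(SIGNING_SPACE[name], duration) for name, duration in command]

def servo_loop(position, target, duration, dt=0.01, gain=10.0):
    """Drive the effector toward one target with first-order feedback.

    This is only a toy version of a sensori-motor servo-loop; the paper's
    loops additionally satisfy psycho-motor laws of biological movement."""
    trajectory = [position.copy()]
    for _ in range(int(duration / dt)):
        error = target - position                 # sensory feedback
        position = position + gain * error * dt   # motor correction
        trajectory.append(position.copy())
    return position, trajectory

# Example: a gesture moving from the chest toward the front of the head.
command = [("chest_center", 0.4), ("head_front", 0.6)]
position = np.array([0.0, 0.2, 0.1])  # arbitrary starting hand position
for target, duration in targets_from_command(command):
    position, path = servo_loop(position, target, duration)
    print(f"reached {position.round(3)} after {len(path) - 1} steps")

In this toy setting each key-point of the symbolic command becomes one target for the loop, which is the property that makes key-point-controlled generation models compatible with a discrete, symbolic description of space.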

Copyright information

© 1999 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Lebourque, T., Gibet, S. (1999). A Complete System for the Specification and the Generation of Sign Language Gestures. In: Braffort, A., Gherbi, R., Gibet, S., Teil, D., Richardson, J. (eds) Gesture-Based Communication in Human-Computer Interaction. GW 1999. Lecture Notes in Computer Science, vol 1739. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-46616-9_20

  • DOI: https://doi.org/10.1007/3-540-46616-9_20

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-66935-7

  • Online ISBN: 978-3-540-46616-1

  • eBook Packages: Springer Book Archive
