
Gestural interaction in the pervasive computing landscape

  • Original paper
  • Published in: e & i Elektrotechnik und Informationstechnik

Pervasive computing postulates the invisible integration of technology into everyday objects in such a way that these objects turn into smart things. Not only is a single object of this kind supposed to represent the interface between the "physical world" of atoms and the "digital world" of bits, but whole landscapes of them are. Interaction with such technology-rich artefacts is supposed to be guided by their affordance, i.e. the ability of an artefact to express the modality of its appropriate use. We study human gesticulation and the manipulation of graspable and movable everyday artefacts as a potentially effective means of interaction with smart things. In this work we consider gestures in the general sense of a movement or a state (posture) of the human body, as well as a movement or state of any physical object resulting from human manipulation. Intuitive "everyday" gestures were collected in a series of user tests, yielding a catalogue of generic body and artefact gesture dynamics. Atomic gestures are described by trajectories of orientation data, while composite gestures are defined by a compositional gesture grammar. The respective recognition mechanisms have been implemented in a general software framework supported by an accelerometer-based sensing system. Showcases involving multiple gesture sensors demonstrate the viability of implicit embedded interaction in real-life scenarios.
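The abstract distinguishes two recognition layers: atomic gestures matched against stored trajectories of orientation data, and composite gestures defined by a compositional grammar over atomic ones. The sketch below illustrates that two-layer idea, assuming orientation samples arrive as (roll, pitch, yaw) tuples; the dynamic-time-warping matcher, the gesture names, the templates, and the toy grammar are illustrative assumptions, not the authors' framework.

```python
# Two-layer gesture recognition sketch (Python 3.9+): atomic gestures are
# matched against stored orientation trajectories, composite gestures
# against a small compositional grammar. All names and template data are
# illustrative assumptions, not the paper's implementation.
from __future__ import annotations
from math import dist  # Euclidean distance between coordinate tuples

Trajectory = list[tuple[float, float, float]]  # (roll, pitch, yaw) samples


def dtw_distance(a: Trajectory, b: Trajectory) -> float:
    """Dynamic-time-warping distance between two orientation trajectories."""
    inf = float("inf")
    cost = [[inf] * (len(b) + 1) for _ in range(len(a) + 1)]
    cost[0][0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            step = dist(a[i - 1], b[j - 1])
            cost[i][j] = step + min(cost[i - 1][j],      # stretch a
                                    cost[i][j - 1],      # stretch b
                                    cost[i - 1][j - 1])  # advance both
    return cost[-1][-1]


def classify_atomic(sample: Trajectory, templates: dict[str, Trajectory]) -> str:
    """Label a sample with the nearest stored template under DTW."""
    return min(templates, key=lambda name: dtw_distance(sample, templates[name]))


# Composite gestures as ordered sequences of atomic gestures: a toy
# stand-in for the paper's compositional gesture grammar.
GRAMMAR: dict[str, tuple[str, ...]] = {
    "answer_call": ("raise", "hold"),
    "put_away": ("raise", "flip"),
}


def match_composite(atomics: list[str]) -> str | None:
    """Return the composite gesture whose production matches the sequence."""
    for name, production in GRAMMAR.items():
        if tuple(atomics) == production:
            return name
    return None


if __name__ == "__main__":
    templates: dict[str, Trajectory] = {  # hypothetical example trajectories
        "raise": [(0.0, 0.0, 0.0), (0.0, 45.0, 0.0), (0.0, 90.0, 0.0)],
        "hold": [(0.0, 90.0, 0.0), (0.0, 90.0, 0.0), (0.0, 90.0, 0.0)],
        "flip": [(0.0, 90.0, 0.0), (90.0, 90.0, 0.0), (180.0, 90.0, 0.0)],
    }
    sample: Trajectory = [(0.0, 4.0, 0.0), (0.0, 52.0, 0.0), (0.0, 87.0, 0.0)]
    first = classify_atomic(sample, templates)       # -> "raise"
    print(first, match_composite([first, "hold"]))   # -> raise answer_call
```

In the paper's setting the sample trajectories would come from the accelerometer-based sensing system; here hard-coded templates stand in for that stage, and a grammar restricted to plain sequencing stands in for the richer compositional grammar.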





Cite this article

Ferscha, A., Resmerita, S. Gestural interaction in the pervasive computing landscape. Elektrotech. Inftech. 124, 17–25 (2007). https://doi.org/10.1007/s00502-006-0413-4
