Guiding Eye Movements for Better Communication and Augmented Vision

  • Conference paper in: Perception and Interactive Technologies (PIT 2006)

Abstract

This paper briefly summarises our results on gaze guidance so as to complement the demonstrations that we plan to present at the workshop. Our goal is to integrate gaze into visual communication systems by measuring and guiding eye movements. Our strategy is to predict a set of about ten salient locations and then change the probability that each of these candidates is attended: for one candidate the probability is increased, for the others it is decreased. To increase saliency, in our current implementation, we show a natural-scene movie and overlay red dots so briefly that they are hardly perceived consciously. To decrease the probability, for example, we locally reduce the temporal frequency content of the movie. Here we present preliminary results showing that the three steps of the above strategy are feasible. The long-term goal is to find the optimal real-time video transformation that minimises the difference between the actual and the desired eye movements without being obtrusive. Applications lie in the areas of vision-based communication, augmented vision, and learning.
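To make the strategy concrete, the following is a minimal sketch of a single gaze-guidance step. It is not the authors' implementation; the function name, parameters, and the numpy-based frame handling are assumptions made for illustration. Given roughly ten predicted salient candidate locations, it briefly overlays a red dot at the chosen target to increase its saliency and blends the regions around the other candidates with the previous frame, locally reducing their temporal frequency content.

```python
# Minimal, hypothetical sketch of one gaze-guidance step (illustration only,
# not the authors' implementation). Frames are H x W x 3 RGB numpy arrays.
import numpy as np

def guide_gaze_step(frame, prev_frame, candidates, target_idx,
                    show_dot, dot_radius=5, suppress_radius=20, blend=0.7):
    """Modify a video frame to nudge attention towards one candidate.

    candidates : list of (x, y) predicted salient locations (about ten)
    target_idx : index of the candidate whose saliency is increased
    show_dot   : True only for the few frames during which the red dot
                 is overlaid, so that it is hardly perceived consciously
    blend      : weight of the previous frame when locally reducing the
                 temporal frequency content around the other candidates
    """
    out = frame.astype(np.float32)
    prev = prev_frame.astype(np.float32)
    h, w = frame.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]

    for i, (cx, cy) in enumerate(candidates):
        dist2 = (xx - cx) ** 2 + (yy - cy) ** 2
        if i == target_idx:
            if show_dot:
                # Increase saliency: brief red dot at the target location.
                out[dist2 <= dot_radius ** 2] = (255.0, 0.0, 0.0)
        else:
            # Decrease saliency: temporal low-pass (blend with the previous
            # frame) in a small region around the non-target candidate.
            mask = dist2 <= suppress_radius ** 2
            out[mask] = blend * prev[mask] + (1.0 - blend) * out[mask]

    return np.clip(out, 0, 255).astype(np.uint8)
```

A real system would additionally gate these modifications on the measured gaze position and on timing, so that the transformation stays gaze-contingent and unobtrusive.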

References

  1. Dorr, M., Böhme, M., Martinetz, T., Barth, E.: Gaze-contingent spatio-temporal filtering in a head-mounted display. In: André, E., Dybkjær, L., Minker, W., Neumann, H., Weber, M. (eds.) PIT 2006. LNCS (LNAI), vol. 4021, pp. 205–207. Springer, Heidelberg (2006)

    Chapter  Google Scholar 

  2. Meyer, A., Böhme, M., Martinetz, T., Barth, E.: A single-camera remote eye tracker. In: André, E., Dybkjær, L., Minker, W., Neumann, H., Weber, M. (eds.) PIT 2006. LNCS (LNAI), vol. 4021, pp. 208–211. Springer, Heidelberg (2006)

    Chapter  Google Scholar 

  3. Dorr, M., Böhme, M., Martinetz, T., Gegenfurtner, K.R., Barth, E.: Analysing and reducing the variability of gaze patterns on natural videos. In: Groner, M., Groner, R., Müri, R., Koga, K., Raess, S., Sury, P. (eds.) Proceedings of 13th European Conference on Eye Movements, p. 35 (2005)

    Google Scholar 

  4. Itti, L., Koch, C., Niebur, E.: A model of saliency-based visual attention for rapid scene analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence 20, 1254–1259 (1998)

    Article  Google Scholar 

  5. Barth, E., Drewes, J., Martinetz, T.: Individual predictions of eye-movements with dynamic scenes. In: Rogowitz, B., Pappas, T. (eds.) Electronic Imaging 2003, SPIE, vol. 5007, pp. 252–259 (2003)

    Google Scholar 

  6. Böhme, M., Dorr, M., Krause, C., Martinetz, T., Barth, E.: Eye movement predictions on natural videos. Neurocomputing (in press, 2006)

    Google Scholar 

  7. Zetzsche, C., Barth, E.: Fundamental limits of linear filters in the visual processing of two-dimensional signals. Vision Research 30, 1111–1117 (1990)

    Article  Google Scholar 

  8. Mota, C., Barth, E.: On the uniqueness of curvature features. In: Baratoff, G., Neumann, H. (eds.) Dynamische Perzeption. Proceedings in Artificial Intelligence, vol. 9, pp. 175–178. Infix Verlag, Köln (2000)

    Google Scholar 

  9. Barth, E., Watson, A.B.: A geometric framework for nonlinear visual coding. Optics Express 7, 155–165 (2000)

    Article  Google Scholar 

  10. Jaehne, B., Haußecker, H., Geißler, P. (eds.): Handbook of Computer Vision and Applications. Academic Press, London (1999)

    MATH  Google Scholar 

  11. Böhme, M., Dorr, M., Krause, C., Martinetz, T., Barth, E.: Eye movement predictions on natural videos. Neurocomputing (in press, 2005)

    Google Scholar 

  12. Böhme, M., Dorr, M., Martinetz, T., Barth, E.: Gaze-contingent temporal filtering of video. In: Eye Tracking Research and Applications (ETRA) (in press, 2006)

    Google Scholar 

  13. Duchowski, A.T., Cournia, N., Murphy, H.: Gaze-contingent displays: A review. CyberPsychology & Behavior 7, 621–634 (2004)

    Article  Google Scholar 

  14. Kortum, P., Geisler, W.: Implementation of a foveated image coding system for image bandwidth reduction. In: Human Vision and Electronic Imaging, SPIE Proceedings, vol. 2657, pp. 350–360 (1996)

    Google Scholar 

  15. Geisler, W.S., Perry, J.S.: A real-time foveated multiresolution system for low-bandwidth video communication. In: Rogowitz, B., Pappas, T. (eds.) Human Vision and Electronic Imaging: SPIE Proceedings, pp. 294–305 (1998)

    Google Scholar 

  16. Perry, J.S., Geisler, W.S.: Gaze-contingent real-time simulation of arbitrary visual fields. In: Rogowitz, B.E., Pappas, T.N. (eds.) Human Vision and Electronic Imaging: Proceedings of SPIE, San Jose, CA, vol. 4662, pp. 57–69 (2002)

    Google Scholar 

  17. Dorr, M., Böhme, M., Martinetz, T., Barth, E.: Visibility of temporal blur on a gaze-contingent display. In: APGV 2005 ACM SIGGRAPH Symposium on Applied Perception in Graphics and Visualization, pp. 33–36 (2005)

    Google Scholar 

Download references

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Barth, E., Dorr, M., Böhme, M., Gegenfurtner, K., Martinetz, T. (2006). Guiding Eye Movements for Better Communication and Augmented Vision. In: André, E., Dybkjær, L., Minker, W., Neumann, H., Weber, M. (eds) Perception and Interactive Technologies. PIT 2006. Lecture Notes in Computer Science(), vol 4021. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11768029_1

  • DOI: https://doi.org/10.1007/11768029_1

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-34743-9

  • Online ISBN: 978-3-540-34744-6
