Comparison of Unobtrusive Visual Guidance Methods in an Immersive Dome Environment

Published: 19 September 2018

Abstract

In this article, we evaluate several image-space modulation techniques that aim to guide viewers' attention unobtrusively. While previous evaluations mainly target desktop settings, we examine the techniques' applicability to ultra-wide field-of-view immersive environments, which feature technical characteristics expected of future-generation head-mounted displays. Our experiments use a custom-built, high-resolution immersive dome environment with high-precision eye tracking. We investigate the gaze-guidance success rates and unobtrusiveness of five different techniques. Our results show promising guiding performance for four of the tested methods. With regard to unobtrusiveness, we find that, while no method remains completely unnoticed, many participants do not report any distraction. The evaluated methods show promise for guiding users' attention in a wide range of virtual-environment applications, e.g., virtually guided tours or field-operation training.
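To make the class of techniques concrete: a common image-space approach (in the spirit of subtle gaze direction) briefly boosts local luminance at the target location and removes the stimulus as the viewer's gaze approaches it, so the cue is never inspected foveally. The sketch below is a hypothetical minimal illustration of this idea, not the authors' implementation; the function name, parameters, and defaults (`sigma`, `strength`, `suppress_radius`) are assumptions chosen for clarity.

```python
import numpy as np

def subtle_gaze_modulation(image, target, gaze,
                           sigma=12.0, strength=0.15, suppress_radius=60.0):
    """Illustrative gaze-contingent luminance boost (not the paper's method).

    image:  float array in [0, 1], shape (H, W) or (H, W, 3)
    target: (x, y) pixel position to draw attention to
    gaze:   (x, y) current gaze position in image coordinates
    """
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    # Gaussian modulation mask centred on the target location.
    mask = np.exp(-((xs - target[0]) ** 2 + (ys - target[1]) ** 2)
                  / (2.0 * sigma ** 2))
    # Fade the stimulus out as gaze nears the target, so it stays
    # confined to peripheral vision (zero when fixating the target).
    dist = np.hypot(gaze[0] - target[0], gaze[1] - target[1])
    scale = strength * min(1.0, dist / suppress_radius)
    if image.ndim == 3:
        mask = mask[..., None]
    return np.clip(image + scale * mask, 0.0, 1.0)
```

In a real-time setting this update would run per frame, driven by the eye tracker's latest gaze sample; the suppression radius trades off stimulus visibility against the risk of foveal detection.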





Published In

ACM Transactions on Applied Perception, Volume 15, Issue 4
October 2018
57 pages
ISSN: 1544-3558
EISSN: 1544-3965
DOI: 10.1145/3280853
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 19 September 2018
Accepted: 01 June 2018
Revised: 01 June 2018
Received: 01 May 2018
Published in TAP Volume 15, Issue 4


Author Tags

  1. virtual reality
  2. dome
  3. eye tracking
  4. perception
  5. post-processing
  6. unobtrusive gaze guidance

Qualifiers

  • Research-article
  • Research
  • Refereed

Funding Sources

  • German Science Foundation


Article Metrics

  • Downloads (last 12 months): 51
  • Downloads (last 6 weeks): 5
Reflects downloads up to 26 November 2024.

Cited By
  • (2024) HiveFive360: Extending the VR Gaze Guidance Technique HiveFive to Highlight Out-Of-FOV Targets. Proceedings of Mensch und Computer 2024. 10.1145/3670653.3670662 (11-20). Online publication date: 1-Sep-2024.
  • (2024) Vision-Based Assistive Technologies for People with Cerebral Visual Impairment: A Review and Focus Study. Proceedings of the 26th International ACM SIGACCESS Conference on Computers and Accessibility. 10.1145/3663548.3675637 (1-20). Online publication date: 27-Oct-2024.
  • (2024) Flicker Augmentations: Rapid Brightness Modulation for Real-World Visual Guidance using Augmented Reality. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems. 10.1145/3613904.3642085 (1-19). Online publication date: 11-May-2024.
  • (2023) Unobtrusive interaction: a systematic literature review and expert survey. Human–Computer Interaction. 10.1080/07370024.2022.2162404, 39:5-6 (380-416). Online publication date: Mar-2023.
  • (2022) Two-Step Gaze Guidance. Proceedings of the 2022 International Conference on Multimodal Interaction. 10.1145/3536221.3556612 (299-309). Online publication date: 7-Nov-2022.
  • (2022) The Eye in Extended Reality: A Survey on Gaze Interaction and Eye Tracking in Head-worn Extended Reality. ACM Computing Surveys. 10.1145/3491207, 55:3 (1-39). Online publication date: 25-Mar-2022.
  • (2022) Extended Reality Visual Guidance for Industrial Environments: A Scoping Review. 2022 IEEE 3rd International Conference on Human-Machine Systems (ICHMS). 10.1109/ICHMS56717.2022.9980683 (1-6). Online publication date: 17-Nov-2022.
  • (2022) Strategies to reduce visual attention changes while learning and training in extended reality environments. International Journal on Interactive Design and Manufacturing (IJIDeM). 10.1007/s12008-022-01092-9, 17:1 (17-43). Online publication date: 11-Dec-2022.
  • (2021) The Use of Embedded Context-Sensitive Attractors for Clinical Walking Test Guidance in Virtual Reality. Frontiers in Virtual Reality. 10.3389/frvir.2021.621965, 2. Online publication date: 29-Apr-2021.
  • (2021) IlluminatedZoom: spatially varying magnified vision using periodically zooming eyeglasses and a high-speed projector. Optics Express. 10.1364/OE.427616, 29:11 (16377). Online publication date: 12-May-2021.
