
DOI: 10.1145/3009939.3009952
Public Access

Gaze-directed Immersive Visualization of Scientific Ensembles

Published: 06 November 2016

Abstract

The latest advances in head-mounted displays (HMDs) for augmented reality (AR) and mixed reality (MR) have produced commercialized devices that are gradually gaining public acceptance. These HMDs are generally equipped with head tracking, which provides an excellent input for exploring immersive visualization and interaction techniques in various AR/MR applications. This paper explores the head-tracking function of the latest Microsoft HoloLens, where gaze is defined as the ray starting at the head location and pointing forward. We present a gaze-directed visualization approach to studying ensembles of 2D oil spill simulations in mixed reality. Our approach allows users to place an ensemble as an image stack in a real environment and explore the ensemble with gaze tracking. The prototype system demonstrates the challenges and promising effects of gaze-based interaction in state-of-the-art mixed reality.
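The abstract defines gaze as a ray originating at the head position and pointing along the head's forward direction, and uses it to select slices of an ensemble placed as an image stack in the environment. Below is a minimal sketch of such gaze-ray picking in plain Python with NumPy; the square, axis-aligned slice geometry and all function names are illustrative assumptions, not the paper's actual HoloLens (Unity/C#) implementation.

```python
import numpy as np

def gaze_ray(head_pos, head_forward):
    """Head-gaze ray as defined in the paper: origin at the head
    position, direction along the (normalized) head forward vector."""
    origin = np.asarray(head_pos, dtype=float)
    direction = np.asarray(head_forward, dtype=float)
    return origin, direction / np.linalg.norm(direction)

def pick_slice(origin, direction, slice_centers, normal, half_extent):
    """Return the index of the nearest image slice hit by the gaze ray,
    or None. Each slice is modeled (an assumption) as a square of side
    2*half_extent, centered at slice_centers[i], with one axis-aligned
    unit normal shared by the whole stack."""
    normal = np.asarray(normal, dtype=float)
    in_plane = 1.0 - np.abs(normal)        # masks out the normal axis
    denom = direction.dot(normal)
    if abs(denom) < 1e-9:                  # gaze parallel to the slices
        return None
    best, best_t = None, np.inf
    for i, center in enumerate(slice_centers):
        center = np.asarray(center, dtype=float)
        t = (center - origin).dot(normal) / denom
        if t <= 0.0 or t >= best_t:        # behind the viewer, or farther
            continue
        hit = origin + t * direction       # ray-plane intersection point
        if np.all(np.abs(hit - center) * in_plane <= half_extent):
            best, best_t = i, t
    return best

# Example: a 10-member ensemble stacked along z, viewed head-on.
origin, direction = gaze_ray([0.0, 1.6, 0.0], [0.0, 0.0, 1.0])
centers = [(0.0, 1.6, 2.0 + 0.1 * k) for k in range(10)]
print(pick_slice(origin, direction, centers, [0.0, 0.0, 1.0], 0.5))  # -> 0
```

Returning the nearest hit mirrors how a head-gaze cursor naturally lands on the front of the stack; how the authors step through or disambiguate occluded slices is not specified in this abstract.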


Cited By

  • (2021) Survey of Immersive Analytics. IEEE Transactions on Visualization and Computer Graphics 27(3), 2101-2122. DOI: 10.1109/TVCG.2019.2929033. Online publication date: 1-Mar-2021.
  • (2017) [POSTER] HoloBee: Augmented Reality Based Bee Drift Analysis. 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), 87-92. DOI: 10.1109/ISMAR-Adjunct.2017.38. Online publication date: Oct-2017.
  • (2017) Augmented Reality Based Bee Drift Analysis: A User Study. 2017 International Symposium on Big Data Visual Analytics (BDVA), 1-8. DOI: 10.1109/BDVA.2017.8114581. Online publication date: Nov-2017.


Information

Published In

ISS '16 Companion: Proceedings of the 2016 ACM Companion on Interactive Surfaces and Spaces
November 2016
136 pages
ISBN: 9781450345309
DOI: 10.1145/3009939
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. augmented reality
  2. gaze-based interaction
  3. immersive visualization
  4. scientific ensemble

Qualifiers

  • Abstract

Conference

ISS '16
Acceptance Rates

Overall Acceptance Rate 147 of 533 submissions, 28%


Article Metrics

  • Downloads (Last 12 months): 74
  • Downloads (Last 6 weeks): 14

Reflects downloads up to 01 Oct 2024

