DOI: 10.1145/3379156.3391367

Exploiting the GBVS for Saliency aware Gaze Heatmaps

Published: 02 June 2020

Abstract

Analyzing visual perception in scene images is dominated by two approaches: (1) eye tracking, which measures visual focus directly by mapping detected fixations onto a scene image, and (2) saliency maps, which predict the perceivability of a scene region by assessing the emitted visual stimulus with respect to retinal feature extraction. One of the best-known algorithms for computing saliency maps is Graph-Based Visual Saliency (GBVS). In this work, we propose a novel visualization method that generates a joint fixation-saliency heatmap. By incorporating a tracked gaze signal into the GBVS, the proposed method equilibrates fixation frequency and duration against the scene stimulus, and thus visualizes the rate at which the spectator extracts the visual stimulus.
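The fusion described in the abstract can be sketched roughly as follows. This is a simplified illustration, not the authors' exact GBVS integration: it assumes a precomputed saliency map (e.g. produced by a GBVS implementation) and a list of fixations given as (x, y, duration) tuples; the function names, the duration weighting, and the elementwise combination are our own assumptions.

```python
import numpy as np


def _gaussian_blur(img, sigma):
    """Separable Gaussian blur in plain NumPy (rows, then columns)."""
    radius = max(1, int(3 * sigma))
    xs = np.arange(-radius, radius + 1)
    kernel = np.exp(-xs ** 2 / (2.0 * sigma ** 2))
    kernel /= kernel.sum()
    img = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    img = np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, img)
    return img


def joint_fixation_saliency_heatmap(fixations, saliency, sigma=15.0):
    """Fuse duration-weighted fixations with a saliency map.

    fixations: iterable of (x, y, duration) tuples in image coordinates
    saliency:  2-D array (H x W), e.g. a GBVS map scaled to [0, 1]
    sigma:     Gaussian spread in pixels, modelling the foveal region
    """
    h, w = saliency.shape
    density = np.zeros((h, w), dtype=float)
    for x, y, dur in fixations:
        if 0 <= int(y) < h and 0 <= int(x) < w:
            density[int(y), int(x)] += dur  # accumulate fixation duration
    density = _gaussian_blur(density, sigma)
    if density.max() > 0:
        density /= density.max()  # normalize attention to [0, 1]
    joint = density * saliency    # weight attention by stimulus strength
    if joint.max() > 0:
        joint /= joint.max()
    return joint
```

Multiplying the fixation density by the saliency map is one plausible way to relate what was looked at to what the stimulus offered; the paper's actual method couples the gaze signal into the GBVS computation itself.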



Published In

cover image ACM Conferences
ETRA '20 Short Papers: ACM Symposium on Eye Tracking Research and Applications
June 2020
305 pages
ISBN:9781450371346
DOI:10.1145/3379156

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Eye-Tracking
  2. Scene Evaluation
  3. Visual Perception
  4. Visual Stimulus

Qualifiers

  • Short-paper
  • Research
  • Refereed limited

Conference

ETRA '20

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%



Cited By

  • (2024) (The limits of) eye-tracking with iPads. Journal of Vision 24:7(1). DOI: 10.1167/jov.24.7.1. Online publication date: 2-Jul-2024
  • (2023) Exploring the Effects of Scanpath Feature Engineering for Supervised Image Classification Models. Proceedings of the ACM on Human-Computer Interaction 7:ETRA(1-18). DOI: 10.1145/3591130. Online publication date: 18-May-2023
  • (2023) Leveraging Saliency-Aware Gaze Heatmaps for Multiperspective Teaching of Unknown Objects. 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 7846-7853. DOI: 10.1109/IROS55552.2023.10342312. Online publication date: 1-Oct-2023
  • (2021) Saliency-Based Gaze Visualization for Eye Movement Analysis. Sensors 21:15(5178). DOI: 10.3390/s21155178. Online publication date: 30-Jul-2021
  • (2021) Glimpse: A Gaze-Based Measure of Temporal Salience. Sensors 21:9(3099). DOI: 10.3390/s21093099. Online publication date: 29-Apr-2021
