

BubbleView: An Interface for Crowdsourcing Image Importance Maps and Tracking Visual Attention

Published: 13 November 2017

Abstract

In this article, we present BubbleView, an alternative methodology for eye tracking using discrete mouse clicks to measure which information people consciously choose to examine. BubbleView is a mouse-contingent, moving-window interface in which participants are presented with a series of blurred images and click to reveal “bubbles” -- small, circular areas of the image at original resolution, similar to having a confined area of focus like the eye fovea. Across 10 experiments with 28 different parameter combinations, we evaluated BubbleView on a variety of image types: information visualizations, natural images, static webpages, and graphic designs, and compared the clicks to eye fixations collected with eye-trackers in controlled lab settings. We found that BubbleView clicks can both (i) successfully approximate eye fixations on different images, and (ii) be used to rank image and design elements by importance. BubbleView is designed to collect clicks on static images, and works best for defined tasks such as describing the content of an information visualization or measuring image importance. BubbleView data is cleaner and more consistent than related methodologies that use continuous mouse movements. Our analyses validate the use of mouse-contingent, moving-window methodologies as approximating eye fixations for different image and task types.
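The interface mechanics described in the abstract (blur the image, reveal a sharp circular "bubble" at each click, then aggregate clicks into an importance map, analogous to how fixation maps are built) can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the function names, the box-blur stand-in for Gaussian blur, and the blob sigma are all assumptions.

```python
import numpy as np

def bubble_view(img, click_xy, bubble_radius=30, blur_passes=4):
    """Return the mouse-contingent view: the image blurred everywhere
    except a sharp circular bubble centered on the last click.
    `img` is a 2D grayscale array; `click_xy` is (x, y) in pixels."""
    # Cheap separable box blur as a stand-in for Gaussian blur.
    # (np.roll wraps at the edges; acceptable for a sketch.)
    blurred = img.astype(float)
    for _ in range(blur_passes):
        blurred = (np.roll(blurred, 1, 0) + np.roll(blurred, -1, 0)
                   + np.roll(blurred, 1, 1) + np.roll(blurred, -1, 1)
                   + blurred) / 5.0
    # Circular mask: True inside the bubble, False elsewhere.
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    cx, cy = click_xy
    inside = (xx - cx) ** 2 + (yy - cy) ** 2 <= bubble_radius ** 2
    # Composite: original-resolution pixels inside, blurred outside.
    out = blurred.copy()
    out[inside] = img[inside]
    return out

def click_map(clicks, shape, sigma=20.0):
    """Aggregate a list of (x, y) clicks into a continuous importance
    map by placing a Gaussian blob at each click, mirroring the usual
    construction of fixation maps from eye-tracking data."""
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    m = np.zeros(shape, float)
    for cx, cy in clicks:
        m += np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2))
    return m / m.max() if m.max() > 0 else m
```

In the actual experiments the blur level and bubble radius were among the parameters varied across conditions; the resulting click maps can then be compared against fixation maps with standard saliency metrics.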

Supplementary Material

a36-kim-supp.pdf (kim.zip)
Supplemental movie, appendix, image, and software files for BubbleView: An Interface for Crowdsourcing Image Importance Maps and Tracking Visual Attention



Index Terms

  1. BubbleView: An Interface for Crowdsourcing Image Importance Maps and Tracking Visual Attention

    Recommendations

    Comments

    Please enable JavaScript to view thecomments powered by Disqus.

    Information & Contributors

    Information

    Published In

ACM Transactions on Computer-Human Interaction, Volume 24, Issue 5
    October 2017
    167 pages
    ISSN:1073-0516
    EISSN:1557-7325
    DOI:10.1145/3149825

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 13 November 2017
    Accepted: 01 July 2017
    Revised: 01 June 2017
    Received: 01 February 2017
    Published in TOCHI Volume 24, Issue 5


    Author Tags

    1. Human vision
    2. crowdsourcing
    3. eye tracking
    4. graphic designs
    5. image importance
    6. information visualizations
    7. mouse-contingent interface
    8. natural scenes
    9. saliency
    10. visual attention
    11. websites


    Funding Sources

    • Toyota Research Institute/MIT CSAIL Joint Research Center
    • Google, Xerox, the NSF Graduate Research Fellowship Program
    • Kwanjeong Educational Foundation
    • Natural Sciences and Engineering Research Council of Canada

