DOI: 10.1145/3204493.3204546
Research article · Open access

Leveraging eye-gaze and time-series features to predict user interests and build a recommendation model for visual analysis

Published: 14 June 2018

Abstract

We developed a new concept that improves the efficiency of visual analysis through visual recommendations. It uses a novel eye-gaze-based recommendation model that aids users in identifying interesting time-series patterns. Our model combines time-series features with eye-gaze interest captured via an eye tracker, and also takes mouse selections into account. The system provides an overlay visualization of recommended patterns and an eye-history graph that support users during data exploration. We conducted an experiment with five tasks in which 30 participants explored sensor data from a wind turbine. This work presents results on pre-attentive features and discusses the precision/recall of our model compared with the final selections made by users. Our model helps users identify interesting time-series patterns efficiently.
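
As a rough illustration of the idea described above, the following Python sketch combines a gaze-derived interest score with a simple time-series similarity measure to rank candidate segments, and then compares the recommendations against the user's final selections via precision/recall. The segment representation, the similarity function, the gaze weighting, and all names here are assumptions for illustration only; the paper's actual model and formulas are not reproduced.

    # Hypothetical sketch: gaze-weighted time-series pattern recommendation.
    # The weights, similarity measure, and data layout are illustrative assumptions.
    import numpy as np

    def znorm(x):
        """Z-normalize a segment so similarity ignores offset and scale."""
        s = np.std(x)
        return (x - np.mean(x)) / s if s > 0 else x - np.mean(x)

    def similarity(a, b):
        """Similarity in (0, 1] derived from Euclidean distance of z-normed segments."""
        d = np.linalg.norm(znorm(a) - znorm(b))
        return 1.0 / (1.0 + d)

    def gaze_interest(fixations, n_segments):
        """Accumulate fixation duration per on-screen segment and normalize.
        `fixations` is a list of (segment_index, duration_ms) pairs."""
        interest = np.zeros(n_segments)
        for seg, dur in fixations:
            interest[seg] += dur
        total = interest.sum()
        return interest / total if total > 0 else interest

    def recommend(segments, fixations, top_k=3, w_gaze=0.5):
        """Rank segments by a weighted mix of gaze interest and similarity
        to the most-viewed segment (assumed to be the pattern of interest)."""
        interest = gaze_interest(fixations, len(segments))
        focus = segments[int(np.argmax(interest))]
        scores = [w_gaze * interest[i] + (1 - w_gaze) * similarity(seg, focus)
                  for i, seg in enumerate(segments)]
        return list(np.argsort(scores)[::-1][:top_k])

    def precision_recall(recommended, selected):
        """Compare recommended segment indices with the user's final selections."""
        rec, sel = set(recommended), set(selected)
        hits = len(rec & sel)
        precision = hits / len(rec) if rec else 0.0
        recall = hits / len(sel) if sel else 0.0
        return precision, recall

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Ten synthetic time-series segments standing in for wind-turbine sensor data.
        segments = [np.sin(np.linspace(0, 4, 50) + p) + rng.normal(0, 0.1, 50)
                    for p in rng.uniform(0, 6, 10)]
        fixations = [(2, 480), (2, 350), (5, 120), (7, 90)]  # (segment index, duration in ms)
        recs = recommend(segments, fixations)
        print("recommended:", recs)
        print("precision/recall:", precision_recall(recs, selected=[2, 5]))

In practice the gaze weight and the similarity measure (e.g., a symbolic or warping-based distance) would be tuned to the data; this sketch only shows how gaze interest and time-series features can be fused into a single ranking score.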




Published In

ETRA '18: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications
June 2018
595 pages
ISBN:9781450357067
DOI:10.1145/3204493
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. evaluation
  2. eye-tracking
  3. model
  4. recommend
  5. similarity
  6. time-series
  7. visual analytics

Qualifiers

  • Research-article

Funding Sources

  • TU Graz Open Access Publishing Fund

Conference

ETRA '18

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%


Cited By

  • (2024) Predictive Gaze Analytics: A Comparative Case Study of the Foretelling Signs of User Performance during Interaction with Visualizations of Ontology Class Hierarchies. Multimodal Technologies and Interaction 8(10), 90. DOI: 10.3390/mti8100090. Online publication date: 12-Oct-2024
  • (2023) iBall: Augmenting Basketball Videos with Gaze-moderated Embedded Visualizations. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3544548.3581266. Online publication date: 19-Apr-2023
  • (2023) A Design Space for Surfacing Content Recommendations in Visual Analytic Platforms. IEEE Transactions on Visualization and Computer Graphics 29(1), 84-94. DOI: 10.1109/TVCG.2022.3209445. Online publication date: Jan-2023
  • (2023) Continuous Gaze Tracking With Implicit Saliency-Aware Calibration on Mobile Devices. IEEE Transactions on Mobile Computing 22(10), 5816-5828. DOI: 10.1109/TMC.2022.3185134. Online publication date: 1-Oct-2023
  • (2023) Visual Data Science for Industrial Applications. Digital Transformation, 447-471. DOI: 10.1007/978-3-662-65004-2_18. Online publication date: 3-Feb-2023
  • (2022) Impending Success or Failure? An Investigation of Gaze-Based User Predictions During Interaction with Ontology Visualizations. Proceedings of the 2022 International Conference on Advanced Visual Interfaces, 1-9. DOI: 10.1145/3531073.3531081. Online publication date: 6-Jun-2022
  • (2022) Eye Gaze on Scatterplot: Concept and First Results of Recommendations for Exploration of SPLOMs Using Implicit Data Selection. 2022 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3517031.3531165. Online publication date: 8-Jun-2022
  • (2022) Identifying Distractors for People with Computer Anxiety Based on Mouse Fixations. Interacting with Computers 35(2), 165-190. DOI: 10.1093/iwc/iwac025. Online publication date: 11-Oct-2022
  • (2021) Gaze Interaction and People with Computer Anxiety. Proceedings of the XX Brazilian Symposium on Human Factors in Computing Systems, 1-12. DOI: 10.1145/3472301.3484319. Online publication date: 18-Oct-2021
  • (2021) MemX. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5(2), 1-23. DOI: 10.1145/3463509. Online publication date: 24-Jun-2021
