UsyBus: A Communication Framework among Reusable Agents integrating Eye-Tracking in Interactive Applications

Published: 17 June 2022

Abstract

Eye movement analysis is a popular method for evaluating whether a user interface meets users' requirements and abilities. However, with current tools, setting up a usability evaluation with an eye-tracker is resource-consuming, since the areas of interest must be defined manually and exhaustively, and redefined each time the user interface changes. The process is also error-prone, since eye movement data must be finely synchronized with user interface changes. These issues become more serious when the user interface layout changes dynamically in response to user actions. In addition, current tools do not integrate easily into interactive applications, so opportunistic code must be written to link them to user interfaces. To address these shortcomings and to leverage the capabilities of eye-tracking, we present UsyBus, a communication framework enabling tight coupling among autonomous, reusable agents. These agents are responsible for collecting data from eye-trackers, analyzing eye movements, and managing communication with the other modules of an interactive application. UsyBus accepts multiple heterogeneous eye-trackers as input and provides multiple configurable outputs depending on the data to be exploited. Modules exchange data over the UsyBus communication framework, forming a customizable multi-agent architecture. UsyBus application domains range from usability evaluation to the design of gaze interaction applications. Two case studies, composed of reusable modules from our portfolio, exemplify the implementation of the UsyBus framework.
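
The abstract describes a message-based, multi-agent architecture in which eye-tracking agents and user-interface modules exchange data over a shared communication framework, and in which areas of interest can follow a dynamically changing layout. The abstract does not expose the actual UsyBus API, so the sketch below is only a generic publish/subscribe illustration of that idea: the names (Bus, GazeAnalysisAgent, UiModule, the "gaze.fixation" topic) are invented for illustration, and the in-process bus stands in for whatever transport the real framework uses.

```python
# Illustrative sketch only: a minimal in-process publish/subscribe bus and two
# hypothetical agents, conveying the decoupled architecture the abstract describes.
from collections import defaultdict
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple


class Bus:
    """Minimal publish/subscribe bus standing in for the communication layer."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(message)


@dataclass
class Fixation:
    x: float
    y: float
    duration_ms: float


class GazeAnalysisAgent:
    """Hypothetical agent: turns eye-tracker output into fixation events on the bus."""

    def __init__(self, bus: Bus) -> None:
        self.bus = bus

    def on_fixation(self, fixation: Fixation) -> None:
        self.bus.publish(
            "gaze.fixation",
            {"x": fixation.x, "y": fixation.y, "duration_ms": fixation.duration_ms},
        )


class UiModule:
    """Hypothetical application module: owns its (possibly dynamic) areas of interest."""

    def __init__(self, bus: Bus) -> None:
        # AOIs are declared by the UI itself, so they can be updated whenever
        # the layout changes instead of being redefined by hand offline.
        self.aois: Dict[str, Tuple[float, float, float, float]] = {
            "search_button": (100, 50, 180, 90),
        }
        bus.subscribe("gaze.fixation", self.on_fixation)

    def on_fixation(self, msg: dict) -> None:
        for name, (x1, y1, x2, y2) in self.aois.items():
            if x1 <= msg["x"] <= x2 and y1 <= msg["y"] <= y2:
                print(f"Fixation of {msg['duration_ms']} ms on AOI '{name}'")


if __name__ == "__main__":
    bus = Bus()
    UiModule(bus)
    agent = GazeAnalysisAgent(bus)
    agent.on_fixation(Fixation(x=140, y=70, duration_ms=250))
```

Because the UI module resolves fixations against the areas of interest it currently owns, AOIs track layout changes automatically; the eye-tracking side never needs to know how the interface is arranged.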

Supplementary Material

MP4 File (v6eics157.mp4)
Supplemental video

Published In

Proceedings of the ACM on Human-Computer Interaction, Volume 6, Issue EICS
June 2022, 736 pages
EISSN: 2573-0142
DOI: 10.1145/3544787

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 17 June 2022
Published in PACMHCI Volume 6, Issue EICS

Author Tags

  1. communication data bus
  2. eye movement analysis
  3. eye-tracker
  4. eye-tracking
  5. oculometry
  6. usability engineering

Qualifiers

  • Research-article

Funding Sources

  1. Laboratoire d'Informatique de Grenoble (LIG)
  • Wallonie Bruxelles International (WBI)
  • Agence Nationale de la Recherche (ANR)
