
DOI: 10.1145/2578153.2578215

Collaborative eye tracking for image analysis

Published: 26 March 2014

Abstract

We present a framework for collaborative image analysis in which gaze information is shared across all users. A server gathers fixation data from all clients and broadcasts it back to them, and each client visualizes this information. Several visualization options are provided. The system can run in real time, or gaze information can be recorded and shared the next time an image is accessed. Our framework scales to large numbers of clients with different eye tracking devices. To evaluate our system, we used it within the context of a spot-the-differences game. Subjects were presented with 10 image pairs, each containing 5 differences, and were given one minute to detect the differences in each image. Our study was divided into three sessions. In session 1, subjects completed the task individually; in session 2, pairs of subjects completed the task without gaze sharing; and in session 3, pairs of subjects completed the task with gaze sharing. We measured accuracy, time-to-completion, and visual coverage over each image to evaluate the performance of subjects in each session. We found that visualizing shared gaze information by graying out previously scrutinized regions of an image significantly increases the dwell time in the areas of the images that are relevant to the task (i.e., the regions where differences actually occurred). Furthermore, accuracy and time-to-completion also improved over collaboration without gaze sharing, though the effects were not significant. Our framework is useful for a wide range of image analysis applications that can benefit from a collaborative approach.
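The abstract describes a client-server pipeline in which the server relays every user's fixations to all connected clients, and a visualization that grays out regions already scrutinized. The sketch below illustrates one way such a gray-out rendering could work on the client side; it is not the authors' implementation, and all names and parameter values (CoverageMap, add_fixation, render, RADIUS_PX, GRAY_FACTOR) are hypothetical assumptions for illustration.

    # Minimal sketch (not the authors' implementation) of a "gray out
    # previously scrutinized regions" visualization, as described in the
    # abstract. Names and parameter values are assumptions.
    import numpy as np

    RADIUS_PX = 40      # assumed radius of the region covered by one fixation
    GRAY_FACTOR = 0.6   # assumed strength of the gray-out effect (0..1)

    class CoverageMap:
        """Accumulates fixations shared by the server for one image and
        renders already-scrutinized regions with reduced contrast."""

        def __init__(self, height, width):
            self.visited = np.zeros((height, width), dtype=bool)

        def add_fixation(self, x, y):
            """Mark a disk of RADIUS_PX pixels around a fixation as visited.
            Called for every fixation the server broadcasts, regardless of
            which collaborator produced it."""
            h, w = self.visited.shape
            ys, xs = np.ogrid[:h, :w]
            self.visited |= (xs - x) ** 2 + (ys - y) ** 2 <= RADIUS_PX ** 2

        def render(self, image):
            """Return a copy of `image` (H x W x 3, uint8) in which visited
            regions are desaturated and pulled toward mid-gray, leaving
            unexplored regions at full contrast."""
            out = image.astype(np.float32)
            luma = out.mean(axis=2, keepdims=True)               # grayscale
            dimmed = (1 - GRAY_FACTOR) * luma + GRAY_FACTOR * 128.0
            return np.where(self.visited[..., None], dimmed, out).astype(np.uint8)

In such a design, each client would call add_fixation for every fixation message relayed by the server (its own and its partner's) and call render before redrawing the frame; for the recorded mode mentioned in the abstract, the same fixation messages could simply be replayed the next time the image is opened.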


Cited By

  • (2021) "Multi-Sensor Eye-Tracking Systems and Tools for Capturing Student Attention and Understanding Engagement in Learning: A Review". IEEE Sensors Journal 21(20), 22402-22413. DOI: 10.1109/JSEN.2021.3105706. Online publication date: 15-Oct-2021.
  • (2021) "Shared Gaze Visualizations in Collaborative Interactions: Past, Present and Future". Interacting with Computers 33(2), 115-133. DOI: 10.1093/iwcomp/iwab015. Online publication date: 18-May-2021.



Published In

ETRA '14: Proceedings of the Symposium on Eye Tracking Research and Applications
March 2014
394 pages
ISBN:9781450327510
DOI:10.1145/2578153
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 26 March 2014


Author Tags

  1. collaboration
  2. eye-tracking
  3. image analysis

Qualifiers

  • Research-article

Conference

ETRA '14: Eye Tracking Research and Applications
March 26-28, 2014
Safety Harbor, Florida, USA

Acceptance Rates

Overall Acceptance Rate 69 of 137 submissions, 50%



Article Metrics

  • Downloads (last 12 months): 14
  • Downloads (last 6 weeks): 1
Reflects downloads up to 29 Sep 2024

