
DOI: 10.1145/2950112.2964584

Exploratory study with eye tracking devices to build interactive systems for air traffic controllers

Published: 14 September 2016

Abstract

While the mouse remains the main input device for interacting with screens, many alternatives exist. In this article, we report an exploratory study on the use of gaze as a new input device for Air Traffic Control systems. Our investigation, based on a user-centered design approach, includes a study of the controllers' activity, a classification of interaction techniques based on eye tracking systems, and finally a working prototype together with evaluations of the developed interaction techniques. Our goal is to investigate gaze as a means of interaction and to give recommendations for the future development of Air Traffic Control systems.
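To make the idea of gaze-based interaction concrete, the sketch below illustrates one of the simplest techniques in this design space, dwell-time selection: a target is activated once the gaze remains on it for a fixed duration. This is a minimal illustrative sketch in Python, not the prototype or the interaction techniques evaluated in the paper; the GazeSample type, the target bounds, and the 0.8 s dwell threshold are assumptions chosen for the example.

from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float   # gaze x coordinate in screen pixels
    y: float   # gaze y coordinate in screen pixels
    t: float   # timestamp in seconds

def dwell_select(samples, target_bounds, dwell_threshold=0.8):
    """Return True if the gaze stays inside target_bounds for dwell_threshold seconds.

    target_bounds is (x_min, y_min, x_max, y_max); samples must be time-ordered.
    """
    x_min, y_min, x_max, y_max = target_bounds
    dwell_start = None
    for s in samples:
        inside = x_min <= s.x <= x_max and y_min <= s.y <= y_max
        if inside:
            if dwell_start is None:
                dwell_start = s.t          # gaze entered the target: start the timer
            elif s.t - dwell_start >= dwell_threshold:
                return True                # dwell long enough: trigger the selection
        else:
            dwell_start = None             # gaze left the target: reset the timer
    return False

# Simulated gaze samples hovering over a (hypothetical) aircraft label at
# (100..140, 200..220), sampled at 20 Hz for 1.5 seconds.
samples = [GazeSample(110 + i, 210, 0.05 * i) for i in range(30)]
print(dwell_select(samples, (100, 200, 140, 220)))  # prints True after 0.8 s of dwell

As a design note, dwell-based activation trades speed against unintended selections (the well-known "Midas touch" problem), which is one reason gaze is often combined with a second input channel in the literature on gaze interaction.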





Published In

HCI-Aero '16: Proceedings of the International Conference on Human-Computer Interaction in Aerospace
September 2016
157 pages
ISBN: 9781450344067
DOI: 10.1145/2950112

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 14 September 2016


Author Tags

  1. ethnographic studies
  2. multi-modal interaction
  3. new technologies

Qualifiers

  • Research-article

Conference

HCI-Aero '16


Bibliometrics

Article Metrics

  • Downloads (last 12 months): 11
  • Downloads (last 6 weeks): 0

Reflects downloads up to 26 Nov 2024

Cited By

  • (2024) Uncovering and Addressing Blink-Related Challenges in Using Eye Tracking for Interactive Systems. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-23. DOI: 10.1145/3613904.3642086. Online publication date: 11-May-2024.
  • (2023) Highlighting the Challenges of Blinks in Eye Tracking for Interactive Systems. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3588015.3589202. Online publication date: 30-May-2023.
  • (2021) Integrating Eye- and Mouse-Tracking with Assistant Based Speech Recognition for Interaction at Controller Working Positions. Aerospace, 8(9), 245. DOI: 10.3390/aerospace8090245. Online publication date: 3-Sep-2021.
  • (2020) Operational Feasibility Analysis of the Multimodal Controller Working Position “TriControl”. Aerospace, 7(2), 15. DOI: 10.3390/aerospace7020015. Online publication date: 20-Feb-2020.
  • (2020) From Paper Flight Strips to Digital Strip Systems: Changes and Similarities in Air Traffic Control Work Practices. Proceedings of the ACM on Human-Computer Interaction, 4(CSCW1), 1-21. DOI: 10.1145/3392833. Online publication date: 29-May-2020.
  • (2020) Identifying Interesting Moments in Controllers Work Video via Dimensionality Reduction. 2020 International Conference on Artificial Intelligence and Data Analytics for Air Transportation (AIDA-AT), 1-10. DOI: 10.1109/AIDA-AT48540.2020.9049170. Online publication date: Feb-2020.
  • (2018) Faster Command Input Using the Multimodal Controller Working Position “TriControl”. Aerospace, 5(2), 54. DOI: 10.3390/aerospace5020054. Online publication date: 8-May-2018.
