
Contour-guided gaze gestures: using object contours as visual guidance for triggering interactions

Published: 14 June 2018

Abstract

The eyes are an interesting modality for pervasive interaction, though their applicability in mobile scenarios has so far been restricted by several issues. In this paper, we propose the idea of contour-guided gaze gestures, which overcome former constraints, such as the need for calibration, by relying on unnatural, relative eye movements: users trace the contours of objects in order to trigger an interaction. The interaction concept and the system design are described, along with two user studies that demonstrate the method's applicability. The studies show that users were able to trace object contours to trigger actions from various positions and on multiple different objects, and that the proposed method is an easy-to-learn, hands-free interaction technique that is robust against false-positive activations. Results highlight low demand values and show that the method holds potential for further exploration, but they also reveal areas for refinement.
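To make the concept concrete, the following minimal Python sketch illustrates one way such calibration-free contour matching could work. It is an illustrative assumption, not the authors' published algorithm: the function names, the resampling length, the toy circular contour, and the 0.9 trigger threshold are all hypothetical choices. The core idea it demonstrates is to ignore absolute gaze position (which would require calibration) and compare only the shape of the relative gaze trajectory against a candidate object contour, via their sequences of segment directions.

import numpy as np

def resample(points, n=64):
    # Resample a 2-D polyline to n points, evenly spaced by arc length,
    # so gaze traces and contours of different densities become comparable.
    seg = np.diff(points, axis=0)
    dist = np.concatenate([[0.0], np.cumsum(np.linalg.norm(seg, axis=1))])
    t = np.linspace(0.0, dist[-1], n)
    return np.column_stack([np.interp(t, dist, points[:, i]) for i in range(2)])

def direction_signature(points):
    # Sequence of segment orientations: invariant to translation and to
    # uniform scaling, which is what makes the comparison calibration-free.
    d = np.diff(resample(points), axis=0)
    return np.arctan2(d[:, 1], d[:, 0])

def contour_match_score(gaze, contour):
    # Mean angular agreement in [0, 1] between the gaze path and the contour.
    a, b = direction_signature(gaze), direction_signature(contour)
    return float(np.mean(np.cos(a - b)) * 0.5 + 0.5)

# Toy usage: a shifted, scaled, noisy circular gaze trace still matches a
# circular object contour, because only relative movement is compared.
theta = np.linspace(0.0, 2.0 * np.pi, 200)
contour = np.column_stack([np.cos(theta), np.sin(theta)])
gaze = 3.0 * contour + np.array([5.0, -2.0]) + np.random.normal(0.0, 0.05, contour.shape)

if contour_match_score(gaze, contour) > 0.9:  # threshold is an assumption
    print("contour traced -> trigger interaction")

In a real deployment, the candidate contours would come from an edge or contour detector running on a scene-camera image, and the comparison would have to run over a sliding window of gaze samples; both aspects are beyond this sketch.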




Published In

ETRA '18: Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications
June 2018
595 pages
ISBN: 9781450357067
DOI: 10.1145/3204493
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 14 June 2018
DOI: 10.1145/3204493.3204530


Author Tags

  1. eye-tracking
  2. gaze-based interaction
  3. internet of things
  4. pervasive computing
  5. wearable computing

Qualifiers

  • Research-article

Funding Sources

  • Österreichische Forschungsförderungsgesellschaft

Conference

ETRA '18

Acceptance Rates

Overall acceptance rate: 69 of 137 submissions (50%)




Cited By

  • (2024) Guiding gaze gestures on smartwatches. International Journal of Human-Computer Studies 183:C. https://doi.org/10.1016/j.ijhcs.2023.103196 (14-Mar-2024)
  • (2024) Design recommendations of target size and tracking speed under circular and square trajectories for smooth pursuit with Euclidean algorithm in eye-control system. Displays 81, 102608. https://doi.org/10.1016/j.displa.2023.102608 (Jan-2024)
  • (2023) An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices. ACM Computing Surveys. https://doi.org/10.1145/3606947 (30-Jun-2023)
  • (2022) Hybrid Target Selections by "Hand Gestures + Facial Expression" for a Rehabilitation Robot. Sensors 23:1, 237. https://doi.org/10.3390/s23010237 (26-Dec-2022)
  • (2022) Enhancing User Experience of Eye-Controlled Systems: Design Recommendations on the Optimal Size, Distance and Shape of Interactive Components from the Perspective of Peripheral Vision. International Journal of Environmental Research and Public Health 19:17, 10737. https://doi.org/10.3390/ijerph191710737 (29-Aug-2022)
  • (2022) Performance Analysis of Saccades for Primary and Confirmatory Target Selection. Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology, 1-12. https://doi.org/10.1145/3562939.3565619 (29-Nov-2022)
  • (2022) Head and Eye Egocentric Gesture Recognition for Human-Robot Interaction Using Eyewear Cameras. IEEE Robotics and Automation Letters 7:3, 7067-7074. https://doi.org/10.1109/LRA.2022.3180442 (Jul-2022)
  • (2021) Smooth Pursuit Study on an Eye-Control System for Continuous Variable Adjustment Tasks. International Journal of Human-Computer Interaction 39:1, 23-33. https://doi.org/10.1080/10447318.2021.2012979 (15-Dec-2021)
  • (2020) Eye Tracking for Target Acquisition in Sparse Visualizations. ACM Symposium on Eye Tracking Research and Applications, 1-5. https://doi.org/10.1145/3379156.3391834 (2-Jun-2020)
  • (2020) Outline Pursuits: Gaze-assisted Selection of Occluded Objects in Virtual Reality. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1-13. https://doi.org/10.1145/3313831.3376438 (21-Apr-2020)
