DOI: 10.1145/3379503.3403538

Headbang: Using Head Gestures to Trigger Discrete Actions on Mobile Devices

Published: 05 October 2020

Abstract

We present Headbang, an interaction technique that enriches touch input on handheld devices with slight head movement gestures. This way, users can easily apply shortcuts such as Copy, Paste, or Share to on-screen targets while touching them. Headbang uses the front-facing cameras of commodity smartphones to track the user’s head. We evaluated Headbang in two studies and show that the system can be used reliably while sitting and walking, with accuracy and speed similar to touch interaction.
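This page does not include the authors' implementation, but the pipeline the abstract describes (track the head with the front-facing camera, then classify a slight deflection into a discrete gesture while a touch is held) can be illustrated with a hypothetical threshold-based sketch. The `HeadSample` type, the gesture labels, and the 10° / 0.6 s thresholds below are all assumptions for illustration, not values from the paper:

```python
from dataclasses import dataclass

@dataclass
class HeadSample:
    t: float      # timestamp in seconds
    pitch: float  # head pitch in degrees (negative = nodding down); hypothetical convention
    yaw: float    # head yaw in degrees (positive = turning left); hypothetical convention

def detect_head_gesture(samples, threshold_deg=10.0, max_duration=0.6):
    """Classify a slight head movement from pose samples recorded while
    the user holds a touch. Returns 'nod', 'left', 'right', or None if
    no deflection exceeds the threshold within max_duration seconds."""
    if not samples:
        return None
    base = samples[0]  # pose at touch-down serves as the neutral reference
    for s in samples[1:]:
        if s.t - base.t > max_duration:
            break
        if base.pitch - s.pitch > threshold_deg:
            return "nod"
        if s.yaw - base.yaw > threshold_deg:
            return "left"
        if base.yaw - s.yaw > threshold_deg:
            return "right"
    return None
```

On a real device the pitch/yaw stream would come from a face-tracking API such as ARKit's face anchor; here it is plain data so the classification logic can be tested in isolation.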

Supplementary Material

MP4 File (a17-hueber-supplement.mp4)






Published In

MobileHCI '20: 22nd International Conference on Human-Computer Interaction with Mobile Devices and Services
October 2020
418 pages
ISBN:9781450375160
DOI:10.1145/3379503
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].


Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. head gestures
  2. head tracking
  3. mobile devices
  4. pie menus

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

MobileHCI '20

Acceptance Rates

Overall Acceptance Rate 202 of 906 submissions, 22%


Article Metrics

  • Downloads (last 12 months): 41
  • Downloads (last 6 weeks): 3
Reflects downloads up to 14 Feb 2025


Cited By

  • (2024) Towards Personalized Head-Tracking Pointing. Extended Abstracts of the CHI Conference on Human Factors in Computing Systems, 1–7. https://doi.org/10.1145/3613905.3650996. Online publication date: 11 May 2024.
  • (2023) Headar. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(3), 1–28. https://doi.org/10.1145/3610900. Online publication date: 27 September 2023.
  • (2023) TicTacToes: Assessing Toe Movements as an Input Modality. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–17. https://doi.org/10.1145/3544548.3580954. Online publication date: 19 April 2023.
  • (2022) HandyGaze: A Gaze Tracking Technique for Room-Scale Environments using a Single Smartphone. Proceedings of the ACM on Human-Computer Interaction 6(ISS), 143–160. https://doi.org/10.1145/3567715. Online publication date: 14 November 2022.
  • (2022) FaceOri: Tracking Head Position and Orientation Using Ultrasonic Ranging on Earphones. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1–12. https://doi.org/10.1145/3491102.3517698. Online publication date: 29 April 2022.
  • (2022) Understanding Gesture Input Articulation with Upper-Body Wearables for Users with Upper-Body Motor Impairments. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1–16. https://doi.org/10.1145/3491102.3501964. Online publication date: 29 April 2022.
  • (2022) Head and Eye Egocentric Gesture Recognition for Human-Robot Interaction Using Eyewear Cameras. IEEE Robotics and Automation Letters 7(3), 7067–7074. https://doi.org/10.1109/LRA.2022.3180442. Online publication date: July 2022.
  • (2022) GearWheels: A Software Tool to Support User Experiments on Gesture Input with Wearable Devices. International Journal of Human–Computer Interaction, 1–19. https://doi.org/10.1080/10447318.2022.2098907. Online publication date: 22 July 2022.
  • (2021) FacialPen: Using Facial Detection to Augment Pen-Based Interaction. Proceedings of the Asian CHI Symposium 2021, 1–8. https://doi.org/10.1145/3429360.3467672. Online publication date: 8 May 2021.
