DOI: 10.1145/3173574.3173908
CHI Conference Proceedings · Research article

Better Understanding of Foot Gestures: An Elicitation Study

Published: 21 April 2018

Abstract

We present a study aimed at better understanding users' perceptions of foot gestures performed on a horizontal surface. We applied a user-elicitation methodology in which participants were asked to suggest foot gestures for actions (referents) in three conditions: standing in front of a large display, sitting in front of a desktop display, and standing on a projected surface. Based on majority counts and agreement scores, we identified three gesture sets, one for each condition; each gesture set maps a common action to its chosen gesture. As a further contribution, we propose a new measure, the specification score, which indicates the degree to which a gesture is specific, preferable, and intuitive for an action in a particular condition of use. Finally, we present measurable insights that can serve as guidelines for future development and research on foot interaction.
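
The gesture sets above are derived in part from agreement scores, a standard consensus measure in gesture-elicitation studies. As an illustration only (this is not the authors' analysis code, and the gesture labels and referent below are hypothetical), the sketch computes the widely used agreement rate AR(r) of Vatavu and Wobbrock for a single referent: AR(r) = |P|/(|P|-1) * sum_i (|P_i|/|P|)^2 - 1/(|P|-1), where P is the set of proposals elicited for the referent and the P_i are its groups of identical proposals.

    # Minimal sketch (assumed, not the authors' code) of the agreement rate AR(r)
    # used in gesture-elicitation analyses (Vatavu & Wobbrock, CHI 2015):
    #   AR(r) = |P|/(|P|-1) * sum_i (|P_i|/|P|)^2 - 1/(|P|-1)
    from collections import Counter

    def agreement_rate(proposals):
        """proposals: one elicited gesture label per participant for a single referent."""
        n = len(proposals)
        if n < 2:
            return 0.0
        groups = Counter(proposals)  # partition proposals into groups of identical gestures
        sum_sq = sum((size / n) ** 2 for size in groups.values())
        return n / (n - 1) * sum_sq - 1 / (n - 1)

    # Hypothetical example: six participants propose gestures for the referent "scroll down"
    proposals = ["tap", "tap", "swipe forward", "tap", "swipe forward", "heel rotation"]
    print(round(agreement_rate(proposals), 3))  # 0.267

Higher values indicate stronger consensus among participants; referents whose most frequent proposal also wins the majority count are natural candidates for inclusion in the final gesture set.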

Supplementary Material

RAR File (pn3060.rar)
Supplemental video (pn3060-file5.mp4)
Supplemental video (pn3060.mp4)




    Information

    Published In

    CHI '18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems
    April 2018
    8489 pages
    ISBN:9781450356206
    DOI:10.1145/3173574
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 21 April 2018


    Author Tags

    1. elicitation study
    2. foot gestures
    3. foot interaction
    4. user-defined gesture set

    Qualifiers

    • Research-article

    Conference

    CHI '18

    Acceptance Rates

    CHI '18 paper acceptance rate: 666 of 2,590 submissions, 26%
    Overall acceptance rate: 6,199 of 26,314 submissions, 24%

    Bibliometrics & Citations

    Article Metrics

    • Downloads (last 12 months): 117
    • Downloads (last 6 weeks): 4
    Reflects downloads up to 30 Sep 2024

    Cited By

    • (2024) Exploration of Foot-based Text Entry Techniques for Virtual Reality Environments. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3642757. Online publication date: 11-May-2024.
    • (2024) Converting Tatamis into Touch Sensors by Measuring Capacitance. 2024 IEEE/SICE International Symposium on System Integration (SII), 554-558. DOI: 10.1109/SII58957.2024.10417676. Online publication date: 8-Jan-2024.
    • (2024) Exploring Methods to Optimize Gesture Elicitation Studies: A Systematic Literature Review. IEEE Access 12, 64958-64979. DOI: 10.1109/ACCESS.2024.3387269. Online publication date: 2024.
    • (2023) Effects of Footpad Slope, Movement Direction and Contact Part of Foot on Foot-Based Interactions. Applied Sciences 13(11), 6636. DOI: 10.3390/app13116636. Online publication date: 30-May-2023.
    • (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys 56(5), 1-55. DOI: 10.1145/3636458. Online publication date: 7-Dec-2023.
    • (2023) Exploring Locomotion Methods with Upright Redirected Views for VR Users in Reclining & Lying Positions. Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, 1-16. DOI: 10.1145/3586183.3606714. Online publication date: 29-Oct-2023.
    • (2023) DataDancing: An Exploration of the Design Space For Visualisation View Management for 3D Surfaces and Spaces. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3580827. Online publication date: 19-Apr-2023.
    • (2022) Mapping Multimodel Scrolling Techniques for Foot-based Interaction Depending on the Cursor Position. Proceedings of the 2nd International Conference on Computing Advancements, 89-96. DOI: 10.1145/3542954.3543066. Online publication date: 10-Mar-2022.
    • (2022) Shoes++. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6(2), 1-29. DOI: 10.1145/3534620. Online publication date: 7-Jul-2022.
    • (2022) Persephone's Feet: A Foot-Based Approach to Play in Virtual Reality. Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems, 1-6. DOI: 10.1145/3491101.3516489. Online publication date: 27-Apr-2022.
    • Show More Cited By
