DOI: 10.1145/2669485.2669511

User-defined Interface Gestures: Dataset and Analysis

Published: 16 November 2014

Abstract

We present a video-based gesture dataset and a methodology for annotating video-based gesture datasets. Our dataset consists of user-defined gestures generated by 18 participants in a previous investigation of gesture memorability. We design and use a crowd-sourced classification task to annotate the videos, and we make the results available through a web-based visualization that allows researchers and designers to explore the dataset. Finally, we perform a descriptive analysis and a quantitative modeling exercise that provide further insight into the results of the original study. To facilitate the use of the presented methodology by other researchers, we share the data, the source of the human intelligence tasks for crowdsourcing, a new taxonomy that integrates previous work, and the source code of the visualization tool.
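The crowd-sourced annotation step described above — several workers classifying each gesture video, with the responses aggregated into a single label — can be illustrated with a minimal majority-vote sketch. This is not the paper's actual pipeline; the video IDs, label vocabulary, and agreement measure below are hypothetical and stand in for whatever categories a real annotation task would use.

```python
from collections import Counter

def majority_label(labels):
    """Aggregate one video's worker labels: return the most
    frequent label and the fraction of workers who chose it."""
    counts = Counter(labels)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(labels)

# Hypothetical worker annotations for two gesture videos.
annotations = {
    "video_01": ["swipe", "swipe", "flick", "swipe", "swipe"],
    "video_02": ["pinch", "pinch", "pinch", "rotate", "pinch"],
}

for video, labels in annotations.items():
    label, agreement = majority_label(labels)
    print(f"{video}: {label} (agreement {agreement:.0%})")
```

A low agreement ratio flags videos whose gesture class is ambiguous to annotators, which is useful when deciding which items need expert review.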

Supplementary Material

ZIP File (its0192-file4.zip)




    Published In

    ITS '14: Proceedings of the Ninth ACM International Conference on Interactive Tabletops and Surfaces
    November 2014
    524 pages
    ISBN:9781450325875
    DOI:10.1145/2669485
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. gesture analysis methodology
    2. gesture annotation
    3. gesture datasets
    4. gesture design
    5. gesture elicitation
    6. gesture memorability
    7. user-defined gestures

    Qualifiers

    • Research-article

    Conference

    ITS '14: Interactive Tabletops and Surfaces
    November 16--19, 2014, Dresden, Germany

    Acceptance Rates

    ITS '14 paper acceptance rate: 31 of 112 submissions (28%)
    Overall acceptance rate: 119 of 418 submissions (28%)


    Article Metrics

    • Downloads (last 12 months): 40
    • Downloads (last 6 weeks): 6
    Reflects downloads up to 18 Nov 2024


    Cited By

    • (2024) The State of Pilot Study Reporting in Crowdsourcing: A Reflection on Best Practices and Guidelines. Proceedings of the ACM on Human-Computer Interaction 8, CSCW1, 1--45. DOI: 10.1145/3641023. Online publication date: 26-Apr-2024.
    • (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys 56, 5, 1--55. DOI: 10.1145/3636458. Online publication date: 7-Dec-2023.
    • (2022) Theoretically-Defined vs. User-Defined Squeeze Gestures. Proceedings of the ACM on Human-Computer Interaction 6, ISS, 73--102. DOI: 10.1145/3567805. Online publication date: 14-Nov-2022.
    • (2022) QuantumLeap, a Framework for Engineering Gestural User Interfaces based on the Leap Motion Controller. Proceedings of the ACM on Human-Computer Interaction 6, EICS, 1--47. DOI: 10.1145/3532211. Online publication date: 17-Jun-2022.
    • (2022) Iteratively Designing Gesture Vocabularies: A Survey and Analysis of Best Practices in the HCI Literature. ACM Transactions on Computer-Human Interaction 29, 4, 1--54. DOI: 10.1145/3503537. Online publication date: 5-May-2022.
    • (2022) Hand Gesture Recognition for an Off-the-Shelf Radar by Electromagnetic Modeling and Inversion. Proceedings of the 27th International Conference on Intelligent User Interfaces, 506--522. DOI: 10.1145/3490099.3511107. Online publication date: 22-Mar-2022.
    • (2022) Clarifying Agreement Calculations and Analysis for End-User Elicitation Studies. ACM Transactions on Computer-Human Interaction 29, 1, 1--70. DOI: 10.1145/3476101. Online publication date: 7-Jan-2022.
    • (2022) Learning End-User Customized Mid-Air Hand Gestures Using a Depth Image Sensor. IEEE Sensors Journal 22, 17, 16994--17004. DOI: 10.1109/JSEN.2022.3190913. Online publication date: 1-Sep-2022.
    • (2022) Consistent, Continuous, and Customizable Mid-Air Gesture Interaction for Browsing Multimedia Objects on Large Displays. International Journal of Human–Computer Interaction 39, 12, 2492--2523. DOI: 10.1080/10447318.2022.2078464. Online publication date: 27-Jul-2022.
    • (2022) Memory load differentially influences younger and older users' learning curve of touchscreen gestures. Scientific Reports 12, 1. DOI: 10.1038/s41598-022-15092-y. Online publication date: 25-Jun-2022.
