DOI: 10.1145/2984511.2984563

WristWhirl: One-handed Continuous Smartwatch Input using Wrist Gestures

Published: 16 October 2016

Abstract

We propose and study a new input modality, WristWhirl, that uses the wrist as an always-available joystick to perform one-handed continuous input on smartwatches. We explore the influence of the wrist's bio-mechanical properties on performing gestures to interact with a smartwatch, both while standing still and while walking. Through a user study, we examine the impact of performing 8 distinct gestures (4 directional marks and 4 free-form shapes) on the stability of the watch surface. Participants were able to perform directional marks using the wrist as a joystick in roughly half a second on average, and free-form shapes in approximately 1.5 seconds. The free-form shapes could be recognized by a $1 gesture recognizer with an accuracy of 93.8%, and by three human inspectors with an accuracy of 85%. Based on these results, we designed and implemented a proof-of-concept device that augments the watchband with an array of proximity sensors, which can be used to draw high-quality gestures. Finally, we demonstrate a number of scenarios that benefit from one-handed continuous input on smartwatches using WristWhirl.
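
To make the recognition step concrete, the sketch below shows how 2D gesture traces of the kind WristWhirl produces (wrist-joystick samples interpreted as x/y points) could be matched against stroke templates with a simplified $1-style recognizer in the spirit of Wobbrock et al. [31]. This is an illustrative sketch under stated assumptions, not the paper's implementation: the synthetic traces, the resampling size, the reference square, the example templates, and the omission of the full algorithm's golden-section rotation search are all simplifications chosen here for brevity.

```python
# A minimal, illustrative $1-style gesture matcher for 2D wrist-gesture traces.
# Assumptions (not from the paper): synthetic (x, y) traces stand in for the
# joystick positions derived from the watchband's proximity sensors, the
# template set is made up for the example, and the golden-section rotation
# search of the full $1 recognizer is omitted for brevity.
import math

N_RESAMPLE = 64      # number of equidistant points per gesture, as in $1
SQUARE_SIZE = 250.0  # reference square used for scale normalization


def path_length(points):
    """Total arc length of the stroke."""
    return sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))


def resample(points, n=N_RESAMPLE):
    """Resample the stroke into n points spaced evenly along its path."""
    interval = path_length(points) / (n - 1)
    pts = list(points)
    new_points = [pts[0]]
    dist_acc = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and dist_acc + d >= interval:
            t = (interval - dist_acc) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            new_points.append(q)
            pts.insert(i, q)          # the interpolated point starts the next segment
            dist_acc = 0.0
        else:
            dist_acc += d
        i += 1
    while len(new_points) < n:        # guard against floating-point shortfall
        new_points.append(pts[-1])
    return new_points[:n]


def centroid(points):
    return (sum(x for x, _ in points) / len(points),
            sum(y for _, y in points) / len(points))


def normalize(points):
    """Resample, rotate the indicative angle to zero, scale to a reference
    square, and translate the centroid to the origin."""
    pts = resample(points)
    cx, cy = centroid(pts)
    theta = math.atan2(pts[0][1] - cy, pts[0][0] - cx)
    cos_t, sin_t = math.cos(-theta), math.sin(-theta)
    rotated = [((x - cx) * cos_t - (y - cy) * sin_t,
                (x - cx) * sin_t + (y - cy) * cos_t) for x, y in pts]
    xs = [x for x, _ in rotated]
    ys = [y for _, y in rotated]
    w = (max(xs) - min(xs)) or 1.0    # avoid division by zero for flat strokes
    h = (max(ys) - min(ys)) or 1.0
    scaled = [(x * SQUARE_SIZE / w, y * SQUARE_SIZE / h) for x, y in rotated]
    cx, cy = centroid(scaled)
    return [(x - cx, y - cy) for x, y in scaled]


def recognize(candidate, templates):
    """Return the template whose points lie closest, on average, to the candidate."""
    cand = normalize(candidate)
    best_name, best_dist = None, float("inf")
    for name, template_points in templates.items():
        tmpl = normalize(template_points)
        d = sum(math.dist(a, b) for a, b in zip(cand, tmpl)) / N_RESAMPLE
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name, best_dist


# Hypothetical templates: a circle (free-form shape) and a rightward mark.
templates = {
    "circle": [(math.cos(2 * math.pi * k / 32), math.sin(2 * math.pi * k / 32))
               for k in range(33)],
    "line": [(k / 32.0, 0.0) for k in range(33)],
}

# A noisy, slightly elongated circle standing in for a wrist-drawn trace.
trace = [(math.cos(2 * math.pi * k / 40) + 0.05, 1.1 * math.sin(2 * math.pi * k / 40))
         for k in range(41)]

print(recognize(trace, templates))    # expected: ('circle', <small distance>)
```

Even this reduced matcher separates well-distinguished shapes such as a circle and a straight mark; the complete $1 recognizer additionally searches over rotations, which makes matching more robust to the orientation at which a gesture is drawn.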

Supplementary Material

Supplemental video: suppl.mov (uist3668-file3.mp4)
MP4 File: p861-gong.mp4

References

[1]
Aria Wearable, https://www.ariawearable.com/
[2]
Myo Gesture Control Armband, https://www.myo.com/
[3]
VICON Motion Tracking System, http://www.vicon.com/
[4]
Amento, B., Hill, W., and Terveen, L. 2002. The sound of one hand: a wrist-mounted bio-acoustic fingertip gesture interface. In CHI '02 Extended Abstracts on Human Factors in Computing Systems (CHI EA '02). ACM, New York, NY, USA, 724--725. DOI=http://dx.doi.org/10.1145/506443.506566
[5]
Borg, G. 1970. Perceived exertion as an indicator of somatic stress. Scand J Rehabil Med 2(3), 92--98.
[6]
Bragdon, A., Nelson, E., Li, Y., and Hinckley, K. 2011. Experimental analysis of touch-screen gesture designs in mobile environments. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '11). ACM, New York, NY, USA, 403--412. DOI=http://dx.doi.org/10.1145/1978942.1979000
[7]
Chan, L., Chen, Y., Hsieh, C., Liang, R., and Chen, B. 2015. CyclopsRing: Enabling Whole-Hand and Context-Aware Interactions Through a Fisheye Ring. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST '15). ACM, New York, NY, USA, 549--556. DOI=http://dx.doi.org/10.1145/2807442.2807450
[8]
Crossan, A., and Murray-Smith, R. 2004. Variability in Wrist-Tilt Accelerometer Based Gesture Interfaces. In Proceedings of the 6th international conference on Human computer interaction with mobile devices and services (MobileHCI '04). ACM, Glasgow, Scotland, 144--155.
[9]
Crossan, A., Williamson, J., Brewster, S., and Murray-Smith, R. 2008. Wrist rotation for interaction in mobile contexts. In Proceedings of the 10th international conference on Human computer interaction with mobile devices and services (MobileHCI '08). ACM, New York, NY, USA, 435--438. DOI=http://dx.doi.org/10.1145/1409240.1409307
[10]
Dementyev, A., and Paradiso, J.A. 2014. WristFlex: low-power gesture input with wrist-worn pressure sensors. In Proceedings of the 27th annual ACM symposium on User interface software and technology (UIST '14). ACM, New York, NY, USA, 161--166. DOI=http://dx.doi.org/10.1145/2642918.2647396
[11]
Fukui, R., Watanabe, M., Gyota, T., Shimosaka, M., and Sato, T. 2011. Hand shape classification with a wrist contour sensor: development of a prototype device. In Proceedings of the 13th international conference on Ubiquitous computing (UbiComp '11). ACM, New York, NY, USA, 311--314. DOI=http://dx.doi.org/10.1145/2030112.2030154
[12]
Grandjean, E. 1969. Fitting the task to the man: an ergonomic approach. Taylor and Francis, 372.
[13]
Harrison, C., Tan, D., and Morris, D. 2010. Skinput: appropriating the body as an input surface. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '10). ACM, New York, NY, USA, 453--462. DOI=http://dx.doi.org/10.1145/1753326.1753394
[14]
Hinckley, K., Pierce, J., Sinclair, M., and Horvitz, E. 2000. Sensing techniques for mobile interaction. In Proceedings of the 13th annual ACM symposium on User interface software and technology (UIST '00). ACM, New York, NY, USA, 91--100. DOI=http://dx.doi.org/10.1145/354401.354417
[15]
Ion, A., Wang, E.J., and Baudisch, P. 2015. Skin Drag Displays: Dragging a Physical Tactor across the User's Skin Produces a Stronger Tactile Stimulus than Vibrotactile. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15). ACM, New York, NY, USA, 2501--2504. DOI=http://dx.doi.org/10.1145/2702123.2702459
[16]
Kerber, F., Krüger, A., and Löchtefeld, M. 2014. Investigating the effectiveness of peephole interaction for smartwatches in a map navigation task. In Proceedings of the 16th international conference on Human-computer interaction with mobile devices & services (MobileHCI '14). ACM, New York, NY, USA, 291--294. DOI=http://dx.doi.org/10.1145/2628363.2628393
[17]
Kerber, F., Lessel, P., and Krüger, A. 2015. Same-side Hand Interactions with Arm-placed Devices Using EMG. In Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15). ACM, New York, NY, USA, 1367--1372. DOI=http://dx.doi.org/10.1145/2702613.2732895
[18]
Kim, D., Hilliges, O., Izadi, S., Butler, A.D., Chen, J., Oikonomidis, I., and Olivier, P. 2012. Digits: freehand 3D interactions anywhere using a wrist-worn gloveless sensor. In Proceedings of the 25th annual ACM symposium on User interface software and technology (UIST '12). ACM, New York, NY, USA, 167--176. DOI=http://dx.doi.org/10.1145/2380116.2380139
[19]
Li, Y. 2010. Gesture search: a tool for fast mobile data access. In Proceedings of the 23rd annual ACM symposium on User interface software and technology (UIST '10). ACM, New York, NY, USA, 87--96. DOI=http://dx.doi.org/10.1145/1866029.1866044
[20]
Lien, J., Gillian, N., Karagozler, M.E., Amihood, P., Schwesig, C., Olson, E., Raja, H., and Poupyrev, I. 2016. Soli: Ubiquitous Gesture Sensing with Millimeter Wave Radar. In Proceedings of the 43rd ACM Special Interest Group on Computer Graphics and Interactive Techniques (SIGGRAPH'16). ACM, Anaheim, CA, USA. DOI=http://dx.doi.org/10.1145/2897824.2925953
[21]
Loclair, C., Gustafson, S., and Baudisch, P. 2010. PinchWatch: a wearable device for one-handed microinteractions. In Proc. MobileHCI Workshop on Ensembles of On-Body Devices. Lisbon, Portugal, 4 pages.
[22]
Ortega-Avila, S., Rakova, B., Sadi, S., and Mistry, P. 2015. Non-invasive optical detection of hand gestures. In Proceedings of the 6th Augmented Human International Conference (AH '15). ACM, New York, NY, USA, 179--180. DOI=http://dx.doi.org/10.1145/2735711.2735801
[23]
Rahman, M., Gustafson, S., Irani, P., and Subramanian, S. 2009. Tilt techniques: investigating the dexterity of wrist-based input. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '09). ACM, New York, NY, USA, 1943--1952. DOI=http://dx.doi.org/10.1145/1518701.1518997
[24]
Rekimoto, J. 2001. GestureWrist and GesturePad: Unobtrusive Wearable Interaction Devices. In Proceedings of the 5th IEEE International Symposium on Wearable Computers (ISWC '01). IEEE Computer Society, Washington, DC, USA, 21--.
[25]
Ruiz, J., and Li, Y. 2011. DoubleFlip: a motion gesture delimiter for mobile interaction. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '11). ACM, New York, NY, USA, 2717--2720. DOI=http://dx.doi.org/10.1145/1978942.1979341
[26]
Saponas, T.S., Tan, D.S., Morris, D., Balakrishnan, R., Turner, J., and Landay, J.A. 2009. Enabling always-available input with muscle-computer interfaces. In Proceedings of the 22nd annual ACM symposium on User interface software and technology (UIST '09). ACM, New York, NY, USA, 167--176. DOI=http://dx.doi.org/10.1145/1622176.1622208
[27]
Strohmeier, P., Vertegaal, R., and Girouard, A. 2012. With a flick of the wrist: stretch sensors as lightweight input for mobile devices. In Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction (TEI '12), Stephen N. Spencer (Ed.). ACM, New York, NY, USA, 307--308. DOI=http://dx.doi.org/10.1145/2148131.2148195
[28]
Voyles, R., Bae, J., and Godzdanker, R. 2008. The gestural joystick and the efficacy of the path tortuosity metric for human/robot interaction. In Proceedings of the 8th Workshop on Performance Metrics for Intelligent Systems (PerMIS '08). ACM, New York, NY, USA, 91--97. DOI=http://dx.doi.org/10.1145/1774674.1774689
[29]
Way, D. and Paradiso, J. 2014. A Usability User Study Concerning Free-Hand Microgesture and Wrist-Worn Sensors. In Proceedings of the 2014 11th International Conference on Wearable and Implantable Body Sensor Networks (BSN '14). IEEE Computer Society, Washington, DC, USA, 138--142. DOI=http://dx.doi.org/10.1109/BSN.2014.32
[30]
Weberg, L., Brange, T. and Hansson, Å.W. 2001. A piece of butter on the PDA display. In CHI '01 Extended Abstracts on Human Factors in Computing Systems (CHI EA '01). ACM, New York, NY, USA, 435--436. DOI=http://dx.doi.org/10.1145/634067.634320
[31]
Wobbrock, J.O., Wilson, A.D., and Li, Y. 2007. Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes. In Proceedings of the 20th annual ACM symposium on User interface software and technology (UIST '07). ACM, New York, NY, USA, 159--168. DOI=http://dx.doi.org/10.1145/1294211.1294238
[32]
Wobbrock, J.O., Morris, M.R., and Wilson, A.D. 2009. User-defined gestures for surface computing. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '09). ACM, New York, NY, USA, 1083--1092. DOI=http://dx.doi.org/10.1145/1518701.1518866
[33]
Wolf, K., Naumann, A., Rohs, M., and Müller, J. 2011. Taxonomy of microinteractions: defining microgestures based on ergonomic and scenario-dependent requirements. In Proceedings of the 13th IFIP TC 13 international conference on Human-computer interaction (INTERACT '11). Lisbon, Portugal, 559--575.
[34]
Xiao, R., Laput, G., and Harrison, C. 2014. Expanding the input expressivity of smartwatches with mechanical pan, twist, tilt and click. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '14). ACM, New York, NY, USA, 193--196. DOI=http://dx.doi.org/10.1145/2556288.2557017
[35]
Yee, K. 2003. Peephole displays: pen interaction on spatially aware handheld computers. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '03). ACM, New York, NY, USA, 1--8. DOI=http://dx.doi.org/10.1145/642611.642613
[36]
Zhang, Y., and Harrison, C. 2015. Tomo: Wearable, Low-Cost Electrical Impedance Tomography for Hand Gesture Recognition. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST '15). ACM, New York, NY, USA, 167--173. DOI=http://dx.doi.org/10.1145/2807442.2807480




    Published In

    UIST '16: Proceedings of the 29th Annual Symposium on User Interface Software and Technology
    October 2016
    908 pages
    ISBN:9781450341899
    DOI:10.1145/2984511
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 16 October 2016


    Author Tags

    1. continuous input
    2. gestural input
    3. one-handed interaction
    4. smartwatch
    5. smartwatch input

    Qualifiers

    • Research-article

    Conference

    UIST '16

    Acceptance Rates

    UIST '16 Paper Acceptance Rate 79 of 384 submissions, 21%;
    Overall Acceptance Rate 561 of 2,567 submissions, 22%

    Article Metrics

    • Downloads (Last 12 months): 152
    • Downloads (Last 6 weeks): 15
    Reflects downloads up to 30 Nov 2024


    Cited By

    • (2024) Thumb-to-Finger Gesture Recognition Using COTS Smartwatch Accelerometers. Proceedings of the International Conference on Mobile and Ubiquitous Multimedia, 184-195. DOI: 10.1145/3701571.3701600. Online publication date: 1-Dec-2024.
    • (2024) GestureGPT: Toward Zero-Shot Free-Form Hand Gesture Understanding with Large Language Model Agents. Proceedings of the ACM on Human-Computer Interaction, 8(ISS), 462-499. DOI: 10.1145/3698145. Online publication date: 24-Oct-2024.
    • (2024) Exploring the Affordances of Bio-Electronic Nails. Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing, 360-365. DOI: 10.1145/3675094.3681952. Online publication date: 5-Oct-2024.
    • (2024) Vision-Based Hand Gesture Customization from a Single Demonstration. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-14. DOI: 10.1145/3654777.3676378. Online publication date: 13-Oct-2024.
    • (2024) EchoWrist: Continuous Hand Pose Tracking and Hand-Object Interaction Recognition Using Low-Power Active Acoustic Sensing On a Wristband. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-21. DOI: 10.1145/3613904.3642910. Online publication date: 11-May-2024.
    • (2024) Hand Gesture Recognition for Blind Users by Tracking 3D Gesture Trajectory. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3613904.3642602. Online publication date: 11-May-2024.
    • (2024) A Meta-Bayesian Approach for Rapid Online Parametric Optimization for Wrist-based Interactions. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-38. DOI: 10.1145/3613904.3642071. Online publication date: 11-May-2024.
    • (2024) Discrete Gesture Recognition Using Multimodal PPG, IMU, and Single-Channel EMG Recorded at the Wrist. IEEE Sensors Letters, 8(9), 1-4. DOI: 10.1109/LSENS.2024.3447240. Online publication date: Sep-2024.
    • (2024) A Character Input Method for Smart Glasses that Allows You to Enter One Character in One Step with One Thumb. Human-Computer Interaction, 203-214. DOI: 10.1007/978-3-031-60449-2_14. Online publication date: 1-Jun-2024.
    • (2023) Headar. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 7(3), 1-28. DOI: 10.1145/3610900. Online publication date: 27-Sep-2023.
