
DOI: 10.1145/2470654.2481352

Gesture output: eyes-free output using a force feedback touch surface

Published: 27 April 2013

Abstract

We propose using spatial gestures not only for input but also for output. Analogous to gesture input, the proposed gesture output moves the user's finger in a gesture, which the user then recognizes. We use our concept in a mobile scenario where a motion path forming a "5" informs users about new emails, or a heart-shaped path serves as a message from a friend. We built two prototypes: (1) the longRangeOuija is a stationary prototype that offers a motion range of up to 4cm; (2) the pocketOuija is a self-contained mobile device based on an iPhone with up to 1cm motion range. Both devices actuate the user's fingers by means of an actuated transparent foil overlaid onto a touchscreen. We conducted three studies with the longRangeOuija in which participants recognized 2cm marks with 97% accuracy, Graffiti digits with 98.8%, pairs of Graffiti digits with 90.5%, and Graffiti letters with 93.4%. Participants previously unfamiliar with Graffiti identified 96.2% of digits and 76.4% of letters, suggesting that properly designed gesture output is guessable. After the experiment, the same participants were able to enter 100% of Graffiti digits by heart and 92.2% of letters. This suggests that participants learned gesture input as a side effect of using gesture output on our prototypes.
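The abstract describes replaying a glyph-shaped motion path under the user's finger at a controlled speed. As a rough illustration of the kind of control-loop preprocessing such an actuated overlay might need (a hypothetical sketch, not the authors' implementation; all names and parameter values are invented), the following resamples a gesture stroke into equally spaced waypoints, one per actuator update tick:

```python
import math

def resample_path(points, speed_mm_s=25.0, rate_hz=100.0):
    """Resample a polyline into equally spaced waypoints.

    points: list of (x, y) in millimetres describing the gesture stroke,
            e.g. the strokes of a "5" or a heart shape.
    Returns one waypoint per control tick; an actuator stepping through
    them moves the finger along the stroke at roughly constant speed.
    """
    step = speed_mm_s / rate_hz  # distance travelled per tick, in mm
    waypoints = [points[0]]
    carry = 0.0  # distance already covered toward the next waypoint
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        d = step - carry
        while d <= seg:
            t = d / seg  # linear interpolation along this segment
            waypoints.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            d += step
        carry = seg - (d - step)
    return waypoints

# A 10 mm horizontal stroke at 25 mm/s with a 100 Hz update rate
# yields 41 waypoints spaced 0.25 mm apart.
waypoints = resample_path([(0.0, 0.0), (10.0, 0.0)])
```

Constant spacing per tick is what keeps the replayed gesture's speed uniform regardless of how unevenly the input polyline's vertices are distributed.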

Supplementary Material

Supplemental video: suppl.mov (chi0102-file3.mp4)





    Published In

CHI '13: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2013
3550 pages
ISBN: 978-1-4503-1899-0
DOI: 10.1145/2470654
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. eyes free
    2. force feedback
    3. gestures
    4. touch

    Qualifiers

    • Research-article

    Conference

    CHI '13

    Acceptance Rates

CHI '13: 392 of 1,963 submissions accepted (20%)
    Overall: 6,199 of 26,314 submissions accepted (24%)


    Article Metrics

• Downloads (last 12 months): 42
    • Downloads (last 6 weeks): 3
    Reflects downloads up to 28 Sep 2024


    Cited By

• (2024) Enhancing Touch Circular Knob with Haptic Feedback when Performing Another Saturating Attention Primary Task. Proceedings of the 2024 International Conference on Advanced Visual Interfaces, 1-9. DOI: 10.1145/3656650.3656656. Online publication date: 3-Jun-2024.
    • (2023) FeetThrough: Electrotactile Foot Interface that Preserves Real-World Sensations. Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, 1-11. DOI: 10.1145/3586183.3606808. Online publication date: 29-Oct-2023.
    • (2023) iFAD Gestures: Understanding Users' Gesture Input Performance with Index-Finger Augmentation Devices. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3580928. Online publication date: 19-Apr-2023.
    • (2023) LipIO: Enabling Lips as both Input and Output Surface. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-14. DOI: 10.1145/3544548.3580775. Online publication date: 19-Apr-2023.
    • (2023) Hap2Gest: An Eyes-Free Interaction Concept with Smartphones Using Gestures and Haptic Feedback. Human-Computer Interaction – INTERACT 2023, 479-500. DOI: 10.1007/978-3-031-42280-5_31. Online publication date: 25-Aug-2023.
    • (2023) Effects of Moving Speed and Phone Location on Eyes-Free Gesture Input with Mobile Devices. Human-Computer Interaction – INTERACT 2023, 469-478. DOI: 10.1007/978-3-031-42280-5_30. Online publication date: 25-Aug-2023.
    • (2022) SilentSpeller: Towards mobile, hands-free, silent speech text entry using electropalatography. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-19. DOI: 10.1145/3491102.3502015. Online publication date: 29-Apr-2022.
    • (2021) Multi-channel Tactile Feedback Based on User Finger Speed. Proceedings of the ACM on Human-Computer Interaction 5(ISS), 1-17. DOI: 10.1145/3488549. Online publication date: 5-Nov-2021.
    • (2021) A Survey on Haptic Technologies for Mobile Augmented Reality. ACM Computing Surveys 54(9), 1-35. DOI: 10.1145/3465396. Online publication date: 8-Oct-2021.
    • (2021) Effect of Attention Saturating and Cognitive Load on Tactile Texture Recognition for Mobile Surface. Human-Computer Interaction – INTERACT 2021, 557-579. DOI: 10.1007/978-3-030-85610-6_31. Online publication date: 26-Aug-2021.
