Hap2Gest: An Eyes-Free Interaction Concept with Smartphones Using Gestures and Haptic Feedback

  • Conference paper
Human-Computer Interaction – INTERACT 2023

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14142)


Abstract

Smartphones are used in many contexts, including scenarios where the visual and auditory modalities are limited (e.g., while walking or driving). For such situations, we introduce a new interaction concept, called Hap2Gest, that lets users both issue commands and retrieve information eyes-free. It first uses an input gesture to invoke a command; the requested information is then conveyed through haptic feedback perceived while the user draws an output gesture. We conducted an elicitation study with 12 participants to determine users’ preferences for these input and output gestures and the associated vibration patterns across 25 referents. Our findings indicate that users tend to use the same gesture for input and output, and that there is a clear relationship between the type of gestures and vibration patterns users suggest and the type of output information. We show that the agreement rate for the gesture’s speed profile is significantly higher than that for the gesture’s shape, so the speed profile can be used by the recognizer when the shape agreement rate is low. Finally, we present a complete set of user-defined gestures and vibration patterns and address the gesture recognition problem.
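
The abstract compares agreement rates for gesture shape and speed profile. As background, below is a minimal sketch (in Python, not taken from the paper) of the agreement-rate measure commonly used in gesture elicitation studies: for one referent, it is the probability that two distinct participants proposed the same sign. The participant count and gesture labels in the example are hypothetical.

    from collections import Counter

    def agreement_rate(proposals):
        # Agreement rate for one referent: the probability that two distinct
        # participants proposed the same sign (e.g., gesture shape, speed
        # profile, or vibration pattern), given one label per participant.
        n = len(proposals)
        if n < 2:
            return 0.0
        groups = Counter(proposals)
        return sum(k * (k - 1) for k in groups.values()) / (n * (n - 1))

    # Hypothetical example: 12 participants propose an output gesture shape
    # for a single referent.
    shapes = ["circle"] * 5 + ["line"] * 4 + ["zigzag"] * 3
    print(round(agreement_rate(shapes), 2))  # 0.29

Under this measure, proposals that cluster into a few large groups (as the paper reports for speed profiles) yield a higher agreement rate than proposals scattered across many distinct shapes.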



Acknowledgements

This project has received funding from the European Union’s Horizon 2020 research and innovation program under the Marie Skłodowska-Curie grant agreement No. 860114.

Author information


Corresponding author

Correspondence to Milad Jamalzadeh.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Jamalzadeh, M., Rekik, Y., Dancu, A., Grisoni, L. (2023). Hap2Gest: An Eyes-Free Interaction Concept with Smartphones Using Gestures and Haptic Feedback. In: Abdelnour Nocera, J., Kristín Lárusdóttir, M., Petrie, H., Piccinno, A., Winckler, M. (eds) Human-Computer Interaction – INTERACT 2023. INTERACT 2023. Lecture Notes in Computer Science, vol 14142. Springer, Cham. https://doi.org/10.1007/978-3-031-42280-5_31


  • DOI: https://doi.org/10.1007/978-3-031-42280-5_31

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-42279-9

  • Online ISBN: 978-3-031-42280-5

  • eBook Packages: Computer Science, Computer Science (R0)
