DOI: 10.1145/3010915.3010955
short-paper

Designing a user-defined gesture vocabulary for an in-vehicle climate control system

Published: 29 November 2016

Abstract

Hand gestures are a suitable input modality for in-vehicle interfaces: they are intuitive and natural to perform, and less visually demanding while driving. This paper analyses human gestures to define a preliminary gesture vocabulary for in-vehicle climate control, using a driving simulator. We conducted a user-elicitation experiment with 22 participants across two driving scenarios imposing different levels of cognitive load; the participants were filmed while performing natural gestures for manipulating the air conditioning inside the vehicle. We compare the proposed approach, which defines a vocabulary of 9 new gestures (GestDrive), with previously suggested methods. The outcomes demonstrate that GestDrive describes the employed gestures in detail.
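As context for the comparison mentioned above: user-elicitation studies commonly quantify how strongly participants converge on the same gesture for a given command (referent) with an agreement score in the style of Wobbrock et al. (2009). The paper itself does not include code, so the sketch below is only an illustrative Python implementation under that assumption; the referent names, gesture labels, and function names are hypothetical and not taken from the study.

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for a single referent, following Wobbrock et al. (2009).

    `proposals` is a list of gesture labels elicited from participants for one
    referent; identical labels are assumed to denote identical gestures after
    manual grouping of the video data.
    """
    total = len(proposals)
    groups = Counter(proposals)  # size of each group of identical gestures
    return sum((size / total) ** 2 for size in groups.values())

def vocabulary_agreement(elicited):
    """Mean agreement across all referents in an elicitation study."""
    return sum(agreement_score(p) for p in elicited.values()) / len(elicited)

# Hypothetical climate-control referents and elicited gesture labels (illustrative only).
elicited = {
    "temperature up":   ["swipe up", "swipe up", "rotate clockwise", "swipe up"],
    "temperature down": ["swipe down", "swipe down", "swipe down", "rotate counter-clockwise"],
    "fan on":           ["wave", "point", "wave", "open palm"],
}

for referent, gestures in elicited.items():
    print(f"{referent}: A = {agreement_score(gestures):.2f}")
print(f"overall agreement: {vocabulary_agreement(elicited):.2f}")
```

In practice the raw video must first be segmented and visually identical gestures grouped by coders before any such score is computed; that grouping step is the part this sketch glosses over.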



    Published In

    OzCHI '16: Proceedings of the 28th Australian Conference on Computer-Human Interaction
    November 2016
    706 pages
    ISBN:9781450346184
    DOI:10.1145/3010915
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Sponsors

    • IEEE-SMCS: Systems, Man & Cybernetics Society
    • Australian Comp Soc: Australian Computer Society
    • Data61: Data61, CSIRO
    • ICACHI: International Chinese Association of Computer Human Interaction
    • Infoxchange: Infoxchange
    • HITLab AU: Human Interface Technology Laboratory Australia
    • James Boag: James Boag
    • Tourism Tasmania: Tourism Tasmania
    • HFESA: Human Factors and Ergonomics Society of Australia Inc.
    • IEEEVIC: IEEE Victorian Section
    • UTAS: University of Tasmania, Australia

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 29 November 2016


    Author Tags

    1. driving simulator
    2. gestural interface
    3. gesture recognition
    4. in-vehicle interface
    5. user-elicitation

    Qualifiers

    • Short-paper

    Conference

    OzCHI '16: The 28th Australian Conference on Human-Computer Interaction
    November 29 - December 2, 2016
Launceston, Tasmania, Australia

    Acceptance Rates

Overall Acceptance Rate: 362 of 729 submissions, 50%

    Article Metrics

    • Downloads (last 12 months): 21
    • Downloads (last 6 weeks): 4
    Reflects downloads up to 24 Nov 2024


    Cited By

    • (2024) Looking for a better fit? An Incremental Learning Multimodal Object Referencing Framework adapting to Individual Drivers. Proceedings of the 29th International Conference on Intelligent User Interfaces, pp. 1-13. DOI: 10.1145/3640543.3645152. Online publication date: 18-Mar-2024.
    • (2024) Do users desire gestures for in-vehicle interaction? Towards the subjective assessment of gestures in a high-fidelity driving simulator. Computers in Human Behavior, 156:C. DOI: 10.1016/j.chb.2024.108189. Online publication date: 9-Jul-2024.
    • (2023) Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys, 56:5, pp. 1-55. DOI: 10.1145/3636458. Online publication date: 7-Dec-2023.
    • (2023) It’s all about you: Personalized in-Vehicle Gesture Recognition with a Time-of-Flight Camera. Proceedings of the 15th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, pp. 234-243. DOI: 10.1145/3580585.3607153. Online publication date: 18-Sep-2023.
    • (2023) In-vehicle Performance and Distraction for Midair and Touch Directional Gestures. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, pp. 1-13. DOI: 10.1145/3544548.3581335. Online publication date: 19-Apr-2023.
    • (2023) Multimodal Interaction in Virtual Reality: Assessing User Experience of Gesture- and Gaze-Based Interaction. HCI International 2023 Posters, pp. 578-585. DOI: 10.1007/978-3-031-35989-7_73. Online publication date: 9-Jul-2023.
    • (2023) Cueing Car Drivers with Ultrasound Skin Stimulation. HCI in Mobility, Transport, and Automotive Systems, pp. 224-244. DOI: 10.1007/978-3-031-35908-8_16. Online publication date: 23-Jul-2023.
    • (2022) Gesture and Voice Commands to Interact With AR Windshield Display in Automated Vehicle: A Remote Elicitation Study. Proceedings of the 14th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, pp. 171-182. DOI: 10.1145/3543174.3545257. Online publication date: 17-Sep-2022.
    • (2022) Shoes++. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 6:2, pp. 1-29. DOI: 10.1145/3534620. Online publication date: 7-Jul-2022.
    • (2022) Method of Developing Hand Gesture Sets for Vehicle Entertainment and Ambiance Controls. 2022 Joint 12th International Conference on Soft Computing and Intelligent Systems and 23rd International Symposium on Advanced Intelligent Systems (SCIS&ISIS), pp. 1-6. DOI: 10.1109/SCISISIS55246.2022.10002029. Online publication date: 29-Nov-2022.
