DOI: 10.1145/2390256.2390282
research-article

Multimodal interaction in the car: combining speech and gestures on the steering wheel

Published: 17 October 2012

Abstract

Implementing controls in the car is becoming a major challenge: the use of simple physical buttons does not scale to the growing number of assistive, comfort, and infotainment functions. Current solutions include hierarchical menus and multi-functional control devices, which increase complexity and visual demand. Another option is speech control, which is not widely accepted, as it does not support visibility of actions, fine-grained feedback, or easy undo of actions. Our approach combines speech and gestures. By using speech to identify functions, we exploit the visibility of objects in the car (e.g., the mirror) and provide simple access to a wide range of functions, equivalent to a very broad menu. By using gestures for manipulation (e.g., left/right), we provide fine-grained control with immediate feedback and easy undo of actions. In a user-centered process, we determined a set of user-defined gestures as well as common voice commands. For a prototype, we linked these to a car interior and a driving simulator. In a study with 16 participants, we explored the impact of this form of multimodal interaction on driving performance against a baseline using physical buttons. The results indicate that using speech and gestures is slower than using buttons but yields similar driving performance. In a DALI questionnaire, users reported lower visual demand when using speech and gestures.
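The interaction pattern the abstract describes can be sketched in a few lines: speech names the target function (acting as a broad, flat menu), and a steering-wheel gesture then manipulates it with immediate feedback and an easy inverse-gesture undo. The following is an illustrative sketch only, not the authors' implementation; the function registry and gesture names are hypothetical.

```python
# Hypothetical registry: spoken object name -> current value and step size.
FUNCTIONS = {
    "mirror": {"value": 0, "step": 1},   # e.g., mirror angle
    "volume": {"value": 5, "step": 1},   # e.g., radio volume
}

def on_speech(command):
    """Speech identifies WHICH function to manipulate (a very broad 'menu')."""
    return command if command in FUNCTIONS else None

def on_gesture(target, gesture):
    """A gesture gives fine-grained manipulation; the inverse gesture undoes it."""
    if target is None:
        return None
    f = FUNCTIONS[target]
    if gesture == "right":
        f["value"] += f["step"]
    elif gesture == "left":
        f["value"] -= f["step"]
    return f["value"]

# Example dialogue: "volume" (speech), then gesture right, then undo via left.
target = on_speech("volume")
on_gesture(target, "right")   # 5 -> 6
on_gesture(target, "left")    # undo: 6 -> 5
```

The split mirrors the paper's rationale: speech avoids deep menu navigation, while gestures supply the fine-grained, reversible adjustments that speech alone handles poorly.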


Cited By

  • (2024) Intelligent Cockpits for Connected Vehicles: Taxonomy, Architecture, Interaction Technologies, and Future Directions. Sensors 24(16):5172. DOI: 10.3390/s24165172. 10 Aug 2024.
  • (2024) Towards a Conceptual Model of Users' Expectations of an Autonomous In-Vehicle Multimodal Experience. Human Behavior and Emerging Technologies 2024:1-14. DOI: 10.1155/2024/7418597. 14 Mar 2024.
  • (2024) Interaction with a 3D Surface for an Innovative Input Experience on a Central Console. Proceedings of the 16th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 136-148. DOI: 10.1145/3640792.3675707. 22 Sep 2024.
  • (2024) Steering Towards Safety: Evaluating Signaling Gestures for an Embodied Driver Guide. Proceedings of the 16th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 363-373. DOI: 10.1145/3640792.3675703. 22 Sep 2024.
  • (2024) What Challenges Does the Full-Touch HMI Mode Bring to Driver's Lateral Control Ability? A Comparative Study Based on Real Roads. IEEE Transactions on Human-Machine Systems 54(1):21-33. DOI: 10.1109/THMS.2023.3342045. Feb 2024.
  • (2024) Preference in Voice Commands and Gesture Controls With Hands-Free Augmented Reality With Novel Users. IEEE Pervasive Computing 23(1):18-26. DOI: 10.1109/MPRV.2024.3364541. Jan 2024.
  • (2024) Aircraft human-machine interaction assistant design: A novel multimodal data processing and application framework. IET Control Theory & Applications. DOI: 10.1049/cth2.12754. 28 Oct 2024.
  • (2024) Smart solutions in car dashboard interfaces as a response to needs of drivers and their assessment. Case Studies on Transport Policy 16:101194. DOI: 10.1016/j.cstp.2024.101194. Jun 2024.
  • (2024) Research on the Correlation of User Evaluation on the Interaction Features Between the Central-Control-Screen and Voice in the Automobile Cockpit. HCI International 2024 Posters, 322-328. DOI: 10.1007/978-3-031-61963-2_32. 8 Jun 2024.
  • (2023) A Qualitative Study on the Expectations and Concerns Around Voice and Gesture Interactions in Vehicles. Proceedings of the 2023 ACM Designing Interactive Systems Conference, 2155-2171. DOI: 10.1145/3563657.3596040. 10 Jul 2023.


    Published In

    AutomotiveUI '12: Proceedings of the 4th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
    October 2012
    280 pages
    ISBN:9781450317511
    DOI:10.1145/2390256
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. automotive user interfaces
    2. gesture interaction
    3. multimodal interfaces
    4. speech interaction

    Qualifiers

    • Research-article

    Conference

    AutomotiveUI '12

    Acceptance Rates

    Overall Acceptance Rate 248 of 566 submissions, 44%


    Article Metrics

    • Downloads (Last 12 months)152
    • Downloads (Last 6 weeks)9
    Reflects downloads up to 16 Nov 2024

