Free-hand pointing for identification and interaction with distant objects

Published: 28 October 2013

Abstract

In this paper, we investigate pointing as a lightweight form of gestural interaction in cars. In a pre-study, we demonstrate the technical feasibility of reliable pointing detection with a depth camera, achieving a recognition rate of 96% in the lab. In a subsequent in-situ study, we let drivers point at objects inside and outside the car while driving through a city. Across three usage scenarios, we studied how pointing influenced their driving, both objectively and subjectively. Drivers compensated for the distraction from the driving task by regulating their speed, and pointing did not negatively influence driving behaviour. Our participants considered pointing a more desirable interaction technique than current controller-based interaction and identified a number of additional promising use cases for pointing in the car.
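
To make the idea concrete, the sketch below illustrates one common way a pointing system can map a detected gesture to a target: cast a ray through tracked 3D joints (here, shoulder to hand) and select the known object whose bearing deviates least from that ray. This is a minimal, hypothetical illustration and not the recognition pipeline evaluated in the paper; the joint coordinates, target positions, and the 10-degree angular tolerance are invented for the example.

import numpy as np

def pointing_ray(shoulder, hand):
    # Approximate the pointing ray by the shoulder-to-hand vector.
    # Other definitions (head-to-fingertip, elbow-to-hand) are equally plausible.
    direction = hand - shoulder
    norm = np.linalg.norm(direction)
    if norm == 0.0:
        raise ValueError("shoulder and hand coincide; pointing direction undefined")
    return shoulder, direction / norm

def closest_target(origin, direction, targets, max_angle_deg=10.0):
    # Return the label of the target whose bearing deviates least from the ray,
    # or None if no target lies within the angular tolerance.
    best_label, best_angle = None, max_angle_deg
    for label, position in targets.items():
        to_target = position - origin
        to_target = to_target / np.linalg.norm(to_target)
        angle = np.degrees(np.arccos(np.clip(np.dot(direction, to_target), -1.0, 1.0)))
        if angle < best_angle:
            best_label, best_angle = label, angle
    return best_label

# Hypothetical joint positions (metres, camera coordinates) and in-car targets.
shoulder = np.array([0.20, 0.10, 0.60])
hand = np.array([0.35, 0.15, 0.30])
targets = {
    "side mirror": np.array([0.90, 0.30, -0.60]),
    "navigation display": np.array([0.55, 0.25, -0.45]),
}
origin, direction = pointing_ray(shoulder, hand)
print(closest_target(origin, direction, targets))  # -> "side mirror" in this made-up scene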

Information

    Published In

    AutomotiveUI '13: Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
    October 2013
    281 pages
    ISBN:9781450324786
    DOI:10.1145/2516540
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Sponsors

• Eindhoven University of Technology, Department of Industrial Design

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 28 October 2013

    Author Tags

    1. camera-based tracking
    2. gesture interaction
    3. pointing

    Qualifiers

    • Research-article

    Conference

    AutomotiveUI '13
Sponsor: Eindhoven University of Technology, Department of Industrial Design

    Acceptance Rates

    AutomotiveUI '13 Paper Acceptance Rate 41 of 67 submissions, 61%;
    Overall Acceptance Rate 248 of 566 submissions, 44%

Bibliometrics

Article Metrics

• Downloads (last 12 months): 45
• Downloads (last 6 weeks): 15

Reflects downloads up to 28 Sep 2024

Citations

Cited By
• (2024) Looking for a better fit? An Incremental Learning Multimodal Object Referencing Framework adapting to Individual Drivers. Proceedings of the 29th International Conference on Intelligent User Interfaces, 1-13. DOI: 10.1145/3640543.3645152. Online publication date: 18-Mar-2024.
• (2023) Effects of Urgency and Cognitive Load on Interaction in Highly Automated Vehicles. Proceedings of the ACM on Human-Computer Interaction, 7(MHCI), 1-20. DOI: 10.1145/3604254. Online publication date: 13-Sep-2023.
• (2023) It’s all about you: Personalized in-Vehicle Gesture Recognition with a Time-of-Flight Camera. Proceedings of the 15th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 234-243. DOI: 10.1145/3580585.3607153. Online publication date: 18-Sep-2023.
• (2023) Assessing Augmented Reality Selection Techniques for Passengers in Moving Vehicles: A Real-World User Study. Proceedings of the 15th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 22-31. DOI: 10.1145/3580585.3607152. Online publication date: 18-Sep-2023.
• (2023) Towards Adaptive User-centered Neuro-symbolic Learning for Multimodal Interaction with Autonomous Systems. Proceedings of the 25th International Conference on Multimodal Interaction, 689-694. DOI: 10.1145/3577190.3616121. Online publication date: 9-Oct-2023.
• (2023) A Multi-type Classifier Ensemble for Detecting Fake Reviews Through Textual-based Feature Extraction. ACM Transactions on Internet Technology, 23(1), 1-24. DOI: 10.1145/3568676. Online publication date: 5-Apr-2023.
• (2023) Binsec/Rel: Symbolic Binary Analyzer for Security with Applications to Constant-Time and Secret-Erasure. ACM Transactions on Privacy and Security, 26(2), 1-42. DOI: 10.1145/3563037. Online publication date: 14-Apr-2023.
• (2023) User-Device-Interaction Model: A Multimodal Interaction Evaluation System Based on Analytic Hierarchy Process. Human-Computer Interaction, 181-199. DOI: 10.1007/978-3-031-35596-7_13. Online publication date: 9-Jul-2023.
• (2022) A Systematic Evaluation of Solutions for the Final 100m Challenge of Highly Automated Vehicles. Proceedings of the ACM on Human-Computer Interaction, 6(MHCI), 1-19. DOI: 10.1145/3546713. Online publication date: 20-Sep-2022.
• (2022) Adaptive User-Centered Multimodal Interaction towards Reliable and Trusted Automotive Interfaces. Proceedings of the 2022 International Conference on Multimodal Interaction, 690-695. DOI: 10.1145/3536221.3557034. Online publication date: 7-Nov-2022.