DOI: 10.1145/2799250.2799258

Light my way: visualizing shared gaze in the car

Published: 01 September 2015

Abstract

In demanding driving situations, the front-seat passenger can become a supporter of the driver by, e.g., monitoring the scene or providing hints about upcoming hazards or turning points. Fast and efficient communication of such spatial information can help the driver react properly and with more foresight. As shown in previous research, this spatial referencing can be facilitated by providing the driver with a visualization of the front-seat passenger's gaze. In this paper, we focus on the question of how the gaze should be visualized for the driver, taking into account the feasibility of implementation in a real car. We present the results of a driving simulator study in which we compared an LED visualization (glowing LEDs on an LED stripe mounted at the bottom of the windshield, indicating the horizontal position of the gaze) with a visualization of the gaze as a dot in the simulated environment. Our results show that the LED visualization comes with benefits with regard to driver distraction, but also bears disadvantages with regard to accuracy and control for the front-seat passenger.
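The paper compares this LED stripe with an in-scene gaze dot but does not spell out the mapping in code. As a minimal illustrative sketch (in Python), the passenger's horizontal gaze position could be mapped onto the stripe roughly as follows; the LED count, the normalized gaze coordinate supplied by the eye tracker, and the smoothing factor are assumptions made for this example, not details taken from the study.

# Illustrative sketch only (not from the paper): map the front-seat passenger's
# horizontal gaze position onto an LED stripe mounted at the bottom of the
# windshield. NUM_LEDS, the 0.0-1.0 gaze coordinate, and SMOOTHING are assumed.

NUM_LEDS = 60       # assumed number of LEDs on the stripe
SMOOTHING = 0.3     # assumed low-pass factor to damp jitter in raw gaze samples

_smoothed_x = 0.5   # last smoothed horizontal position (0.0 = far left, 1.0 = far right)

def gaze_to_led_index(gaze_x: float) -> int:
    """Convert a normalized horizontal gaze coordinate into the index of the LED to light."""
    global _smoothed_x
    gaze_x = min(max(gaze_x, 0.0), 1.0)                        # clamp to the windshield span
    _smoothed_x = (1 - SMOOTHING) * _smoothed_x + SMOOTHING * gaze_x
    return min(int(_smoothed_x * NUM_LEDS), NUM_LEDS - 1)

if __name__ == "__main__":
    print(gaze_to_led_index(0.65))   # passenger looks slightly right of center -> LED ~32

The low-pass step is only one plausible way to keep the lit LED from flickering with every saccade; the study itself may have handled gaze jitter differently.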




    Published In

    AutomotiveUI '15: Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
    September 2015
    338 pages
    ISBN: 9781450337366
    DOI: 10.1145/2799250
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 01 September 2015

    Author Tags

    1. LED
    2. collaboration
    3. driving
    4. shared gaze
    5. spatial referencing
    6. visualization

    Qualifiers

    • Research-article

    Conference

    AutomotiveUI '15

    Acceptance Rates

    AutomotiveUI '15 paper acceptance rate: 38 of 80 submissions (48%)
    Overall acceptance rate: 248 of 566 submissions (44%)

    Cited By

    • (2023) Is Users' Trust during Automated Driving Different When Using an Ambient Light HMI, Compared to an Auditory HMI? Information 14:5, 260. DOI: 10.3390/info14050260. Online publication date: 27-Apr-2023.
    • (2021) Passenger's State of Mind: Future Narratives for Semi-Autonomous Cars. 13th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 182-185. DOI: 10.1145/3473682.3478679. Online publication date: 9-Sep-2021.
    • (2020) Increasing User Experience and Trust in Automated Vehicles via an Ambient Light Display. 22nd International Conference on Human-Computer Interaction with Mobile Devices and Services, 1-10. DOI: 10.1145/3379503.3403567. Online publication date: 5-Oct-2020.
    • (2020) Effects of Depth Information on Visual Target Identification Task Performance in Shared Gaze Environments. IEEE Transactions on Visualization and Computer Graphics 26:5, 1934-1944. DOI: 10.1109/TVCG.2020.2973054. Online publication date: May-2020.
    • (2020) Design for Luxury Front-Seat Passenger Infotainment Systems with Experience Prototyping through VR. International Journal of Human–Computer Interaction 36:18, 1714-1733. DOI: 10.1080/10447318.2020.1785150. Online publication date: 15-Jul-2020.
    • (2019) Driving Together Across Vehicle. International Journal of Mobile Human Computer Interaction 11:2, 58-74. DOI: 10.4018/IJMHCI.2019040104. Online publication date: 1-Apr-2019.
    • (2019) From Manual Driving to Automated Driving. Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 70-90. DOI: 10.1145/3342197.3344529. Online publication date: 21-Sep-2019.
    • (2019) Usability and UX of a Gaze Interaction Tool for Front Seat Passengers. Proceedings of Mensch und Computer 2019, 677-681. DOI: 10.1145/3340764.3344890. Online publication date: 8-Sep-2019.
    • (2019) Shared Gaze While Driving: How Drivers Can Be Supported by an LED-Visualization of the Front-Seat Passenger's Gaze. Human-Computer Interaction – INTERACT 2019, 329-350. DOI: 10.1007/978-3-030-29384-0_21. Online publication date: 2-Sep-2019.
    • (2018) Follow Me. Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 176-187. DOI: 10.1145/3239060.3239088. Online publication date: 23-Sep-2018.
