DOI: 10.1145/3205873.3205874

Exploring Vibrotactile and Peripheral Cues for Spatial Attention Guidance

Published: 06 June 2018

Abstract

Situation awareness is key for decision making in monitoring and control rooms. Given the often spacious and complex environments (e.g., ship bridges), simple alarms are not sufficient for attention guidance. In our work, we explore shifting attention towards the location of relevant entities in large cyber-physical systems. To this end, we used pervasive displays: tactile displays on both upper arms and a peripheral display. With these displays, we investigated attention shifting in a seated and a standing scenario. In a first user study, we evaluated four distinct cue patterns for each on-body display, testing seated monitoring limited to 90° in front of the user. In a second study, we continued with the two patterns from the first study that were perceived as least and most urgent, and investigated standing monitoring in a 360° environment. We found that tactile cues led to shorter arousal times than visual cues, whereas attention shifts were faster for visual cues than for tactile cues.
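To make the cue-mapping idea concrete, the sketch below shows one plausible way to translate a target's bearing relative to the user into a vibrotactile cue on the left or right upper arm and a horizontal position on a wrap-around peripheral display. This is an illustrative assumption only: the function and parameter names (tactile_cue, peripheral_cue_position, display_width_px) are hypothetical, and the paper's actual apparatus and cue patterns are those defined and evaluated in the two user studies.

```python
# Illustrative sketch only (not the paper's implementation): map a target
# bearing (degrees, 0 = straight ahead, positive = clockwise) to a
# hypothetical two-actuator upper-arm tactile cue and a position on an
# assumed wrap-around peripheral display.

def normalize_bearing(bearing_deg: float) -> float:
    """Wrap a bearing into the range (-180, 180]."""
    wrapped = bearing_deg % 360.0
    return wrapped - 360.0 if wrapped > 180.0 else wrapped

def tactile_cue(bearing_deg: float) -> str:
    """Pick the arm on the side the target lies on (assumed mapping)."""
    bearing = normalize_bearing(bearing_deg)
    return "RIGHT_UPPER_ARM" if bearing >= 0 else "LEFT_UPPER_ARM"

def peripheral_cue_position(bearing_deg: float, display_width_px: int = 1920) -> int:
    """Map a bearing in the 360-degree surround to a horizontal pixel
    position on the peripheral display (assumed linear layout)."""
    bearing = normalize_bearing(bearing_deg)
    fraction = (bearing + 180.0) / 360.0   # 0.0 = far left, 1.0 = far right
    return int(fraction * (display_width_px - 1))

if __name__ == "__main__":
    for bearing in (-135.0, -20.0, 45.0, 170.0):
        print(bearing, tactile_cue(bearing), peripheral_cue_position(bearing))
```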



    Published In

    PerDis '18: Proceedings of the 7th ACM International Symposium on Pervasive Displays
    June 2018
    197 pages

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. Attention Guidance
    2. Light
    3. Peripheral Vision
    4. Vibro-tactile

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    PerDis '18

    Acceptance Rates

    PerDis '18 Paper Acceptance Rate: 22 of 36 submissions, 61%
    Overall Acceptance Rate: 213 of 384 submissions, 55%



