Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, 2021
The sound a robot or automated system makes, and the sounds it listens for in our shared acoustic environment, can greatly expand its contextual understanding and shape its behaviors to the interactions it is trying to perform. People convey significant information with sound in interpersonal communication in social contexts. Paralinguistic information about where we are, how loud we are speaking, or whether we sound happy, sad, or upset is relevant for a robot that aims to adapt its interactions to be socially appropriate. Similarly, the qualities of the sound an object makes can change how people perceive that object: they can alter whether or not it attracts attention, interrupts other interactions, or reinforces or contradicts an emotional expression, and as such should be aligned with the designer's intention for the object. In this tutorial, we will introduce participants to software and design methods that help robots recognize and generate sound for human-robot interaction (HRI). Using open-source tools and methods designers can apply to their own robots, we seek to increase the application of sound in robot design and stimulate HRI research on robot sound.
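The abstract above mentions loudness as one paralinguistic cue a robot might attend to. As a minimal sketch (not from the tutorial itself; the function name `rms_dbfs` and the synthetic test frame are illustrative assumptions), the usual starting point is the RMS level of an audio frame expressed in decibels relative to full scale:

```python
import math

def rms_dbfs(samples, full_scale=1.0):
    """Root-mean-square level of an audio frame, in dB relative to full scale.

    A rough proxy for perceived loudness; practical paralinguistic pipelines
    add perceptual weighting, but RMS level is the common first feature.
    """
    if not samples:
        return float("-inf")
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms == 0.0:
        return float("-inf")  # digital silence
    return 20.0 * math.log10(rms / full_scale)

# Synthetic frame: ten periods of a 100 Hz sine at half amplitude, 16 kHz rate.
frame = [0.5 * math.sin(2 * math.pi * 100 * n / 16000) for n in range(1600)]
print(round(rms_dbfs(frame), 1))  # about -9.0 dBFS (sine RMS = amplitude / sqrt(2))
```

A robot could threshold such a value per frame to decide, for instance, whether someone nearby is whispering or shouting before choosing a socially appropriate response volume.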
Automatic doors exemplify the challenges of designing emotionally welcoming interactive systems—a critical issue in the design of any system of incidental use. We attempt to broaden the automatic door's repertoire of signals by examining how people respond to a variety of "door gestures" designed to offer different levels of approachability. In a pilot study, participants (N=48) who walked past a physical gesturing door were asked to fill out a questionnaire about that experience. In our follow-up study, participants (N=51) viewed 12 video clips depicting a person walking toward and past an automatic door that moved with different speeds and trajectories. In both studies, our Likert-scale measures and open-ended responses indicated significant uniformity in participants' interpretation of the behaviour of the door prototypes. The participants saw these motions as gestures with human-like characteristics such as cognition and intent. Our work suggests that even in non-anthropomorphic…
Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2018
How will pedestrians from different regions interact with an approaching autonomous vehicle? Understanding differences in pedestrian culture and responses can help inform autonomous cars how to behave appropriately in different regional contexts. We conducted a field study comparing the behavioral response of pedestrians between metropolitan Mexico City (N=113) and Colima, a smaller coastal city (N=81). We hid a driver in a car seat costume as a Wizard-of-Oz prototype to evoke pedestrian interaction behavior at a crosswalk or street. Pedestrian interactions were coded for crossing decision, crossing pathway, pacing, and observational behavior. Most distinctly, pedestrians in Mexico City kept their pace and more often crossed in front of the vehicle, while those in Colima stopped in front of the car more often.
Conventional interaction design methodologies cannot fully encompass the redefined relationships between humans and increasingly intelligent technology. New methods are necessary to address interaction at early stages in the design process. Both design metaphors and enactment techniques have been suggested, and this paper explores whether a combination of these can support the design of future interactions. Across three workshops, 27 participants utilised the combination to design the interaction with an automated driving system. Analysis shows that the method combination supported imagining and designing: metaphors aided the creation of a joint conceptual vision of the relationship, and the enactment created tangible experiences and contextualisation of the design concepts. Jointly, the methods brought multi-disciplinary teams together around a shared vision, acting as a shared language and as enacted representations of insights that could be engaged with and experienced together.
2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), 2016
This tutorial is a hands-on introduction to human-centered design topics and practices for human-robot interaction. It is intended for researchers with a variety of backgrounds, particularly those with little or no prior experience in design. In the morning, participants will learn about user needs and needfinding as ways to understand the stakeholders in research outcomes, guide the selection of participants, and serve as possible measures of success. We then focus on design sketching, including ways to represent objects, people, and their interactions through storyboards. Design sketching is not intended to be art, but rather a way to develop and build upon ideas with oneself and quickly communicate with colleagues. In the afternoon, participants will use tools and materials and learn techniques for lightweight physical prototyping and improvisation. Participants will build a small paper robot (not actuated) of their own design to practice puppeteering, explore bodily movement, and prototype interactions.
Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications - AutomotiveUI '14, 2014
As autonomous cars gain popularity, studying user experience (UX) in autonomous cars becomes increasingly important. Because validated UX measures specific to autonomous driving have not been developed, we identified several factors of interest common to researchers working at the intersection of autonomous driving, driving simulators, and user experience. We collected corresponding validated questionnaires to create a comprehensive inventory. We based our selection on attributes such as length of the questionnaires, validation, and prevalence of use, such that our work may contribute to an easy and fast setup of high-quality questionnaires for the study of UX in autonomous cars. In this extended abstract, we recap the factors we have inventoried.
Adjunct Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2014
How should information be exchanged between drivers and their cars? What types of information should be exchanged? These questions have become increasingly important as research into autonomous car interfaces continues to develop. To examine these questions, a varied and complex simulation course has been created for the Stanford Driving Simulator, together with a Wizard of Oz station for operators to manipulate the driving simulation. Outlined in this paper is a Wizard of Oz study that observes and records the effects of "Push vs. Pull" of driving information from the autonomous car, as well as the use of "What vs. Why" information. The objective of this research is to provide a set of psychological principles to guide driver-vehicle interface design in providing effective, real-time support for drivers of increasingly autonomous vehicles.
Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications - AutomotiveUI '14, 2014
New computation and sensing capabilities in road vehicles present possibilities for advanced driver assistance systems that can increase safety and efficiency, provided drivers trust them appropriately and use them properly. The two-stage 'trust fall' is a way to study trust in automated systems by testing whether trust established in normal circumstances transfers to trust under extreme circumstances, which will be essential for the successful deployment of new automotive systems. Understanding the mental models drivers create of advanced systems, and how they use those mental models to share control with the computer, will be crucial to successful design.
Adjunct Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2014
Most research on cognitive workload in driving focuses on reducing workload to improve driving performance. In this paper, we discuss the cognitive load of drivers of autonomous cars and their performance under multiple cognitive loads. Our previous studies have indicated that low to no workload is likely to induce drowsiness in drivers of autonomous vehicles. We hypothesize that there is an optimal cognitive load for a driver during autonomous driving that yields the best performance after transfer of control from autonomous to manual. We propose an experiment to study the cognitive load on the driver of a simulated autonomous car and the effects on manual driving performance. We also describe our use of biometric devices to obtain physiological measures indicative of cognitive workload.
Papers by Wendy Ju