How Can Autonomous Vehicles Convey Emotions to Pedestrians? A Review of Emotionally Expressive Non-Humanoid Robots
"> Figure 1
<p>Examples of non-humanoid robots from the reviewed articles [<a href="#B13-mti-05-00084" class="html-bibr">13</a>,<a href="#B19-mti-05-00084" class="html-bibr">19</a>,<a href="#B20-mti-05-00084" class="html-bibr">20</a>,<a href="#B22-mti-05-00084" class="html-bibr">22</a>,<a href="#B23-mti-05-00084" class="html-bibr">23</a>,<a href="#B26-mti-05-00084" class="html-bibr">26</a>,<a href="#B27-mti-05-00084" class="html-bibr">27</a>,<a href="#B29-mti-05-00084" class="html-bibr">29</a>,<a href="#B49-mti-05-00084" class="html-bibr">49</a>,<a href="#B54-mti-05-00084" class="html-bibr">54</a>,<a href="#B55-mti-05-00084" class="html-bibr">55</a>,<a href="#B57-mti-05-00084" class="html-bibr">57</a>].</p> ">
Abstract
1. Introduction
2. Background
2.1. Emotion in Social Robotics
2.2. Humanoid Robots vs. Non-Humanoid Robots
2.3. Current AV–Pedestrian Interaction
2.4. Why Ascribe Emotions to AVs
3. Method
3.1. Search Strategy
3.1.1. Database Selection
3.1.2. Keyword Search Procedure
3.1.3. Article Selection
3.2. Research Questions
- What emotions are commonly expressed by non-humanoid robots?
- How are the emotions displayed?
- What measures are used to evaluate the emotional expressions?
- What are the user perceptions of the emotional expressions?
4. Review of Emotionally Expressive Non-Humanoid Robots
4.1. Overview
4.2. Emotion Models
4.2.1. Categorical Models
4.2.2. Dimensional Models
4.2.3. Emotional Personas
4.3. Output Modalities
4.3.1. Visual Modalities
4.3.2. Auditory Modalities
4.3.3. Haptic Modalities
4.4. Evaluation Measures
4.4.1. Use Scenarios
4.4.2. Experimental Tasks
4.4.3. Evaluated Aspects
4.5. User Perceptions
4.5.1. Recognition of Emotional Expressions
4.5.2. Sociability
4.5.3. Contexts
5. Considerations
5.1. Considerations for Designing Emotional Expressions
5.2. Considerations for Building Affective AV–Pedestrian Interfaces
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Litman, T. Autonomous Vehicle Implementation Predictions; Victoria Transport Policy Institute: Victoria, BC, Canada, 2017.
- Rasouli, A.; Tsotsos, J.K. Autonomous vehicles that interact with pedestrians: A survey of theory and practice. IEEE Trans. Intell. Transp. Syst. 2019, 21, 900–918.
- Lagström, T.; Malmsten Lundgren, V. AVIP-Autonomous Vehicles’ Interaction with Pedestrians-An Investigation of Pedestrian-Driver Communication and Development of a Vehicle External Interface. Master’s Thesis, Chalmers University of Technology, Gothenburg, Sweden, 2016.
- Mahadevan, K.; Somanath, S.; Sharlin, E. Communicating Awareness and Intent in Autonomous Vehicle-Pedestrian Interaction. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery: New York, NY, USA, 2018; pp. 1–12.
- Nguyen, T.T.; Holländer, K.; Hoggenmueller, M.; Parker, C.; Tomitsch, M. Designing for Projection-Based Communication between Autonomous Vehicles and Pedestrians. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Utrecht, The Netherlands, 21–25 September 2019; Association for Computing Machinery: New York, NY, USA, 2019; AutomotiveUI ’19; pp. 284–294.
- Pratticò, F.G.; Lamberti, F.; Cannavò, A.; Morra, L.; Montuschi, P. Comparing State-of-the-Art and Emerging Augmented Reality Interfaces for Autonomous Vehicle-to-Pedestrian Communication. IEEE Trans. Veh. Technol. 2021, 70, 1157–1168.
- Chang, C.M.; Toda, K.; Sakamoto, D.; Igarashi, T. Eyes on a Car: An Interface Design for Communication between an Autonomous Car and a Pedestrian. In Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications; Association for Computing Machinery: New York, NY, USA, 2017; AutomotiveUI ’17; pp. 65–73.
- Löcken, A.; Golling, C.; Riener, A. How Should Automated Vehicles Interact with Pedestrians? A Comparative Analysis of Interaction Concepts in Virtual Reality. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications; Association for Computing Machinery: New York, NY, USA, 2019; AutomotiveUI ’19; pp. 262–274.
- Cauchard, J.R.; Zhai, K.Y.; Spadafora, M.; Landay, J.A. Emotion encoding in human-drone interaction. In Proceedings of the 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Christchurch, New Zealand, 7–10 March 2016; pp. 263–270.
- Picard, R.W. Affective computing: Challenges. Int. J. Hum.-Comput. Stud. 2003, 59, 55–64.
- Fong, T.; Nourbakhsh, I.; Dautenhahn, K. A survey of socially interactive robots. Robot. Auton. Syst. 2003, 42, 143–166.
- Leite, I.; Martinho, C.; Paiva, A. Social robots for long-term interaction: A survey. Int. J. Soc. Robot. 2013, 5, 291–308.
- Bretan, M.; Hoffman, G.; Weinberg, G. Emotionally expressive dynamic physical behaviors in robots. Int. J. Hum.-Comput. Stud. 2015, 78, 1–16.
- Gácsi, M.; Kis, A.; Faragó, T.; Janiak, M.; Muszyński, R.; Miklósi, Á. Humans attribute emotions to a robot that shows simple behavioural patterns borrowed from dog behaviour. Comput. Hum. Behav. 2016, 59, 411–419.
- Ritschel, H.; Aslan, I.; Mertes, S.; Seiderer, A.; André, E. Personalized synthesis of intentional and emotional non-verbal sounds for social robots. In Proceedings of the 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII), Cambridge, UK, 3–6 September 2019; pp. 1–7.
- Eyssel, F.; Hegel, F.; Horstmann, G.; Wagner, C. Anthropomorphic inferences from emotional nonverbal cues: A case study. In Proceedings of the 19th International Symposium in Robot and Human Interactive Communication, Viareggio, Italy, 13–15 September 2010; pp. 646–651.
- Löffler, D.; Schmidt, N.; Tscharn, R. Multimodal Expression of Artificial Emotion in Social Robots Using Color, Motion and Sound. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction; Association for Computing Machinery: New York, NY, USA, 2018; HRI ’18; pp. 334–343.
- Boccanfuso, L.; Kim, E.S.; Snider, J.C.; Wang, Q.; Wall, C.A.; DiNicola, L.; Greco, G.; Flink, L.; Lansiquot, S.; Ventola, P.; et al. Autonomously detecting interaction with an affective robot to explore connection to developmental ability. In Proceedings of the 2015 International Conference on Affective Computing and Intelligent Interaction (ACII), Xi’an, China, 21–24 September 2015; pp. 1–7.
- Herdel, V.; Kuzminykh, A.; Hildebrandt, A.; Cauchard, J.R. Drone in Love: Emotional Perception of Facial Expressions on Flying Robots. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery: New York, NY, USA, 2021.
- Whittaker, S.; Rogers, Y.; Petrovskaya, E.; Zhuang, H. Designing Personas for Expressive Robots: Personality in the New Breed of Moving, Speaking, and Colorful Social Home Robots. J. Hum.-Robot Interact. 2021, 10.
- Harris, J.; Sharlin, E. Exploring the affect of abstract motion in social human-robot interaction. In Proceedings of the 2011 RO-MAN, Atlanta, GA, USA, 31 July–3 August 2011; pp. 441–448.
- Hoggenmueller, M.; Chen, J.; Hespanhol, L. Emotional Expressions of Non-Humanoid Urban Robots: The Role of Contextual Aspects on Interpretations; Association for Computing Machinery: New York, NY, USA, 2020; PerDis ’20; pp. 87–95.
- Tennent, H.; Moore, D.; Ju, W. Character Actor: Design and Evaluation of Expressive Robot Car Seat Motion. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2018, 1, 1–23.
- Monceaux, J.; Becker, J.; Boudier, C.; Mazel, A. Demonstration: First Steps in Emotional Expression of the Humanoid Robot Nao. In Proceedings of the 2009 International Conference on Multimodal Interfaces; Association for Computing Machinery: New York, NY, USA, 2009; ICMI-MLMI ’09; pp. 235–236.
- Pandey, A.K.; Gelin, R. A mass-produced sociable humanoid robot: Pepper: The first machine of its kind. IEEE Robot. Autom. Mag. 2018, 25, 40–48.
- Novikova, J.; Watts, L. A Design Model of Emotional Body Expressions in Non-Humanoid Robots. In Proceedings of the Second International Conference on Human-Agent Interaction; Association for Computing Machinery: New York, NY, USA, 2014; HAI ’14; pp. 353–360.
- Song, S.; Yamada, S. Expressing Emotions through Color, Sound, and Vibration with an Appearance-Constrained Social Robot. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction; Association for Computing Machinery: New York, NY, USA, 2017; HRI ’17; pp. 2–11.
- Song, S.; Yamada, S. Designing Expressive Lights and In-Situ Motions for Robots to Express Emotions. In Proceedings of the 6th International Conference on Human-Agent Interaction; Association for Computing Machinery: New York, NY, USA, 2018; HAI ’18; pp. 222–228.
- Peng, Y.; Feng, Y.L.; Wang, N.; Mi, H. How children interpret robots’ contextual behaviors in live theatre: Gaining insights for multi-robot theatre design. In Proceedings of the 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, 31 August–4 September 2020; pp. 327–334.
- Park, S.; Healey, P.G.T.; Kaniadakis, A. Should Robots Blush? In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery: New York, NY, USA, 2021.
- Dey, D.; Habibovic, A.; Löcken, A.; Wintersberger, P.; Pfleging, B.; Riener, A.; Martens, M.; Terken, J. Taming the eHMI jungle: A classification taxonomy to guide, compare, and assess the design principles of automated vehicles’ external human-machine interfaces. Transp. Res. Interdiscip. Perspect. 2020, 7, 100174.
- Urmson, C.P.; Mahon, I.J.; Dolgov, D.A.; Zhu, J. Pedestrian Notifications. U.S. Patent 9,196,164 B1, 24 November 2015.
- Colley, M.; Belz, J.H.; Rukzio, E. Investigating the Effects of Feedback Communication of Autonomous Vehicles. In Proceedings of the 13th International Conference on Automotive User Interfaces and Interactive Vehicular Applications; Association for Computing Machinery: New York, NY, USA, 2021; AutomotiveUI ’21; pp. 263–273.
- Clamann, M.; Aubert, M.; Cummings, M.L. Evaluation of vehicle-to-pedestrian communication displays for autonomous vehicles. In Proceedings of the Transportation Research Board 96th Annual Meeting, Washington, DC, USA, 8–12 January 2017.
- Hesenius, M.; Börsting, I.; Meyer, O.; Gruhn, V. Don’t Panic! Guiding Pedestrians in Autonomous Traffic with Augmented Reality. In Proceedings of the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct; Association for Computing Machinery: New York, NY, USA, 2018; MobileHCI ’18; pp. 261–268.
- Mairs, J. Umbrellium Develops Interactive Road Crossing that Only Appears when Needed. 2017. Available online: https://www.dezeen.com/2017/10/12/umbrellium-develops-interactive-road-crossing-that-only-appears-when-needed-technology/ (accessed on 17 December 2021).
- Colley, M.; Rukzio, E. A Design Space for External Communication of Autonomous Vehicles. In Proceedings of the 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications; Association for Computing Machinery: New York, NY, USA, 2020; AutomotiveUI ’20; pp. 212–222.
- Newcomb, A. Humans Harass and Attack Self-Driving Waymo Cars. NBC News. 22 December 2018. Available online: https://www.nbcnews.com/tech/innovation/humans-harass-attack-self-driving-waymo-cars-n950971 (accessed on 11 October 2021).
- Connor, S. First Self-Driving Cars Will Be Unmarked So That Other Drivers Don’t Try to Bully Them. The Guardian. 30 October 2016. Available online: https://www.theguardian.com/technology/2016/oct/30/volvo-self-driving-car-autonomous (accessed on 11 October 2021).
- Bazilinskyy, P.; Sakuma, T.; de Winter, J. What driving style makes pedestrians think a passing vehicle is driving automatically? Appl. Ergon. 2021, 95, 103428.
- Jayaraman, S.K.; Creech, C.; Tilbury, D.M.; Yang, X.J.; Pradhan, A.K.; Tsui, K.M.; Robert, L.P., Jr. Pedestrian trust in automated vehicles: Role of traffic signal and AV driving behavior. Front. Robot. AI 2019, 6, 117.
- Garber, M. The Revolution Will Be Adorable: Why Google’s Cars Are So Cute. The Atlantic. 29 May 2014. Available online: https://www.theatlantic.com/technology/archive/2014/05/the-revolution-will-be-adorable-why-googles-driverless-cars-are-so-cute/371699/ (accessed on 11 October 2021).
- D’Onfro, J. Why Google Made Its Self-Driving Car Look So Cute. Business Insider Australia. 24 December 2014. Available online: https://www.businessinsider.com.au/google-self-driving-car-why-its-so-cute-2014-12?r=US&IR=T (accessed on 11 October 2021).
- Sood, G. Honda 2040 NIKO Comes with A Tiny AI Assistant, Taking the Car from A Vehicle to Your Friend! Yanko Design. 21 August 2021. Available online: https://www.yankodesign.com/2021/08/21/honda-2040-niko-comes-with-a-tiny-ai-assistant-taking-the-car-from-a-vehicle-to-your-friend/ (accessed on 11 October 2021).
- Hoffman, G.; Zuckerman, O.; Hirschberger, G.; Luria, M.; Shani Sherman, T. Design and Evaluation of a Peripheral Robotic Conversation Companion. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction; Association for Computing Machinery: New York, NY, USA, 2015; HRI ’15; pp. 3–10.
- Braun, M.; Weber, F.; Alt, F. Affective Automotive User Interfaces–Reviewing the State of Driver Affect Research and Emotion Regulation in the Car. ACM Comput. Surv. 2021, 54.
- Sadeghian, S.; Hassenzahl, M.; Eckoldt, K. An Exploration of Prosocial Aspects of Communication Cues between Automated Vehicles and Pedestrians. In Proceedings of the 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications; Association for Computing Machinery: New York, NY, USA, 2020; AutomotiveUI ’20; pp. 205–211.
- Lanzer, M.; Babel, F.; Yan, F.; Zhang, B.; You, F.; Wang, J.; Baumann, M. Designing Communication Strategies of Autonomous Vehicles with Pedestrians: An Intercultural Study. In Proceedings of the 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications; Association for Computing Machinery: New York, NY, USA, 2020; AutomotiveUI ’20; pp. 122–131.
- Tan, H.; Tiab, J.; Šabanović, S.; Hornbæk, K. Happy Moves, Sad Grooves: Using Theories of Biological Motion and Affect to Design Shape-Changing Interfaces. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems; Association for Computing Machinery: New York, NY, USA, 2016; DIS ’16; pp. 1282–1293.
- Hieida, C.; Matsuda, H.; Kudoh, S.; Suehiro, T. Action elements of emotional body expressions for flying robots. In Proceedings of the 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Christchurch, New Zealand, 7–10 March 2016; pp. 439–440.
- Harzing, A. Publish or Perish. 2007. Available online: https://harzing.com/resources/publish-or-perish (accessed on 17 December 2021).
- Shi, Y.; Yan, X.; Ma, X.; Lou, Y.; Cao, N. Designing Emotional Expressions of Conversational States for Voice Assistants: Modality and Engagement. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery: New York, NY, USA, 2018; CHI EA ’18; pp. 1–6.
- Frederiksen, M.R.; Stoy, K. On the causality between affective impact and coordinated human-robot reactions. In Proceedings of the 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, 31 August–4 September 2020; pp. 488–494.
- Bucci, P.; Zhang, L.; Cang, X.L.; MacLean, K.E. Is It Happy? Behavioural and Narrative Frame Complexity Impact Perceptions of a Simple Furry Robot’s Emotions. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery: New York, NY, USA, 2018; pp. 1–11.
- Chase, E.D.Z.; Follmer, S. Differences in Haptic and Visual Perception of Expressive 1DoF Motion. In Proceedings of the ACM Symposium on Applied Perception 2019; Association for Computing Machinery: New York, NY, USA, 2019; SAP ’19.
- Frederiksen, M.R.; Stoy, K. Robots can defuse high-intensity conflict situations. In Proceedings of the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 25–29 October 2020; pp. 11376–11382.
- Kim, L.H.; Follmer, S. SwarmHaptics: Haptic Display with Swarm Robots. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery: New York, NY, USA, 2019; pp. 1–13.
- Sato, D.; Sasagawa, M.; Niijima, A. Affective Touch Robots with Changing Textures and Movements. In Proceedings of the 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, 31 August–4 September 2020; pp. 1–6.
- Ekman, P.; Friesen, W.V. Constants across cultures in the face and emotion. J. Personal. Soc. Psychol. 1971, 17, 124.
- Russell, J.A. A circumplex model of affect. J. Personal. Soc. Psychol. 1980, 39, 1161.
- Mehrabian, A. Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in temperament. Curr. Psychol. 1996, 14, 261–292.
- Goldberg, L.R. The development of markers for the Big-Five factor structure. Psychol. Assess. 1992, 4, 26.
- Sharma, M.; Hildebrandt, D.; Newman, G.; Young, J.E.; Eskicioglu, R. Communicating affect via flight path: Exploring use of the Laban effort system for designing affective locomotion paths. In Proceedings of the 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Tokyo, Japan, 3–6 March 2013; pp. 293–300.
- Lenz, E.; Diefenbach, S.; Hassenzahl, M. Exploring Relationships between Interaction Attributes and Experience. In Proceedings of the 6th International Conference on Designing Pleasurable Products and Interfaces; Association for Computing Machinery: New York, NY, USA, 2013; DPPI ’13; pp. 126–135.
- Read, R.; Belpaeme, T. People interpret robotic non-linguistic utterances categorically. Int. J. Soc. Robot. 2016, 8, 31–50.
- Bradley, M.M.; Lang, P.J. Measuring emotion: The self-assessment manikin and the semantic differential. J. Behav. Ther. Exp. Psychiatry 1994, 25, 49–59.
- Tran, T.T.M.; Parker, C.; Tomitsch, M. A Review of Virtual Reality Studies on Autonomous Vehicle–Pedestrian Interaction. IEEE Trans. Hum.-Mach. Syst. 2021, 51, 641–652.
- Epke, M.R.; Kooijman, L.; De Winter, J.C. I See Your Gesture: A VR-Based Study of Bidirectional Communication between Pedestrians and Automated Vehicles. J. Adv. Transp. 2021. Available online: https://www.hindawi.com/journals/jat/2021/5573560/ (accessed on 27 April 2021).
- Lee, Y.M.; Madigan, R.; Giles, O.; Garach-Morcillo, L.; Markkula, G.; Fox, C.; Camara, F.; Rothmueller, M.; Vendelbo-Larsen, S.A.; Rasmussen, P.H.; et al. Road users rarely use explicit communication when interacting in today’s traffic: Implications for automated vehicles. Cogn. Technol. Work 2021, 23, 367–380.
- Pillai, A. Virtual Reality Based Study to Analyse Pedestrian Attitude towards Autonomous Vehicles. Master’s Thesis, KTH Royal Institute of Technology, Stockholm, Sweden, 2017.
- Fischer, K.; Jung, M.; Jensen, L.C.; aus der Wieschen, M.V. Emotion expression in HRI–when and why. In Proceedings of the 2019 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019; pp. 29–38.
- Albastaki, A.; Hoggenmüller, M.; Robinson, F.A.; Hespanhol, L. Augmenting Remote Interviews through Virtual Experience Prototypes; Association for Computing Machinery: New York, NY, USA, 2020; OzCHI ’20; pp. 78–86.
| Year | Authors | Robot Prototype | Emotion Models | Output Modalities |
|---|---|---|---|---|
| 2011 | Harris and Sharlin [21] | The Stem | angry, happy, sad, etc. | movement |
| 2014 | Novikova and Watts [26] | a Lego robot based on a Phobot robot’s design | Mehrabian’s model | movement |
| 2015 | Boccanfuso et al. [18] | Sphero | angry, fearful, happy, sad | color, movement, sound |
| 2015 | Bretan et al. [13] | Shimi | Ekman’s basic emotions, Russell’s circumplex model | movement |
| 2015 | Hoffman et al. [45] | Kip1 | calm, curious, scared | movement |
| 2016 | Cauchard et al. [9] | AR.Drone 2.0 by Parrot | personalities from Walt Disney’s Seven Dwarfs and Peyo’s Smurfs | movement |
| 2016 | Gácsi et al. [14] | PeopleBot | Ekman’s basic emotions | movement, sound |
| 2016 | Hieida et al. [50] | “Rolling Spider” drone by Parrot | anger, joy, pleasure, and sadness from a Japanese idiom | movement |
| 2016 | Tan et al. [49] | a shape-changing interface | Ekman’s basic emotions, Mehrabian’s model | movement |
| 2017 | Song and Yamada [27] | Maru | Russell’s circumplex model | color, movement, sound |
| 2018 | Bucci et al. [54] | FlexiBit with a fur cover | emotional valence | haptics, movement |
| 2018 | Löffler et al. [17] | a simple, wheeled robot probe | Ekman’s basic emotions | color, movement, sound |
| 2018 | Shi et al. [52] | a smartphone-based voice assistant | Russell’s circumplex model | facial expression, movement |
| 2018 | Song and Yamada [28] | Roomba | Ekman’s basic emotions, Russell’s circumplex model | color, movement |
| 2018 | Tennent et al. [23] | 3D animation for a robot car seat | aggressive, confident, cool, excited, quirky | movement |
| 2019 | Chase and Follmer [55] | a device with a visible and graspable handle | Mehrabian’s model | haptics, movement |
| 2019 | Kim and Follmer [57] | SwarmHaptics | Ekman’s basic emotions | haptics, movement |
| 2019 | Ritschel et al. [15] | BärBot | happiness, sadness, etc. | sound |
| 2020 | Frederiksen and Stoy [53] | three Thymio II robots | fear | movement, sound |
| 2020 | Frederiksen and Stoy [56] | Affecta | remorse | color, facial expression, movement, sound |
| 2020 | Hoggenmueller et al. [22] | Woodie | Ekman’s basic emotions | color, movement |
| 2020 | Peng et al. [29] | three small robot characters for a robot theater | Ekman’s basic emotions | facial expression, movement |
| 2020 | Sato et al. [58] | tabletop, wheeled haptic robots | Russell’s circumplex model | haptics |
| 2021 | Herdel et al. [19] | animated drone with DJI Phantom 3 body | Ekman’s basic emotions | facial expression |
| 2021 | Whittaker et al. [20] | Olly | “Big Five” personality theory | color, movement, sound |
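To make the table's mapping from emotion models to output modalities more concrete, the sketch below shows one hypothetical way a dimensional model such as Russell's circumplex could drive color and motion outputs on a non-humanoid robot. It is a minimal illustration only, not an implementation from any of the reviewed studies; the function name, the hue range, and the blink-rate and speed constants are all invented for the example.

```python
import colorsys

def circumplex_to_outputs(valence: float, arousal: float) -> dict:
    """Map a point on Russell's circumplex (both axes in [-1, 1]) to
    illustrative light and motion parameters for a non-humanoid robot."""
    # Clamp inputs to the model's nominal range.
    v = max(-1.0, min(1.0, valence))
    a = max(-1.0, min(1.0, arousal))

    # Hue sweeps from red (hue 0.0, negative valence) to green
    # (hue ~0.33, positive valence); arousal drives brightness.
    hue = (v + 1.0) / 2.0 * 0.33
    brightness = 0.4 + 0.6 * (a + 1.0) / 2.0
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, brightness)

    return {
        "rgb": (round(r * 255), round(g * 255), round(b * 255)),
        "blink_hz": 0.5 + 1.5 * (a + 1.0) / 2.0,     # calmer -> slower pulse
        "speed_scale": 0.2 + 0.8 * (a + 1.0) / 2.0,  # calmer -> slower motion
    }

# Example: a "happy" state (high valence, moderately high arousal).
print(circumplex_to_outputs(valence=0.8, arousal=0.5))
```

A categorical model (e.g., Ekman's basic emotions) would instead be a lookup table from emotion label to such output parameters; the dimensional version shown here varies continuously, which is one reason several reviewed studies combine both.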
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, Y.; Hespanhol, L.; Tomitsch, M. How Can Autonomous Vehicles Convey Emotions to Pedestrians? A Review of Emotionally Expressive Non-Humanoid Robots. Multimodal Technol. Interact. 2021, 5, 84. https://doi.org/10.3390/mti5120084