How Robots Influence Humans: A Survey of Nonverbal Communication in Social Human–Robot Interaction

Published in: International Journal of Social Robotics (2019)

Abstract

As robots become more prevalent in society, investigating the interactions between humans and robots is important to ensure that these robots adhere to the social norms and expectations of human users. In particular, it is important to explore exactly how the nonverbal behaviors of robots influence humans, given the dominant role nonverbal communication plays in social interactions. In this paper, we present a detailed survey on this topic focusing on four main nonverbal communication modes: kinesics, proxemics, haptics, and chronemics, as well as multimodal combinations of these modes. We uniquely investigate findings that span these nonverbal modes and how they influence humans in four distinct ways: shifting cognitive framing, eliciting emotional responses, triggering specific behavioral responses, and improving task performance. A detailed discussion provides insights into nonverbal robot behaviors with respect to the aforementioned influence types and outlines future research directions in this field.



Funding

This work was funded by the AGE-WELL Networks of Centres of Excellence (NCE) program, the Canada Research Chairs (CRC) program, the Vanier Canada Graduate Scholarship (CGS) program, and the Ontario Graduate Scholarship (OGS) program.

Author information

Corresponding author

Correspondence to Shane Saunderson.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Saunderson, S., Nejat, G. How Robots Influence Humans: A Survey of Nonverbal Communication in Social Human–Robot Interaction. Int J of Soc Robotics 11, 575–608 (2019). https://doi.org/10.1007/s12369-019-00523-0
