Abstract
Researchers have shown it is possible to develop robots that can produce recognizable emotional facial expressions [1, 2]. However, although human emotional expressions are known to be influenced by the surrounding context [7], there has been little research into the effect of context on the recognition of robot emotional expressions. The experiment reported here demonstrates that classical music can affect judgments of a robot’s emotional facial expressions. Different judgments were made depending on whether the music was emotionally congruent or incongruent with the robot’s expressions. A robot head produced sequences of expressions that were designed to demonstrate positive or negative emotions. The expressions were more likely to be recognized as intended when they occurred with music of a similar valence. Interestingly, it was observed that the robot face also influenced judgments about the classical music. Design implications for believable emotional robots are drawn.
References
Goris, K., Saldien, J., Vanderniepen, I., Lefeber, D.: The Huggable Robot Probo, a Multi-disciplinary Research Platform. In: Proceedings of the EUROBOT Conference 2008, Heidelberg, Germany, pp. 63–68 (2008)
Breazeal, C.L.: Designing Sociable Robots. A Bradford Book. The MIT Press, Cambridge (2002)
Ekman, P., Friesen, W.: Facial Action Coding System. Consulting Psychologists Press (1978)
Ekman, P., Friesen, W., Hager, J.: Facial Action Coding System, Research Nexus, Salt Lake City, Utah (2002)
Russell, J.: Reading emotions from and into faces: resurrecting a dimensional–contextual perspective. In: Russell, J., Fernandez-Dols, J. (eds.) The Psychology of Facial Expression, pp. 295–320. Cambridge University Press, Cambridge (1997)
Posner, J., Russell, J., Peterson, B.: The circumplex model of affect: an integrative approach to affective neuroscience, cognitive development, and psychopathology. Development and Psychopathology 17(3), 715–734 (2005)
Niedenthal, P.M., Kruth-Gruber, S., Ric, F.: What Information Determines the Recognition of Emotion? In: The Psychology of Emotion: Interpersonal, Experiential, and Cognitive Approaches. Principles of Social Psychology series, pp. 136–144. Psychology Press, New York (2006)
Izard, C.E.: The face of emotion. Appleton-Century-Crofts, New York (1971)
Ekman, P.: Universals and cultural differences in facial expressions of emotion. In: Cole, J.K. (ed.) Nebraska Symposium on Motivation, vol. 19, pp. 207–283. University of Nebraska Press, Lincoln (1972)
Carroll, J.M., Russell, J.A.: Do facial expressions signal specific emotions? Judging emotion from the face in context. Journal of Personality and Social Psychology 70, 205–218 (1996)
de Gelder, B., Vroomen, J.: The perception of emotions by ear and by eye. Cognition & Emotion 14, 289–311 (2000)
Hong, P., Wen, Z., Huang, T.: Real-time speech driven expressive synthetic talking faces using neural networks. IEEE Transactions on Neural Networks (April 2002)
Noël, S., Dumoulin, S., Lindgaard, G.: Interpreting Human and Avatar Facial Expressions. In: Gross, T., Gulliksen, J., Kotzé, P., Oestreicher, L., Palanque, P., Prates, R.O., Winckler, M. (eds.) INTERACT 2009. LNCS, vol. 5726, pp. 98–110. Springer, Heidelberg (2009)
Creed, C., Beale, R.: Psychological Responses to Simulated Displays of Mismatched Emotional Expressions. Interacting with Computers 20(2), 225–239 (2008)
Mower, E., Lee, S., Matarić, M.J., Narayanan, S.: Human perception of synthetic character emotions in the presence of conflicting and congruent vocal and facial expressions. In: IEEE Int. Conf. Acoustics, Speech, and Signal Processing (ICASSP 2008), Las Vegas, NV, pp. 2201–2204 (2008)
Mower, E., Matarić, M.J., Narayanan, S.: Human perception of audio-visual synthetic character emotion expression in the presence of ambiguous and conflicting information. IEEE Transactions on Multimedia 11(5) (2009)
Zhang, J., Sharkey, A.J.C.: Contextual Recognition of Robot Emotions. In: Groß, R., Alboul, L., Melhuish, C., Witkowski, M., Prescott, T.J., Penders, J. (eds.) TAROS 2011. LNCS (LNAI), vol. 6856, pp. 78–89. Springer, Heidelberg (2011)
Bradley, M.M., Lang, P.J.: The International Affective Picture System (IAPS) in the Study of Emotion and Attention. In: Coan, J.A., Allen, J.B. (eds.) The Handbook of Emotion Elicitation and Assessment, pp. 29–46 (2007)
Eich, E., Ng, J.T.W., Macaulay, D., Percy, A.D., Grebneva, I.: Combining music with thought to change mood. In: Coan, J.A., Allen, J.B. (eds.) The Handbook of Emotion Elicitation and Assessment, pp. 124–136. Oxford University Press, New York (2007)
Becker-Asano, C., Ishiguro, H.: Evaluating facial displays of emotion for the android robot Geminoid F. In: IEEE SSCI Workshop on Affective Computational Intelligence, pp. 22–29 (2011)
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Zhang, J., Sharkey, A.J.C. (2011). Listening to Sad Music While Seeing a Happy Robot Face. In: Mutlu, B., Bartneck, C., Ham, J., Evers, V., Kanda, T. (eds) Social Robotics. ICSR 2011. Lecture Notes in Computer Science(), vol 7072. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-25504-5_18
DOI: https://doi.org/10.1007/978-3-642-25504-5_18
Print ISBN: 978-3-642-25503-8
Online ISBN: 978-3-642-25504-5