Listening to Sad Music While Seeing a Happy Robot Face

  • Conference paper
Social Robotics (ICSR 2011)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7072)

Abstract

Researchers have shown that it is possible to develop robots that can produce recognizable emotional facial expressions [1, 2]. However, although the interpretation of human emotional expressions is known to be influenced by the surrounding context [7], there has been little research into the effect of context on the recognition of robot emotional expressions. The experiment reported here demonstrates that classical music can affect judgments of a robot’s emotional facial expressions: different judgments were made depending on whether the music was emotionally congruent or incongruent with the robot’s expressions. A robot head produced sequences of expressions designed to convey positive or negative emotions, and these expressions were more likely to be recognized as intended when they were accompanied by music of a similar valence. Interestingly, the robot face also influenced judgments about the classical music. Design implications for believable emotional robots are drawn.

References

  1. Goris, K., Saldien, J., Vanderniepen, I., Lefeber, D.: The Huggable Robot Probo, a Multi-disciplinary Research Platform. In: Proceedings of the EUROBOT Conference 2008, Heidelberg, Germany, pp. 63–68 (2008)

  2. Breazeal, C.L.: Designing Sociable Robots. A Bradford Book, The MIT Press, Cambridge, MA (2002)

  3. Ekman, P., Friesen, W.: Facial Action Coding System. Consulting Psychologists Press (1978)

  4. Ekman, P., Friesen, W., Hager, J.: Facial Action Coding System, Research Nexus, Salt Lake City, Utah (2002)

  5. Russell, J.: Reading emotions from and into faces: resurrecting a dimensional–contextual perspective. In: Russell, J., Fernandez-Dols, J. (eds.) The Psychology of Facial Expression, pp. 295–320. Cambridge University Press, Cambridge (1997)

  6. Posner, J., Russell, J., Peterson, B.: The circumplex model of affect: an integrative approach to affective neuroscience, cognitive development, and psychopathology. Development and Psychopathology 17(3), 715–734 (2005)

  7. Niedenthal, P.M., Krauth-Gruber, S., Ric, F.: What Information Determines the Recognition of Emotion? In: The Psychology of Emotion: Interpersonal, Experiential, and Cognitive Approaches. Principles of Social Psychology series, pp. 136–144. Psychology Press, New York (2006)

  8. Izard, C.E.: The face of emotion. Appleton-Century-Crofts, New York (1971)

  9. Ekman, P.: Universals and cultural differences in facial expressions of emotion. In: Cole, J.K. (ed.) Nebraska Symposium on Motivation, vol. 19, pp. 207–283. University of Nebraska Press, Lincoln (1972)

  10. Carroll, J.M., Russell, J.A.: Do facial expressions signal specific emotions? Judging emotion from the face in context. Journal of Personality and Social Psychology 70, 205–218 (1996)

  11. de Gelder, B., Vroomen, J.: The perception of emotions by ear and by eye. Cognition & Emotion 14, 289–311 (2000)

  12. Hong, P., Wen, Z., Huang, T.: Real-time speech driven expressive synthetic talking faces using neural networks. IEEE Transactions on Neural Networks (April 2002)

  13. Noël, S., Dumoulin, S., Lindgaard, G.: Interpreting Human and Avatar Facial Expressions. In: Gross, T., Gulliksen, J., Kotzé, P., Oestreicher, L., Palanque, P., Prates, R.O., Winckler, M. (eds.) INTERACT 2009. LNCS, vol. 5726, pp. 98–110. Springer, Heidelberg (2009)

  14. Creed, C., Beale, R.: Psychological Responses to Simulated Displays of Mismatched Emotional Expressions. Interacting with Computers 20(2), 225–239 (2008)

  15. Mower, E., Lee, S., Matarić, M.J., Narayanan, S.: Human perception of synthetic character emotions in the presence of conflicting and congruent vocal and facial expressions. In: IEEE Int. Conf. Acoustics, Speech, and Signal Processing (ICASSP 2008), Las Vegas, NV, pp. 2201–2204 (2008)

  16. Mower, E., Matarić, M.J., Narayanan, S.: Human perception of audio-visual synthetic character emotion expression in the presence of ambiguous and conflicting information. IEEE Transactions on Multimedia 11(5) (2009)

  17. Zhang, J., Sharkey, A.J.C.: Contextual Recognition of Robot Emotions. In: Groß, R., Alboul, L., Melhuish, C., Witkowski, M., Prescott, T.J., Penders, J. (eds.) TAROS 2011. LNCS (LNAI), vol. 6856, pp. 78–89. Springer, Heidelberg (2011)

  18. Bradley, M.M., Lang, P.J.: The International Affective Picture System (IAPS) in the Study of Emotion and Attention. In: Coan, J.A., Allen, J.B. (eds.) The Handbook of Emotion Elicitation and Assessment, pp. 29–46 (2007)

  19. Eich, E., Ng, J.T.W., Macaulay, D., Percy, A.D., Grebneva, I.: Combining music with thought to change mood. In: Coan, J.A., Allen, J.B. (eds.) The Handbook of Emotion Elicitation and Assessment, pp. 124–136. Oxford University Press, New York (2007)

  20. Becker-Asano, C., Ishiguro, H.: Evaluating facial displays of emotion for the android robot Geminoid F. In: IEEE SSCI Workshop on Affective Computational Intelligence, pp. 22–29 (2011)

Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhang, J., Sharkey, A.J.C. (2011). Listening to Sad Music While Seeing a Happy Robot Face. In: Mutlu, B., Bartneck, C., Ham, J., Evers, V., Kanda, T. (eds) Social Robotics. ICSR 2011. Lecture Notes in Computer Science (LNAI), vol 7072. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-25504-5_18

  • DOI: https://doi.org/10.1007/978-3-642-25504-5_18

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-25503-8

  • Online ISBN: 978-3-642-25504-5

  • eBook Packages: Computer Science, Computer Science (R0)
