
Exploring the Effect of Motion Type and Emotions on the Perception of Gender in Virtual Humans

Published: 21 July 2015

Abstract

In this article, we investigate the perception of gender from the motion of virtual humans under different emotional conditions and explore the effect of emotional bias on gender perception (e.g., anger being attributed to males more than females). As motion types can present different levels of physiological cues, we also explore how two types of motion (walking and conversations) are affected by emotional bias. Walking typically displays more physiological cues about gender (e.g., hip sway) and is therefore expected to be less affected by emotional bias. To investigate these effects, we used a corpus of captured facial and body motions from four male and four female actors, performing basic emotions through conversations and walking. We expected that the appearance of the model would also influence gender perception; therefore, we displayed both male and female motions on two virtual models of different sex. Two experiments were then conducted to assess gender judgments from these motions. In both experiments, participants were asked to rate how male or female they considered the motions to be under different emotional states, and then classified the emotions to determine how accurately they were portrayed by the actors. Overall, both experiments showed that gender ratings were affected by the displayed emotion. However, we found that conversations were influenced by gender stereotypes to a greater extent than walking motions. This was particularly true for anger, which was perceived as male on both male and female motions, and sadness, which was perceived as less male when portrayed by male actors. We also found a slight effect of the virtual model itself, with gender ratings varying depending on which model displayed the motions. These results have implications for the design and animation of virtual humans.

Supplementary Material

zibrek (zibrek.zip)
Supplemental movie and image files for this article.





    Published In

    ACM Transactions on Applied Perception, Volume 12, Issue 3
    July 2015
    92 pages
    ISSN: 1544-3558
    EISSN: 1544-3965
    DOI: 10.1145/2798084
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 21 July 2015
    Accepted: 01 April 2015
    Revised: 01 April 2015
    Received: 01 November 2014
    Published in TAP Volume 12, Issue 3


    Author Tags

    1. Facial animation
    2. emotions
    3. gender

    Qualifiers

    • Research-article
    • Research
    • Refereed

    Funding Sources

    • Captavatar
    • Science Foundation Ireland as part of the Cartoon Motion

    Bibliometrics

    Article Metrics

    • Downloads (Last 12 months): 55
    • Downloads (Last 6 weeks): 7
    Reflects downloads up to 16 Feb 2025


    Cited By

    • (2024) Brain mechanisms involved in the perception of emotional gait: A combined magnetoencephalography and virtual reality study. PLOS ONE 19(3), e0299103. DOI: 10.1371/journal.pone.0299103. Online publication date: 29-Mar-2024.
    • (2024) Exploring the challenges of avoiding collisions with virtual pedestrians using a dual-task paradigm in individuals with chronic moderate to severe traumatic brain injury. Journal of NeuroEngineering and Rehabilitation 21(1). DOI: 10.1186/s12984-024-01378-x. Online publication date: 16-May-2024.
    • (2024) Advanced Virtual Human Modeling with Metahumans: Focus on Genderless Characters. 2024 37th SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI), 1-5. DOI: 10.1109/SIBGRAPI62404.2024.10716267. Online publication date: 30-Sep-2024.
    • (2023) Revisiting Micro and Macro Expressions in Computer Graphics Characters. Proceedings of the 22nd Brazilian Symposium on Games and Digital Entertainment, 38-45. DOI: 10.1145/3631085.3631228. Online publication date: 6-Nov-2023.
    • (2023) Untangling the confounded relationship between perceiver sex and walker sex on the bistable perception of emotion displaying point-light walkers. Acta Psychologica 239, 103993. DOI: 10.1016/j.actpsy.2023.103993. Online publication date: Sep-2023.
    • (2023) Social emotional processes during the third wave of COVID-19: Results from a close replication study in a Turkish sample. International Journal of Psychology 58(5), 456-464. DOI: 10.1002/ijop.12921. Online publication date: 18-May-2023.
    • (2022) The Avatar’s Gist: How to Transfer Affective Components From Dynamic Walking to Static Body Postures. Frontiers in Neuroscience 16. DOI: 10.3389/fnins.2022.842433. Online publication date: 15-Jun-2022.
    • (2022) Towards Virtual Humans without Gender Stereotyped Visual Features. SIGGRAPH Asia 2022 Technical Communications, 1-4. DOI: 10.1145/3550340.3564232. Online publication date: 6-Dec-2022.
    • (2021) Perception of Motion Variations in Large-Scale Virtual Human Crowds. Proceedings of the 14th ACM SIGGRAPH Conference on Motion, Interaction and Games, 1-7. DOI: 10.1145/3487983.3488288. Online publication date: 10-Nov-2021.
    • (2021) Appearance. The Handbook on Socially Interactive Agents, 105-146. DOI: 10.1145/3477322.3477327. Online publication date: 10-Sep-2021.
