DOI: 10.1145/3290607.3313062

Perception of Emotion in Body Expressions from Gaze Behavior

Published: 02 May 2019

Editorial Notes

A corrigendum was issued for this paper on November 25, 2019. You can download the corrigendum from the source materials section of this citation page.

Abstract

Developing affectively aware technologies is a growing industry. To build them effectively, it is important to understand the features involved in discriminating between emotions. While many technologies focus on facial expressions, studies have highlighted the influence of body expressions over other modalities for perceiving some emotions. Eye tracking studies have evaluated the combination of face and body to investigate the influence of each modality; however, few to none have investigated the perception of emotion from body expressions alone. This exploratory study aimed to evaluate the discriminative importance of dynamic body features for decoding emotion. Eye tracking was used to monitor participants' gaze behavior while they viewed clips of non-acted body movements and associated an emotion with each clip. Preliminary results indicate that the two primary regions attended to most often and for the longest durations were the torso and the arms. Further analysis is ongoing; however, initial results independently confirm findings from prior studies that did not use eye tracking.
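The abstract describes ranking body regions by how often and how long they were fixated. Below is a minimal sketch of that kind of area-of-interest (AOI) analysis, aggregating fixation count and total dwell time per region; the AOI labels, record layout, and sample values are assumptions for illustration, not the authors' actual data or analysis pipeline.

# Illustrative AOI aggregation sketch (assumed labels and values, not the study's data)
from collections import defaultdict

# Hypothetical fixation records: (body-region AOI, fixation duration in ms)
fixations = [
    ("torso", 310), ("arms", 220), ("torso", 450),
    ("head", 180), ("arms", 260), ("legs", 90),
]

counts = defaultdict(int)      # how often each region was fixated
dwell_ms = defaultdict(float)  # total time spent fixating each region

for aoi, duration in fixations:
    counts[aoi] += 1
    dwell_ms[aoi] += duration

# Rank regions by total dwell time, longest first
for aoi in sorted(dwell_ms, key=dwell_ms.get, reverse=True):
    print(f"{aoi}: {counts[aoi]} fixations, {dwell_ms[aoi]:.0f} ms total dwell")

With the toy values above, the torso and arms rank highest, mirroring the regions the study reports as most attended.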

Supplementary Material

PDF File (LBW2416-corrigendum.pdf)
Corrigendum to "Perception of Emotion in Body Expressions from Gaze Behavior," by Kleinsmith and Semsar, CHI EA '19 Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems
MP4 File (lbw2416p.mp4)
Preview video


Cited By

  • Exploring the Untapped Potential of Neuromarketing in Online Learning: Implications and Challenges for the Higher Education Sector in Europe. Behavioral Sciences 14(2), 80. DOI: 10.3390/bs14020080. Online publication date: 23-Jan-2024.
  • Emotion recognition method using millimetre wave radar based on deep learning. IET Radar, Sonar & Navigation 16(11), 1796-1808. DOI: 10.1049/rsn2.12297. Online publication date: 22-Jul-2022.
  • Emotional Map: Building a Data Tool for Geolocation-related Product Design. International Symposium on Affective Science and Engineering, ISASE2020, 1-4. DOI: 10.5057/isase.2020-C000034. Online publication date: 2020.
  • FrownOnError: Interrupting Responses from Smart Speakers by Facial Expressions. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1-14. DOI: 10.1145/3313831.3376810. Online publication date: 21-Apr-2020.


    Published In

    CHI EA '19: Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems
    May 2019
    3673 pages
    ISBN:9781450359719
    DOI:10.1145/3290607
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 02 May 2019

    Author Tags

    1. body expressions
    2. emotion
    3. eye tracking

    Qualifiers

    • Abstract

    Conference

    CHI '19

    Acceptance Rates

    Overall Acceptance Rate 6,164 of 23,696 submissions, 26%

    Upcoming Conference

    CHI '25
    CHI Conference on Human Factors in Computing Systems
    April 26 - May 1, 2025
    Yokohama , Japan

    Contributors

    Other Metrics

    Bibliometrics & Citations

    Bibliometrics

    Article Metrics

    • Downloads (Last 12 months)22
    • Downloads (Last 6 weeks)1
    Reflects downloads up to 19 Nov 2024

    Other Metrics

    Citations

    Cited By

    View all
    • (2024)Exploring the Untapped Potential of Neuromarketing in Online Learning: Implications and Challenges for the Higher Education Sector in EuropeBehavioral Sciences10.3390/bs1402008014:2(80)Online publication date: 23-Jan-2024
    • (2022)Emotion recognition method using millimetre wave radar based on deep learningIET Radar, Sonar & Navigation10.1049/rsn2.1229716:11(1796-1808)Online publication date: 22-Jul-2022
    • (2020)Emotional Map : Building a Data Tool for Geolocation-related Product DesignInternational Symposium on Affective Science and Engineering10.5057/isase.2020-C000034ISASE2020(1-4)Online publication date: 2020
    • (2020)FrownOnError: Interrupting Responses from Smart Speakers by Facial ExpressionsProceedings of the 2020 CHI Conference on Human Factors in Computing Systems10.1145/3313831.3376810(1-14)Online publication date: 21-Apr-2020
