DOI: 10.1145/3311350.3347197

Research article

Recognizing Emotional Expression in Game Streams

Published: 17 October 2019

Abstract

Gameplay is often an emotionally charged activity, particularly when streaming in front of a live audience. From a games user research perspective, it would be beneficial to automatically detect and recognize players' and streamers' emotional expression, as this data can be used for identifying gameplay highlights, computing emotion metrics, or selecting parts of the videos for further analysis, e.g., through assisted recall. We contribute the first automatic game stream emotion annotation system that combines neural network analysis of facial expressions, video transcript sentiment, voice emotion, and low-level audio features (pitch, loudness). Using human-annotated emotional expression data as the ground truth, we reach accuracies of up to 70.7%, on par with the inter-rater agreement of the human annotators. In detecting the five most intense events of each video, we reach a higher accuracy of 80.4%. Our system is particularly accurate in detecting clearly positive emotions such as amusement and excitement, but more limited with subtle emotions such as puzzlement.
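The low-level audio features named in the abstract (pitch, loudness) can be computed per frame from the stream's audio track. The sketch below is an illustrative stand-in, not the authors' exact pipeline: it uses frame RMS energy as a loudness proxy and the autocorrelation peak within a plausible F0 band as a pitch estimate; the function name `frame_features` and its parameters are hypothetical.

```python
import numpy as np

def frame_features(y, sr, frame_len=2048, hop=512):
    """Per-frame (pitch, loudness) estimates from mono audio.

    Illustrative only: loudness is frame RMS, pitch is the
    autocorrelation peak lag searched in a 50-500 Hz band.
    """
    feats = []
    for start in range(0, len(y) - frame_len + 1, hop):
        frame = y[start:start + frame_len]
        rms = np.sqrt(np.mean(frame ** 2))  # loudness proxy
        # autocorrelation; keep non-negative lags only
        ac = np.correlate(frame, frame, mode="full")[frame_len - 1:]
        lo, hi = sr // 500, sr // 50        # lags for 500 Hz .. 50 Hz
        lag = lo + int(np.argmax(ac[lo:hi]))
        feats.append((sr / lag, rms))       # (estimated F0 in Hz, RMS)
    return feats

# smoke test on a synthetic 200 Hz tone
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 200 * t)
f0, rms = frame_features(tone, sr)[0]       # f0 ≈ 200 Hz, rms ≈ 0.707
```

In a real pipeline these per-frame values would be aggregated over the same time windows as the facial-expression and transcript-sentiment signals before fusion.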

Supplementary Material

ZIP File (fp9886aux.zip)
An Excel file containing the provided dataset.
MP4 File (p301-roohi.mp4)


Cited By

  • (2022) An automated approach to estimate player experience in game events from psychophysiological data. Multimedia Tools and Applications 82(13), 19189–19220. DOI: 10.1007/s11042-022-13845-5. Published 15 Oct 2022.
  • (2021) To Be or Not to Be Stuck, or Is It a Continuum? Proceedings of the ACM on Human-Computer Interaction 5(CHI PLAY), 1–35. DOI: 10.1145/3474656. Published 6 Oct 2021.
  • (2021) Am I Playing Better Now? Proceedings of the ACM on Computer Graphics and Interactive Techniques 4(1), 1–17. DOI: 10.1145/3451269. Published 28 Apr 2021.
  • (2021) Finding epic moments in live content through deep learning on collective decisions. EPJ Data Science 10(1). DOI: 10.1140/epjds/s13688-021-00295-6. Published 18 Aug 2021.



Published In

CHI PLAY '19: Proceedings of the Annual Symposium on Computer-Human Interaction in Play
October 2019
680 pages
ISBN:9781450366885
DOI:10.1145/3311350

Publisher

Association for Computing Machinery

New York, NY, United States


Badges

  • Honorable Mention

Author Tags

  1. emotion
  2. facial expression
  3. games
  4. neural network
  5. player experience
  6. sentiment analysis

Qualifiers

  • Research-article

Conference

CHI PLAY '19

Acceptance Rates

CHI PLAY '19 paper acceptance rate: 51 of 181 submissions (28%).
Overall acceptance rate: 421 of 1,386 submissions (30%).

Article Metrics

  • Downloads (last 12 months): 62
  • Downloads (last 6 weeks): 7

Reflects downloads up to 14 Nov 2024
