EmoStim: A Database of Emotional Film Clips With Discrete and Componential Assessment

Published: 31 October 2023

Abstract

Emotion elicitation with emotional film clips is one of the most common and ecologically valid methods in Affective Computing. However, selecting and validating materials that evoke a range of emotions is challenging. Here, we present EmoStim, a database of emotional film clips that serves as a film library with rich and varied content. EmoStim is designed for researchers studying emotions under either discrete or componential models of emotion. To create the database, 139 film clips were selected from the literature and annotated by 638 participants through the CrowdFlower platform. From these, we selected 99 film clips whose distributions of subjective ratings effectively distinguished between the emotions defined by the discrete model. We show that the selected clips reliably induce a range of specific emotions according to the discrete model. Further, we describe relationships between emotions, the organization of emotions in the componential space, and the underlying dimensions representing emotional experience. The EmoStim database and participant annotations are freely available for research purposes. The database can further enrich our understanding of emotions and serve as a guide for selecting or creating additional elicitation materials.
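To illustrate the kind of screening the selection step describes, the sketch below keeps only clips whose discrete-emotion rating distributions clearly separate the intended emotion from the others. It is a minimal, hypothetical Python example: the column names (clip_id, target_emotion, per-emotion rating columns), the rating scale, and the margin criterion are assumptions for illustration, not the authors' actual selection procedure or the structure of the released EmoStim files.

# Hypothetical sketch: screen film clips by how well their discrete-emotion
# ratings separate the intended target emotion from all other emotions.
# Column names and the margin rule are illustrative assumptions only.
import pandas as pd

EMOTIONS = ["joy", "sadness", "fear", "anger", "disgust", "surprise"]

def select_clips(ratings: pd.DataFrame, margin: float = 0.5) -> pd.DataFrame:
    """Keep clips whose mean rating for the intended emotion exceeds the
    highest mean rating of any other emotion by at least `margin`."""
    # Average each emotion's ratings per clip across participants.
    means = (ratings.groupby(["clip_id", "target_emotion"])[EMOTIONS]
                    .mean()
                    .reset_index())

    def discriminates(row) -> bool:
        target = row["target_emotion"]
        others = [e for e in EMOTIONS if e != target]
        # Keep the clip only if its intended emotion is rated at least
        # `margin` points higher, on average, than every other emotion.
        return row[target] >= max(row[o] for o in others) + margin

    return means[means.apply(discriminates, axis=1)]

# Example usage with toy ratings (two clips, three raters each):
if __name__ == "__main__":
    toy = pd.DataFrame({
        "clip_id": [1, 1, 1, 2, 2, 2],
        "target_emotion": ["fear"] * 3 + ["joy"] * 3,
        "joy":      [0.2, 0.1, 0.3, 4.5, 4.0, 4.2],
        "sadness":  [1.0, 0.8, 1.2, 0.3, 0.2, 0.4],
        "fear":     [4.1, 4.4, 3.9, 0.1, 0.2, 0.1],
        "anger":    [0.5, 0.6, 0.4, 0.1, 0.0, 0.2],
        "disgust":  [0.7, 0.5, 0.9, 0.0, 0.1, 0.1],
        "surprise": [2.0, 1.8, 2.2, 1.1, 1.3, 0.9],
    })
    print(select_clips(toy))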

Published In

IEEE Transactions on Affective Computing, Volume 15, Issue 3
July-Sept. 2024
1087 pages

Publisher

IEEE Computer Society Press

Washington, DC, United States
