Spontaneous vs. posed facial behavior: automatic analysis of brow actions

Published: 02 November 2006
DOI: 10.1145/1180995.1181031

Abstract

Past research on automatic facial expression analysis has focused mostly on the recognition of prototypic expressions of discrete emotions rather than on the analysis of dynamic changes over time, although the importance of temporal dynamics of facial expressions for interpretation of the observed facial behavior has been acknowledged for over 20 years. For instance, it has been shown that the temporal dynamics of spontaneous and volitional smiles are fundamentally different from each other. In this work, we argue that the same holds for the temporal dynamics of brow actions and show that velocity, duration, and order of occurrence of brow actions are highly relevant parameters for distinguishing posed from spontaneous brow actions. The proposed system for discrimination between volitional and spontaneous brow actions is based on automatic detection of Action Units (AUs) and their temporal segments (onset, apex, offset) produced by movements of the eyebrows. For each temporal segment of an activated AU, we compute a number of mid-level feature parameters including the maximal intensity, duration, and order of occurrence. We use Gentle Boost to select the most important of these parameters. The selected parameters are used further to train Relevance Vector Machines to determine per temporal segment of an activated AU whether the action was displayed spontaneously or volitionally. Finally, a probabilistic decision function determines the class (spontaneous or posed) for the entire brow action. When tested on 189 samples taken from three different sets of spontaneous and volitional facial data, we attain a 90.7% correct recognition rate.
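The final step of the pipeline, which turns per-segment classifier outputs into a single posed/spontaneous verdict, can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes each temporal segment's classifier (an RVM in the paper) emits a posterior probability that the segment was displayed spontaneously, and that the probabilistic decision function combines segments by summing log-odds under an independence assumption. The function name, the clamping constant, and the 0.5 decision threshold are all illustrative choices, not details taken from the paper.

```python
import math


def fuse_segment_probabilities(segment_probs, eps=1e-6):
    """Combine per-segment posteriors P(spontaneous) into one decision.

    Each entry in `segment_probs` is the probability, output by a
    per-segment classifier, that one temporal segment (onset, apex,
    or offset) of the brow action was displayed spontaneously.
    Summing log-odds treats the segments as independent pieces of
    evidence -- an assumption made here for illustration only.
    """
    log_odds = 0.0
    for p in segment_probs:
        # Clamp away from 0 and 1 so the log-odds stay finite.
        p = min(max(p, eps), 1.0 - eps)
        log_odds += math.log(p / (1.0 - p))
    p_spontaneous = 1.0 / (1.0 + math.exp(-log_odds))
    label = "spontaneous" if p_spontaneous >= 0.5 else "posed"
    return label, p_spontaneous
```

For example, three segments judged spontaneous with probabilities 0.8, 0.7, and 0.6 fuse into a confident "spontaneous" verdict, while segments leaning the other way fuse into "posed"; weak per-segment evidence accumulates in the log-odds sum.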




Published In

ICMI '06: Proceedings of the 8th international conference on Multimodal interfaces
November 2006
404 pages
ISBN: 159593541X
DOI: 10.1145/1180995
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. automatic facial expression analysis
  2. temporal dynamics


Conference

ICMI06

Acceptance Rates

Overall Acceptance Rate: 453 of 1,080 submissions, 42%

Contributors

Other Metrics

Bibliometrics & Citations

Bibliometrics

Article Metrics

  • Downloads (Last 12 months)11
  • Downloads (Last 6 weeks)1
Reflects downloads up to 11 Jan 2025

Other Metrics

Citations

Cited By

  • (2024) Three‐dimensional video recordings: Accuracy, reliability, clinical and research guidelines – Reliability assessment of a 4D camera. Orthodontics & Craniofacial Research. DOI: 10.1111/ocr.12808. Online publication date: 15-May-2024.
  • (2024) Smile analysis in dentistry and orthodontics – a review. Journal of the Royal Society of New Zealand, 55(1), 192-205. DOI: 10.1080/03036758.2024.2316226. Online publication date: 19-Feb-2024.
  • (2024) Facial Expressions Based on the Types of Conversation Contents. The Review of Socionetwork Strategies, 18(2), 449-489. DOI: 10.1007/s12626-024-00177-z. Online publication date: 16-Nov-2024.
  • (2022) "I Didn't Know I Looked Angry": Characterizing Observed Emotion and Reported Affect at Work. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3491102.3517453. Online publication date: 29-Apr-2022.
  • (2022) Continual Learning for Adaptive Affective Human-Robot Interaction. 2022 10th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), 1-5. DOI: 10.1109/ACIIW57231.2022.10086015. Online publication date: 18-Oct-2022.
  • (2022) Detection & recognition of veiled and unveiled human face on the basis of eyes using transfer learning. Multimedia Tools and Applications, 82(3), 4257-4287. DOI: 10.1007/s11042-022-13402-0. Online publication date: 25-Jul-2022.
  • (2022) Subject-dependent selection of geometrical features for spontaneous emotion recognition. Multimedia Tools and Applications, 82(2), 2635-2661. DOI: 10.1007/s11042-022-13380-3. Online publication date: 1-Jul-2022.
  • (2021) Development and validation of the Interoceptive States Static Images (ISSI) database. Behavior Research Methods, 54(4), 1744-1765. DOI: 10.3758/s13428-021-01706-2. Online publication date: 14-Oct-2021.
  • (2021) Detection of Genuine and Posed Facial Expressions of Emotion: Databases and Methods. Frontiers in Psychology, 11. DOI: 10.3389/fpsyg.2020.580287. Online publication date: 15-Jan-2021.
  • (2021) Dynamics of facial actions for assessing smile genuineness. PLOS ONE, 16(1), e0244647. DOI: 10.1371/journal.pone.0244647. Online publication date: 5-Jan-2021.
