Abstract
Human emotion detection is of substantial importance in a variety of pervasive applications in assistive environments. Because facial expressions provide a key mechanism for understanding and conveying emotion, automatic emotion detection through facial expression recognition has attracted increasing attention in both scientific research and practical applications in recent years. Traditional facial expression recognition methods normally use only one type of facial expression data, either static data extracted from a single face image or motion-dependent data obtained from dynamic face image sequences, but seldom employ both. This work places the emotion detection problem under the framework of Discriminant Laplacian Embedding (DLE) to integrate these two types of facial expression data in a shared subspace, so that the advantages of both are exploited. Because the two types of facial features reinforce each other, the new data representation is more discriminative and easier to classify. Encouraging experimental results in empirical studies demonstrate the practical utility of the proposed DLE method for emotion detection.
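For readers unfamiliar with Laplacian-based embeddings, the sketch below illustrates the basic graph-embedding machinery (in the spirit of Laplacian Eigenmaps) that DLE builds on. It is a simplified, assumed illustration only: the function name laplacian_embedding and all parameters are hypothetical, and the authors' DLE additionally couples the static and motion-dependent feature graphs with label information in a shared subspace, which is not shown here.

```python
# Minimal sketch of a Laplacian-eigenmap-style embedding.
# NOT the authors' DLE formulation; shown only to illustrate the
# graph-Laplacian machinery that DLE extends.
import numpy as np

def laplacian_embedding(X, n_components=2, sigma=1.0, k=5):
    """Embed rows of X (n_samples x n_features) via the graph Laplacian
    of a k-nearest-neighbour affinity graph."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances between samples.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    np.maximum(d2, 0.0, out=d2)  # guard against tiny negative values
    # Gaussian (heat-kernel) affinities, no self-affinity.
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Keep only the k largest affinities per row, then symmetrize.
    drop = np.argsort(W, axis=1)[:, :-k]
    for i in range(n):
        W[i, drop[i]] = 0.0
    W = np.maximum(W, W.T)
    # Symmetric normalized Laplacian: L = I - D^{-1/2} W D^{-1/2}.
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    L = np.eye(n) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    # Smallest nontrivial eigenvectors give the embedding coordinates.
    vals, vecs = np.linalg.eigh(L)
    return d_inv_sqrt[:, None] * vecs[:, 1:n_components + 1]

# Toy usage: embed 40 random 50-dimensional feature vectors into 2-D.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 50))
    print(laplacian_embedding(X, n_components=2).shape)  # (40, 2)
```

The normalized Laplacian is used so that a standard symmetric eigensolver suffices, and the first (constant) eigenvector is skipped because it carries no discriminative information; a discriminant variant such as DLE further shapes this subspace with class labels and a second feature graph.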
Acknowledgments
This research is supported by NSF IIS-1015219, NSF CCF-0830780, NSF CCF-0939187, NSF CCF-0917274, NSF DMS-0915228, NSF CNS-0923494, UTA-REP.
Cite this article
Wang, H., Huang, H. & Makedon, F. Emotion Detection via Discriminant Laplacian Embedding. Univ Access Inf Soc 13, 23–31 (2014). https://doi.org/10.1007/s10209-013-0312-5