Physiologically-Inspired Neural Circuits for the Recognition of Dynamic Faces

  • Conference paper
Artificial Neural Networks and Machine Learning – ICANN 2020 (ICANN 2020)

Abstract

Dynamic faces are essential for communication in humans and non-human primates. However, the exact neural circuits underlying their processing remain unclear. Building on previous models of the cortical processes involved in social recognition (of static faces and dynamic bodies), we propose two alternative neural models for the recognition of dynamic faces: (i) an example-based mechanism that encodes dynamic facial expressions as sequences of learned keyframes using a recurrent neural network (RNN), and (ii) a norm-based mechanism relying on neurons that represent the difference between the actual facial shape and the neutral facial pose. We tested both models on highly controlled monkey facial expressions, generated with a photo-realistic monkey avatar driven by motion-capture data from real monkeys. Both models account for the recognition of normal and temporally reversed facial expressions from videos. When tested with expression morphs and with expressions of reduced strength, however, the two models made quite different predictions: the norm-based model showed an almost linear variation of neuron activities with expression strength and with the morphing level for cross-expression morphs, whereas the example-based model did not generalize well to such stimuli. These predictions can easily be tested in electrophysiological experiments using the developed stimulus set.
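The norm-based mechanism described in the abstract can be illustrated with a minimal sketch. All names, vectors, and the threshold-linear output below are illustrative assumptions, not the paper's implementation: each model neuron is driven by the deviation of the current facial shape from the neutral pose, projected onto the neuron's preferred direction in face space, which yields the near-linear dependence on expression strength and morph level that the abstract predicts.

```python
# Hypothetical sketch of a norm-based face-tuned neuron (illustrative only):
# activity grows with the deviation of the current facial shape from the
# neutral pose, measured along the neuron's preferred direction.

def norm_based_response(shape, neutral, preferred):
    """Threshold-linear response to the deviation (shape - neutral)
    projected onto the neuron's preferred direction in face space."""
    deviation = [s - n for s, n in zip(shape, neutral)]
    drive = sum(d * p for d, p in zip(deviation, preferred))
    return max(0.0, drive)  # rectification: no negative firing rates

# Toy 3-D face-shape vectors (purely illustrative numbers).
neutral = [0.0, 0.0, 0.0]
full_expression = [1.0, 0.5, -0.2]
preferred = [1.0, 0.5, -0.2]  # neuron tuned to this expression's direction

# Scaling the expression (as in reduced-strength or morphed stimuli)
# scales the response linearly, matching the model's prediction.
for strength in (0.25, 0.5, 1.0):
    morph = [strength * v for v in full_expression]
    print(strength, norm_based_response(morph, neutral, preferred))
```

By contrast, an example-based mechanism would match the input against stored keyframe templates, so responses to morphs between learned expressions need not vary smoothly with the morph level.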



Acknowledgements

This work was supported by HFSP RGP0036/2016 and EC CogIMon H2020 ICT-23-2014/644727. It was also supported by BMBF FKZ 01GQ1704, BW-Stiftung NEU007/1 KONSENS-NHE, ERC 2019-SyG-RELEVANCE-856495, and Deutsche Forschungsgemeinschaft grant TH425/12-2. NVIDIA Corp.

Author information


Corresponding author

Correspondence to Michael Stettler.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Stettler, M. et al. (2020). Physiologically-Inspired Neural Circuits for the Recognition of Dynamic Faces. In: Farkaš, I., Masulli, P., Wermter, S. (eds.) Artificial Neural Networks and Machine Learning – ICANN 2020. Lecture Notes in Computer Science, vol. 12396. Springer, Cham. https://doi.org/10.1007/978-3-030-61609-0_14


  • DOI: https://doi.org/10.1007/978-3-030-61609-0_14


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-61608-3

  • Online ISBN: 978-3-030-61609-0

  • eBook Packages: Computer Science (R0)
