Abstract
Collective dynamics of neural populations underlie a variety of cognitive functions. How such dynamics are shaped through learning, and how learning performance depends on the properties of individual neurons, are fundamental questions in neuroscience. Previous model studies have addressed these questions by training recurrent neural networks with recently developed machine-learning techniques. These techniques, however, are not biologically plausible. Does a more biologically plausible learning method shape similar neural dynamics and a similar relation between learning performance and the properties of individual neurons to those observed in the previous studies? In this study, we use a recently proposed, more biologically plausible learning model with multiple timescales in the neural activity, and analyze the resulting neural dynamics and how learning performance depends on the sensitivity of the neurons. We find that our model shapes similar neural dynamics and reproduces a similar relation. Furthermore, the intermediate sensitivity that is optimal for learning speed generates a variety of neural activity patterns, in line with experimental observations of neural systems. This result suggests that neural systems may have tuned the sensitivity of their neural activities through evolution to optimize learning speed.
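The abstract does not specify the model equations, so the following is only a minimal illustrative sketch, in Python, of a rate network with two timescales per unit and a gain parameter standing in for the "sensitivity" of neural activities. The function name `simulate`, the parameters `beta`, `tau_fast`, and `tau_slow`, and the specific dynamics are assumptions for illustration, not the authors' model.

```python
import numpy as np

def simulate(N=100, steps=2000, beta=2.0, tau_fast=1.0, tau_slow=10.0,
             dt=0.1, seed=0):
    """Sketch of a two-timescale rate network (hypothetical, not the paper's model).

    beta sets the gain of the tanh nonlinearity and plays the role of the
    neurons' sensitivity; tau_fast and tau_slow set the two timescales.
    """
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))   # recurrent weights among fast units
    W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))   # coupling from slow to fast variables
    x = rng.normal(0.0, 0.1, N)                     # fast variables
    y = np.zeros(N)                                 # slow variables
    rates = np.empty((steps, N))
    for t in range(steps):
        r = np.tanh(beta * x)                       # firing rates with gain beta
        x += dt / tau_fast * (-x + J @ r + W @ np.tanh(beta * y))
        y += dt / tau_slow * (-y + x)               # slow variable tracks the fast one
        rates[t] = r
    return rates

# Example: compare activity for low, intermediate, and high sensitivity
for beta in (0.5, 2.0, 8.0):
    r = simulate(beta=beta)
    print(f"beta={beta}: rate std over time = {r.std():.3f}")
```

Under this sketch, sweeping `beta` is one simple way to probe how the sensitivity of individual units changes the variety of activity patterns, analogous in spirit to the analysis described in the abstract.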
Acknowledgments
We thank Kunihiko Kaneko for fruitful discussions on our manuscript. This work was partly supported by JSPS KAKENHI (no. 20H00123).