DOI: 10.1145/3477145.3477155 · ICONS Conference Proceedings · Short paper

Neko: a Library for Exploring Neuromorphic Learning Rules

Published: 13 October 2021

Abstract

The field of neuromorphic computing is in a period of active exploration. While many tools have been developed to simulate neuronal dynamics or convert deep networks to spiking models, general software libraries for learning rules remain underexplored. This is partly due to the diverse, challenging nature of efforts to design new learning rules, which range from encoding methods to gradient approximations, and from population approaches that mimic the Bayesian brain to constrained learning algorithms deployed on memristor crossbars. To address this gap, we present Neko, a modular, extensible library focused on aiding the design of new learning algorithms. We demonstrate the utility of Neko in three exemplar cases: online local learning, probabilistic learning, and analog on-device learning. Our results show that Neko can replicate state-of-the-art algorithms and, in one case, significantly outperform them in accuracy and speed. Further, it offers tools, including gradient comparison, that can help in developing new algorithmic variants. Neko is an open-source Python library that supports PyTorch and TensorFlow backends.
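The gradient-comparison idea mentioned above can be illustrated with a small, self-contained sketch (this is not Neko's actual API; the toy network, the fixed random feedback matrix, and the cosine metric are illustrative assumptions): an approximate learning rule — here a feedback-alignment-style estimate that replaces the transposed forward weights with a fixed random matrix — is compared against the exact backpropagation gradient via cosine similarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny two-layer linear network: y = W2 @ (W1 @ x), squared-error loss.
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))
B = rng.normal(size=(4, 2))    # fixed random feedback matrix (feedback alignment)

x = rng.normal(size=(3,))      # input
t = rng.normal(size=(2,))      # target

h = W1 @ x                     # hidden activations
y = W2 @ h                     # network output
e = y - t                      # output error

# Exact backprop gradient for W1 vs. the feedback-alignment approximation,
# which routes the error through B instead of W2.T.
grad_true = np.outer(W2.T @ e, x)
grad_fa = np.outer(B @ e, x)

def cosine(a, b):
    """Cosine similarity between two gradient tensors (flattened)."""
    return float(np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(f"alignment with true gradient: {cosine(grad_true, grad_fa):+.3f}")
```

In a library setting, the same cosine diagnostic can be logged per layer and per training step to track how well an approximate rule's updates align with backpropagation over the course of learning.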


Cited By

  • (2024) Spike frequency adaptation: bridging neural models and neuromorphic applications. Communications Engineering 3:1. https://doi.org/10.1038/s44172-024-00165-9 (1 Feb 2024)
  • (2023) Easy and efficient spike-based Machine Learning with mlGeNN. Neuro-Inspired Computational Elements Conference, 115–120. https://doi.org/10.1145/3584954.3585001 (12 Apr 2023)




Published In

ICONS 2021: International Conference on Neuromorphic Systems 2021
July 2021
198 pages
ISBN:9781450386913
DOI:10.1145/3477145

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Bayesian inference
  2. Manhattan rule
  3. Neuromorphic computing
  4. approximate gradients
  5. learning rules
  6. open-source library

Qualifiers

  • Short paper
  • Research
  • Refereed limited

Conference

ICONS 2021

Acceptance Rates

Overall acceptance rate: 13 of 22 submissions (59%)

Article Metrics

  • Downloads (last 12 months): 25
  • Downloads (last 6 weeks): 4

Reflects downloads up to 13 Nov 2024.
