Research Article | Open Access
DOI: 10.1145/3529399.3529441

Sparse Distributed Memory for Binary Sparse Distributed Representations

Published: 10 June 2022

Abstract

Sparse Distributed Memory (SDM) and Binary Distributed Representations (BDR), two phenomenological approaches to modelling biological memory, share many features. It is therefore natural to integrate them into a hybrid semantic storage model, with SDM serving as the low-level (brain-cell) clean-up memory and BDR as the high-level symbolic information coder. Such a hybrid semantic store must be able to memorize holistic data (e.g., structures of interconnected, serialized key-value pairs) in a neural network. The idea has been proposed several times since the 1990s, but the earlier models are impractical because of insufficient scalability and/or low storage density. The gap between SDM and BDR can be bridged using results from a third theory of sparse signals: Compressive Sensing, also known as Compressive Sampling (CS). We present such a hybrid semantic storage model and call it CS-SDM, reflecting its use of a new CS-based SDM design as the clean-up memory for a Binary Sparse Distributed Representation (BSDR) of the holistic data. CS-SDM has been implemented on GPU and demonstrates much better capacity and denoising capability than classical SDM designs.
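
To make the CS-based clean-up step concrete, here is a minimal sketch, assuming Python with NumPy/SciPy rather than the paper's GPU implementation: a k-sparse binary codevector is stored as compressed linear measurements and recovered by L1 minimization (basis pursuit), one of the standard CS decoders. All dimensions and names below are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Toy illustration (not the authors' CS-SDM code): store a k-sparse binary
# codevector z (a BSDR) as compressed measurements y = Phi @ z, then recover
# it with basis pursuit, a classical compressive-sensing decoder.
rng = np.random.default_rng(0)
n, m, k = 256, 64, 6                            # code length, measurements, sparsity

z = np.zeros(n)
z[rng.choice(n, size=k, replace=False)] = 1.0   # random k-sparse binary code

Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # random measurement matrix
y = Phi @ z                                     # compressed "memory trace"

# Basis pursuit with nonnegativity: minimize 1^T z subject to Phi z = y,
# z >= 0 (binary codes are nonnegative, which tightens the problem).
res = linprog(c=np.ones(n), A_eq=Phi, b_eq=y,
              bounds=[(0, None)] * n, method="highs")
z_hat = (res.x > 0.5).astype(int)               # threshold back to binary

print("exact recovery:", np.array_equal(z_hat, z.astype(int)))
```

With m on the order of k·log(n/k) measurements, recovery of such a sparse vector is exact with high probability; greedy decoders such as CoSaMP or matching pursuit are the usual faster alternatives for a GPU implementation.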


Cited By

  • (2024) Shift-Equivariant Similarity-Preserving Hypervector Representations of Sequences. Cognitive Computation 16(3), 909–923. DOI: 10.1007/s12559-024-10258-4. Online publication date: 12-Mar-2024.
  • (2022) Parallel Implementation of Sparse Distributed Memory for Semantic Storage. Cybernetics and Computer Technologies, 58–66. DOI: 10.34229/2707-451X.22.2.6. Online publication date: 5-Oct-2022.


Information

Published In

ICMLT '22: Proceedings of the 2022 7th International Conference on Machine Learning Technologies
March 2022, 291 pages
ISBN: 9781450395748
DOI: 10.1145/3529399
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. Binary Sparse Distributed Representations
  2. Compressive Sampling
  3. Compressive Sensing
  4. GPU
  5. Sparse Distributed Memory
  6. Vector Symbolic Architecture
  7. associative memory
  8. neural networks

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

ICMLT 2022
