
Intelligent synapses for multi-task and transfer learning

25 Nov 2024 (modified: 17 Feb 2017) · ICLR 2017 · Readers: Everyone
Abstract: Deep learning has led to remarkable advances when applied to problems in which the data distribution does not change over the course of learning. In stark contrast, biological neural networks exhibit continual learning, solve a diversity of tasks simultaneously, and have no clear separation between training and evaluation phases. Furthermore, synapses in biological neurons are not simply real-valued scalars, but possess complex molecular machinery that enables non-trivial learning dynamics. In this study, we take a first step toward bringing this biological complexity into artificial neural networks. We introduce intelligent synapses that are capable of accumulating information over time, and exploiting this information to efficiently protect old memories from being overwritten as new problems are learned. We apply our framework to learning sequences of related classification problems, and show that it dramatically reduces catastrophic forgetting while maintaining computational efficiency.
TL;DR: Learn sequences of tasks in a unified network by preventing important weights from changing.
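The sketch below illustrates the general idea described in the abstract and TL;DR: each parameter accumulates an importance estimate over the course of training, and a quadratic penalty keeps important parameters close to their values from previous tasks so new learning does not overwrite them. This is a minimal, hedged reconstruction assuming an online, path-integral-style importance measure and a surrogate quadratic loss; the class name, the hyperparameters `c` and `xi`, and the `task_loss_grad` callback are illustrative assumptions, not the authors' code or API.

```python
# Hedged sketch (not the authors' implementation): protecting important weights
# when training on a sequence of tasks. Importance omega is accumulated online
# from how much each parameter's updates contributed to reducing the task loss,
# and a quadratic penalty anchors important parameters near their post-task values.
import numpy as np

class ProtectedParams:
    def __init__(self, theta, c=0.1, xi=1e-3):
        self.theta = theta.copy()           # current parameters
        self.theta_star = theta.copy()      # anchor: values after the previous task
        self.omega = np.zeros_like(theta)   # accumulated per-parameter importance
        self.w_acc = np.zeros_like(theta)   # per-parameter contribution within the current task
        self.c, self.xi = c, xi             # penalty strength and damping term (assumed names)

    def penalty_grad(self):
        # Gradient of the surrogate loss c * sum(omega * (theta - theta_star)^2).
        return 2.0 * self.c * self.omega * (self.theta - self.theta_star)

    def step(self, task_loss_grad, lr=1e-2):
        # One SGD step on (task loss + consolidation penalty), while accumulating
        # each parameter's contribution to the decrease of the task loss.
        g_task = task_loss_grad(self.theta)           # gradient of the current task loss
        delta = -lr * (g_task + self.penalty_grad())  # parameter update
        self.w_acc += -g_task * delta                 # grad times change, summed along the trajectory
        self.theta = self.theta + delta

    def end_task(self):
        # Convert accumulated contributions into importance weights,
        # then re-anchor the parameters before the next task begins.
        drift = self.theta - self.theta_star
        self.omega += self.w_acc / (drift ** 2 + self.xi)
        self.w_acc[:] = 0.0
        self.theta_star = self.theta.copy()
```

A training loop under these assumptions would call `step` on each mini-batch of the current task and `end_task` when switching to the next task, so that weights found important so far are penalized for moving while unimportant weights remain free to learn the new problem.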
Conflicts: stanford.edu, google.com
