Continual learning baselines and strategies from popular papers, using Avalanche. We include EWC, SI, GEM, AGEM, LwF, iCaRL, GDumb, and other strategies.
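For illustration, a minimal usage sketch of one of these strategies (EWC on Split-MNIST) with Avalanche might look like the following. Treat the import paths as assumptions: strategy classes have moved between `avalanche.training.strategies` and `avalanche.training.supervised` across releases, and the hyperparameters shown are placeholders, not the repository's settings.

```python
# Illustrative sketch only: EWC on Split-MNIST with Avalanche.
# Import locations vary across Avalanche releases; adjust to the installed version.
import torch
from torch.nn import CrossEntropyLoss
from torch.optim import SGD

from avalanche.benchmarks.classic import SplitMNIST
from avalanche.models import SimpleMLP
from avalanche.training.supervised import EWC

benchmark = SplitMNIST(n_experiences=5)      # 5 class-incremental experiences
model = SimpleMLP(num_classes=10)
strategy = EWC(
    model,
    SGD(model.parameters(), lr=1e-3),
    CrossEntropyLoss(),
    ewc_lambda=0.4,                          # weight of the quadratic penalty (placeholder value)
    train_mb_size=128,
    train_epochs=1,
    eval_mb_size=128,
    device="cuda" if torch.cuda.is_available() else "cpu",
)

for experience in benchmark.train_stream:    # train on the tasks in sequence
    strategy.train(experience)
    strategy.eval(benchmark.test_stream)     # evaluate on the full test stream
```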
A brain-inspired version of generative replay for continual learning with deep neural networks (e.g., class-incremental learning on CIFAR-100; PyTorch code).
Continual Hyperparameter Selection Framework. Compares 11 state-of-the-art Lifelong Learning methods and 4 baselines. Official Codebase of "A continual learning survey: Defying forgetting in classification tasks." in IEEE TPAMI.
PyTorch implementation of a VAE-based generative classifier, as well as other class-incremental learning methods that do not store data (DGR, BI-R, EWC, SI, CWR, CWR+, AR1, the "labels trick", SLDA).
This project investigates various continual learning methods to mitigate catastrophic forgetting.
Implements regularization-based continual learning strategies that mitigate catastrophic forgetting by penalizing large parameter changes. Includes reproducible implementations of EWC, Synaptic Intelligence (SI), and Memory Aware Synapses (MAS) with experiment scripts and benchmark evaluations on Split-MNIST, Permuted-MNIST, and CIFAR-100.
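To make the shared idea concrete, here is a minimal PyTorch sketch of the quadratic penalty that EWC, SI, and MAS all add to the task loss. The `importance` and `anchor_params` dictionaries are hypothetical buffers maintained by the specific method: per-parameter importance weights (Fisher diagonal for EWC, path integral for SI, gradient magnitude for MAS) and a snapshot of the parameters after the previous task.

```python
# Sketch of the common regularization term: lambda * sum_k Omega_k * (theta_k - theta*_k)^2.
# `importance` (Omega) and `anchor_params` (theta*) are hypothetical buffers
# filled in by the specific method (EWC, SI, or MAS).
import torch
import torch.nn as nn

def quadratic_penalty(model: nn.Module,
                      importance: dict,
                      anchor_params: dict,
                      reg_strength: float) -> torch.Tensor:
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for name, param in model.named_parameters():
        if name in importance:
            # Penalize movement away from the anchor, weighted by importance.
            penalty = penalty + (importance[name] * (param - anchor_params[name]) ** 2).sum()
    return reg_strength * penalty

# Inside the training step the penalty is simply added to the task loss:
#   loss = criterion(model(x), y) + quadratic_penalty(model, omega, theta_star, 0.4)
#   loss.backward(); optimizer.step()
```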