Alleviating catastrophic forgetting: context gating and synaptic stabilization (Masse et al. 2018)

Accession: 256370
"Artificial neural networks can suffer from catastrophic forgetting, in which learning a new task causes the network to forget how to perform previous tasks. While previous studies have proposed various methods that can alleviate forgetting over small numbers (<10) of tasks, it is uncertain whether they can prevent forgetting across larger numbers of tasks. In this study, we propose a neuroscience-inspired scheme, called “context-dependent gating,” in which mostly nonoverlapping sets of units are active for any one task. Importantly, context-dependent gating has a straightforward implementation, requires little extra computational overhead, and when combined with previous methods to stabilize connection weights, can allow networks to maintain high performance across large numbers of sequentially presented tasks."
Reference:
1. Masse NY, Grant GD, Freedman DJ (2018) Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization. Proc Natl Acad Sci U S A 115:E10467-E10475 [PubMed]
Model Information
Model Type: Connectionist Network;
Simulation Environment: Python (web link to model);
Model Concept(s): Learning; Reinforcement Learning;
Implementer(s): Masse, Nicolas Y [masse at uchicago.edu]; Grant, Gregory D [dfreedman at uchicago.edu];