Models that contain the Model Concept: Attractor Neural Network

(A recurrently connected neural network with iterative activation updates. A noisy initial pattern of neural activity converges to one of a set of template patterns stored in the network's weighted connections. Because of this property it is also a model of Associative Memory. The Continuous Attractor Network uses continuous activation values (~firing rates), while the Binary Attractor Network uses {0,1} or {-1,1} as possible neuronal activations.)
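The binary variant can be illustrated with a minimal Hopfield-style network in Python/NumPy. This is a generic textbook sketch, not any particular model from the list below; the sizes, patterns, and update schedule are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 3  # neurons and stored template patterns (illustrative sizes)

# Store P random {-1,+1} patterns with the Hebbian outer-product rule.
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0)  # no self-connections

def recall(x, steps=20):
    """Iteratively update activations until the state settles on an attractor."""
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1  # break ties consistently
    return x

# Corrupt a stored pattern by flipping 10% of its units, then let it converge.
noisy = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
noisy[flip] *= -1
recovered = recall(noisy)
overlap = (recovered @ patterns[0]) / N  # 1.0 means perfect recall
```

Well below the storage capacity (~0.14N patterns), a noisy initial state reliably falls into the basin of the nearest stored template.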
1.  Active dendritic integration in robust and precise grid cell firing (Schmidt-Hieber et al 2017)
"... Whether active dendrites contribute to the generation of the dual temporal and rate codes characteristic of grid cell output is unknown. We show that dendrites of medial entorhinal cortex neurons are highly excitable and exhibit a supralinear input–output function in vitro, while in vivo recordings reveal membrane potential signatures consistent with recruitment of active dendritic conductances. By incorporating these nonlinear dynamics into grid cell models, we show that they can sharpen the precision of the temporal code and enhance the robustness of the rate code, thereby supporting a stable, accurate representation of space under varying environmental conditions. Our results suggest that active dendrites may therefore constitute a key cellular mechanism for ensuring reliable spatial navigation."
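The role of supralinear dendritic integration can be caricatured in a toy two-branch neuron: each branch applies an expansive nonlinearity to its summed input before the soma adds the branch outputs. The quadratic nonlinearity here is an assumption for illustration only, not the conductance-based mechanism studied in the paper.

```python
# Toy two-branch neuron: each dendritic branch applies a supralinear
# (here quadratic) nonlinearity to its summed synaptic input, and the
# soma sums the branch outputs. Purely illustrative.
def somatic_response(branch_inputs, nonlinearity=lambda x: x ** 2):
    return sum(nonlinearity(sum(b)) for b in branch_inputs)

# Four unit-strength synapses: clustered on one branch vs. split 2+2.
clustered = somatic_response([[1, 1, 1, 1], []])    # 4**2 + 0 = 16
distributed = somatic_response([[1, 1], [1, 1]])    # 2**2 + 2**2 = 8
```

With a supralinear branch nonlinearity, spatially clustered input produces a larger somatic response than the same input distributed across branches, which is the qualitative effect such dendrites can contribute to sharpening grid cell output.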
2.  An attractor network model of grid cells and theta-nested gamma oscillations (Pastoll et al 2013)
A two-population spiking continuous attractor model of grid cells. The model combines attractor dynamics with theta-nested gamma oscillatory activity, reproducing the behavioural response of grid cells (grid fields) in medial entorhinal cortex while at the same time allowing for nested gamma oscillations of post-synaptic currents.
3.  Bump Attractor Models: Delayed Response & Recognition Span - spatial condition (Ibanez et al 2019)
The archive contains examples of two spatial working memory tasks: the Delayed Response Task (DRT) or oculomotor task & the Delayed Recognition Span Task in the spatial condition (DRSTsp).
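The persistent-activity mechanism behind such delayed-response models can be sketched as a rate-based bump attractor on a ring: local excitation plus broad inhibition lets a transient cue leave behind a self-sustaining bump that stores the cued location. All gains, time constants, and the tanh rate nonlinearity below are arbitrary illustrative choices, not parameters of the published model.

```python
import numpy as np

N = 128
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)

# Translation-invariant kernel: local excitation minus broad inhibition.
# Gains (3.0, 1.0) are illustrative, chosen so a single bump self-sustains.
d = theta[:, None] - theta[None, :]
W = (3.0 * np.cos(d) - 1.0) * (2 * np.pi / N)

r = np.zeros(N)
dt_over_tau = 0.1

# Transient cue centred on theta = pi; wrapped angular distance via complex phase.
dist = np.angle(np.exp(1j * (theta - np.pi)))
cue = 2.0 * np.exp(-dist**2 / 0.5)

for t in range(3000):
    inp = cue if t < 200 else 0.0          # cue is removed after 20 tau
    drive = W @ r + inp
    r += dt_over_tau * (-r + np.tanh(np.maximum(drive, 0.0)))

peak = theta[np.argmax(r)]  # the bump persists near the cued location
```

After the cue is withdrawn, recurrent excitation maintains the bump across the delay period, which is the working-memory readout in both the DRT and DRSTsp settings.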
4.  Ca2+/HCN channel-dependent persistent activity in multiscale model of neocortex (Neymotin et al 2016)
"Neuronal persistent activity has been primarily assessed in terms of electrical mechanisms, without attention to the complex array of molecular events that also control cell excitability. We developed a multiscale neocortical model proceeding from the molecular to the network level to assess the contributions of calcium regulation of hyperpolarization-activated cyclic nucleotide-gated (HCN) channels in providing additional and complementary support of continuing activation in the network. ..."
5.  Cortex learning models (Weber et al. 2006, Weber and Triesch 2006, Weber and Wermter 2006/7)
A simulator and the configuration files for three publications are provided. First, "A hybrid generative and predictive model of the motor cortex" (Weber et al. 2006) uses reinforcement learning to set up a toy action scheme, then uses unsupervised learning to "copy" the learnt action, and an attractor network to predict the hidden code of the unsupervised network. Second, "A Self-Organizing Map of Sigma-Pi Units" (Weber and Wermter 2006/7) learns frame-of-reference transformations on population codes in an unsupervised manner. Third, "A possible representation of reward in the learning of saccades" (Weber and Triesch 2006) implements saccade learning with two possible learning schemes for horizontal and vertical saccades, respectively.
6.  Feedforward heteroassociative network with HH dynamics (Lytton 1998)
Using the original McCulloch-Pitts notion of simple on and off spike coding in lieu of rate coding, an Anderson-Kohonen artificial neural network (ANN) associative memory model was ported to a neuronal network with Hodgkin-Huxley dynamics.
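A minimal sketch of such a feedforward heteroassociative memory with on/off spike coding, using one-shot Hebbian storage with clipped (Willshaw-style) binary weights. The clipping rule and the pattern sizes are assumptions for illustration; they stand in for the Anderson-Kohonen ANN, not for the Hodgkin-Huxley port itself.

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out, P = 64, 32, 4  # illustrative sizes

# Sparse binary key and target patterns, using simple on/off (0/1) coding.
keys = (rng.random((P, n_in)) < 0.2).astype(int)
targets = (rng.random((P, n_out)) < 0.2).astype(int)

# One-shot Hebbian storage: a weight is switched on if input and output
# units were ever co-active, then clipped to {0, 1} (Willshaw-style).
W = np.clip(targets.T @ keys, 0, 1)

def recall(key):
    # An output unit fires only if it receives input from every active key unit.
    return (W @ key >= key.sum()).astype(int)

out = recall(keys[0])
```

With this threshold rule every unit of the stored target is guaranteed to fire for its key, and sparse coding keeps spurious extra activations rare.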
7.  Fixed point attractor (Hasselmo et al 1995)
"... In the model, cholinergic suppression of synaptic transmission at excitatory feedback synapses is shown to determine the extent to which activity depends upon new features of the afferent input versus components of previously stored representations. ..." See the paper for more details. The MATLAB script demonstrates the model of fixed point attractors mediated by excitatory feedback with subtractive inhibition in a continuous firing rate model.
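The mechanism, fixed points created by excitatory feedback held in check by subtractive inhibition, can be sketched in a continuous firing-rate model. The pattern layout, gains, and clipping nonlinearity below are illustrative assumptions, not the values in the published MATLAB script.

```python
import numpy as np

N = 50
p1 = np.zeros(N); p1[:10] = 1    # stored pattern 1 (10 active units)
p2 = np.zeros(N); p2[10:20] = 1  # stored pattern 2

# Excitatory feedback stores the patterns; inhibition is global and subtractive.
Je, h = 2.0, 0.05
W = Je * (np.outer(p1, p1) + np.outer(p2, p2)) / 10

r = np.zeros(N)
cue = 0.8 * np.concatenate([np.ones(7), np.zeros(N - 7)])  # partial cue: 7 of 10 units

dt_over_tau = 0.1
for t in range(500):
    inp = cue if t < 100 else 0.0
    drive = W @ r + inp - h * r.sum()        # subtractive global inhibition
    r += dt_over_tau * (-r + np.clip(drive, 0.0, 1.0))

# Activity settles on the fixed point for pattern 1, completing the partial cue.
```

Scaling the feedback gain Je down mimics cholinergic suppression of the feedback synapses: with weaker feedback the fixed point tracks the afferent input rather than the stored representation.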
8.  Fronto-parietal visuospatial WM model with HH cells (Edin et al 2007)
1) J Cogn Neurosci: three structural mechanisms hypothesized to underlie vsWM development during childhood were evaluated by simulating the model and comparing the results to fMRI. It was concluded that inter-regional synaptic connection strength causes vsWM development. 2) J Integr Neurosci: given the importance of fronto-parietal connections, we tested whether connection asymmetry affected resistance to distraction, and concluded that stronger frontal connections are beneficial. By comparing model results to EEG, we concluded that the brain indeed has stronger frontal-to-parietal connections than vice versa.
9.  Generation of stable heading representations in diverse visual scenes (Kim et al 2019)
"Many animals rely on an internal heading representation when navigating in varied environments. How this representation is linked to the sensory cues that define different surroundings is unclear. In the fly brain, heading is represented by ‘compass’ neurons that innervate a ring-shaped structure known as the ellipsoid body. Each compass neuron receives inputs from ‘ring’ neurons that are selective for particular visual features; this combination provides an ideal substrate for the extraction of directional information from a visual scene. Here we combine two-photon calcium imaging and optogenetics in tethered flying flies with circuit modelling, and show how the correlated activity of compass and visual neurons drives plasticity, which flexibly transforms two-dimensional visual cues into a stable heading representation. ... " See the supplementary information for model details.
10.  Grid cell spatial firing models (Zilli 2012)
This package contains MATLAB implementations of most models (published from 2005 to 2011) of the hexagonal firing field arrangement of grid cells.
11.  Hierarchical network model of perceptual decision making (Wimmer et al 2015)
Neuronal variability in sensory cortex predicts perceptual decisions. To investigate the interaction of bottom-up and top-down mechanisms during the decision process, we developed a hierarchical network model. The network consists of two circuits composed of leaky integrate-and-fire neurons: an integration circuit (e.g. LIP, FEF) and a sensory circuit (MT), recurrently coupled via bottom-up feedforward connections and top-down feedback connections. The integration circuit accumulates sensory evidence and produces a binary categorization due to winner-take-all competition between two decision-encoding populations (X.J. Wang, Neuron, 2002). The sensory circuit is a balanced, randomly connected EI network that contains neural populations selective to opposite directions of motion. We have used this model to simulate a standard two-alternative forced-choice motion discrimination task.
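The winner-take-all competition at the heart of the integration circuit can be reduced to a two-variable rate model. This deterministic sketch, with arbitrary gains and no noise, is a drastic simplification of the spiking network, not the paper's model.

```python
import numpy as np

# Two decision-encoding populations with self-excitation and mutual inhibition.
w_self, w_inh = 2.0, 2.0   # illustrative coupling strengths
I1, I2 = 0.55, 0.45        # sensory evidence, slightly favouring population 1

r1, r2 = 0.0, 0.0
dt_over_tau = 0.1
f = lambda x: np.clip(x, 0.0, 1.0)  # saturating rate nonlinearity

for _ in range(500):
    d1 = f(w_self * r1 - w_inh * r2 + I1)
    d2 = f(w_self * r2 - w_inh * r1 + I2)
    r1 += dt_over_tau * (-r1 + d1)
    r2 += dt_over_tau * (-r2 + d2)

# Winner-take-all: the slightly stronger input drives population 1 to a high
# rate while population 2 is suppressed -- a binary categorization.
```

Mutual inhibition amplifies the small input difference until one population dominates; adding noise to I1 and I2 would turn this into a stochastic decision whose outcome statistics reflect the evidence.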
12.  High dimensional dynamics and low dimensional readouts in neural microcircuits (Haeusler et al 2006)
We investigate generic models for cortical microcircuits, i.e. recurrent circuits of integrate-and-fire neurons with dynamic synapses. These complex dynamical systems subserve the impressive information processing capabilities of the cortex but are at present poorly understood. We analyze the transient dynamics of models for neural microcircuits from the point of view of one or two readout neurons that collapse the high-dimensional transient dynamics of a neural circuit into a 1- or 2-dimensional output stream. See the paper for more details.
13.  Hopfield and Brody model (Hopfield, Brody 2000)
NEURON implementation of the Hopfield and Brody model from the papers JJ Hopfield and CD Brody (2000) and JJ Hopfield and CD Brody (2001). Instructions are provided in the readme.txt file.
14.  Hopfield and Brody model (Hopfield, Brody 2000) (NEURON+python)
Demonstration of Hopfield-Brody synchronization using artificial cells in NEURON+python.
15.  Hyperconnectivity, slow synapses in PFC mental retardation and autism model (Testa-Silva et al 2011)
This directory contains files to reproduce sample computer simulations presented in the 2011 paper by Meredith, R., Testa-Silva, G., Loebel, A., Giugliano, M., de Kock, C. and Mansvelder, H., "Hyperconnectivity and slow synapses in prefrontal cortex of a model for mental retardation and autism". The subdirectory 'matlab' contains MATLAB scripts (The MathWorks, USA) that reproduce the panels of Figures 4 and 5. ABSTRACT: "... We propose that these findings are tightly linked: using a network model, we show that slower synapses are essential to counterbalance hyperconnectivity in order to maintain a dynamic range of excitatory activity. However, the slow synaptic time constants induce decreased responsiveness to low-frequency stimulation, which may explain deficits in integration and information processing in attentional neuronal networks in neurodevelopmental disorders."
16.  Leaky Integrate and Fire Neuron Model of Context Integration (Calvin, Redish accepted)
The maintenance of contextual information has been shown to be sensitive to changes in excitation-inhibition (EI) balance. We constructed a multi-structure, biophysically realistic agent that could perform context integration as assessed by the dot probe expectancy task. The agent included a perceptual network, a working memory network, and a decision-making system, and was capable of successfully performing the dot probe expectancy task. Systemic manipulation of the agent's EI balance produced localized dysfunction of the memory structure, which resulted in schizophrenia-like deficits in context integration.
17.  MEC layer II stellate cell: Synaptic mechanisms of grid cells (Schmidt-Hieber & Hausser 2013)
This study investigates the cellular mechanisms of grid field generation in Medial Entorhinal Cortex (MEC) layer II stellate cells.
18.  Multistability of clustered states in a globally inhibitory network (Chandrasekaran et al. 2009)
"We study a network of m identical excitatory cells projecting excitatory synaptic connections onto a single inhibitory interneuron, which is reciprocally coupled to all excitatory cells through inhibitory synapses possessing short-term synaptic depression. We find that such a network with global inhibition possesses multiple stable activity patterns with distinct periods, characterized by the clustering of the excitatory cells into synchronized sub-populations. We prove the existence and stability of n-cluster solutions in an m-cell network. ... Implications for temporal coding and memory storage are discussed."
19.  Noise promotes independent control of gamma oscillations and grid firing (Solanka et al 2015)
"Neural computations underlying cognitive functions require calibration of the strength of excitatory and inhibitory synaptic connections and are associated with modulation of gamma frequency oscillations in network activity. However, principles relating gamma oscillations, synaptic strength and circuit computations are unclear. We address this in attractor network models that account for grid firing and theta-nested gamma oscillations in the medial entorhinal cortex. ..."
20.  Recurrent amplification of grid-cell activity (D'Albis and Kempter 2020)
21.  Self-organized olfactory pattern recognition (Kaplan & Lansner 2014)
" ... We present a large-scale network model with single and multi-compartmental Hodgkin–Huxley type model neurons representing olfactory receptor neurons (ORNs) in the epithelium, periglomerular cells, mitral/tufted cells and granule cells in the olfactory bulb (OB), and three types of cortical cells in the piriform cortex (PC). Odor patterns are calculated based on affinities between ORNs and odor stimuli derived from physico-chemical descriptors of behaviorally relevant real-world odorants. ... The PC was implemented as a modular attractor network with a recurrent connectivity that was likewise organized through Hebbian–Bayesian learning. We demonstrate the functionality of the model in a one-sniff-learning and recognition task on a set of 50 odorants. Furthermore, we study its robustness against noise on the receptor level and its ability to perform concentration invariant odor recognition. Moreover, we investigate the pattern completion capabilities of the system and rivalry dynamics for odor mixtures."
22.  Stable propagation of synchronous spiking in cortical neural networks (Diesmann et al 1999)
"... Here we show that precisely synchronized action potentials can propagate within a model of cortical network activity that recapitulates many of the features of biological systems. An attractor, yielding a stable spiking precision in the (sub)millisecond range, governs the dynamics of synchronization. Our results indicate that a combinatorial neural code, based on rapid associations of groups of neurons co-ordinating their activity at the single spike level, is possible within a cortical-like network."
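The attractor governing propagation can be caricatured in activity space alone: a layered network in which each neuron fires if enough neurons in the previous layer fired. Packets above an unstable fixed point are drawn toward full activation; weaker packets die out. The threshold distribution and layer sizes are arbitrary illustrative choices, and this toy omits the sub-millisecond timing dimension analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
width, depth = 100, 10

# Each neuron fires if the spike count in the previous layer reaches its own
# threshold; heterogeneous thresholds are drawn independently per layer.
thresholds = rng.uniform(20, 80, size=(depth, width))

def propagate(n_spikes):
    """Track the spike count of a packet as it crosses successive layers."""
    history = [n_spikes]
    for layer in range(depth):
        n_spikes = int(np.sum(thresholds[layer] <= n_spikes))
        history.append(n_spikes)
    return history

strong = propagate(70)  # above the unstable fixed point: attracted to full activity
weak = propagate(30)    # below it: the packet dies out within a few layers
```

The two basins, stable propagation versus extinction, are the one-dimensional shadow of the attractor that, in the full model, also pulls spike-time dispersion into the (sub)millisecond range.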
