Circuits that contain the Cell: Abstract integrate-and-fire leaky neuron

A simple electrical model of a neuron, first introduced by Lapicque (1907).
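For orientation, a minimal numerical sketch of leaky integrate-and-fire dynamics follows; the membrane parameters and input current are illustrative placeholders, not values taken from any model listed below.

```python
import numpy as np

# Minimal leaky integrate-and-fire sketch (forward Euler; illustrative parameters only):
# tau_m * dV/dt = -(V - V_rest) + R*I, with a spike and reset whenever V reaches V_th.
tau_m, R = 20e-3, 1e8                            # membrane time constant (s), resistance (ohm)
V_rest, V_th, V_reset = -70e-3, -50e-3, -65e-3   # resting, threshold and reset potentials (V)
dt, T, I = 1e-4, 0.5, 2.5e-10                    # time step (s), duration (s), input current (A)

V, spike_times = V_rest, []
for step in range(int(T / dt)):
    V += dt * (-(V - V_rest) + R * I) / tau_m
    if V >= V_th:
        spike_times.append(step * dt)
        V = V_reset
print(len(spike_times), "spikes in", T, "s")
```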
1. A full-scale cortical microcircuit spiking network model (Shimoura et al 2018)
Reimplementation in the Brian 2 simulator of a full-scale cortical microcircuit model containing two cell types (excitatory and inhibitory) distributed across four layers, representing the cortical network beneath a 1 mm² patch of cortical surface (Potjans & Diesmann, 2014).
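As a rough illustration of the Brian 2 style used in reimplementations like this one, a minimal two-population LIF network is sketched below; the sizes, weights and connection probabilities are placeholders, not the eight-population Potjans & Diesmann parameters.

```python
from brian2 import NeuronGroup, PoissonInput, Synapses, Hz, ms, mV, run

# Hypothetical minimal excitatory/inhibitory LIF network in Brian 2, shown only to illustrate
# the simulator style; all numbers are placeholders rather than the published parameters.
eqs = 'dv/dt = (-65*mV - v) / (10*ms) : volt'
net = NeuronGroup(1000, eqs, threshold='v > -50*mV', reset='v = -65*mV', method='euler')
net.v = -65*mV
exc, inh = net[:800], net[800:]                       # 4:1 excitatory:inhibitory split

syn_e = Synapses(exc, net, on_pre='v += 0.5*mV'); syn_e.connect(p=0.1)
syn_i = Synapses(inh, net, on_pre='v -= 2.0*mV'); syn_i.connect(p=0.1)
drive = PoissonInput(net, 'v', N=100, rate=10*Hz, weight=0.5*mV)   # external Poisson drive

run(200*ms)
```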
2. A spiking neural network model of model-free reinforcement learning (Nakano et al 2015)
"Spiking neural networks provide a theoretically grounded means to test computational hypotheses on neurally plausible algorithms of reinforcement learning through numerical simulation. ... In this work, we use a spiking neural network model to approximate the free energy of a restricted Boltzmann machine and apply it to the solution of PORL (partially observable reinforcement learning) problems with high-dimensional observations. ... The way spiking neural networks handle PORL problems may provide a glimpse into the underlying laws of neural information processing which can only be discovered through such a top-down approach. "
3. A spiking NN for amplification of feature-selectivity with specific connectivity (Sadeh et al 2015)
The model simulates large-scale inhibition-dominated spiking networks with different degrees of recurrent specific connectivity. It shows how feature-specific connectivity leads to a linear amplification of feedforward tuning, as reported in recent electrophysiological single-neuron recordings in rodent neocortex. Moreover, feature-specific connectivity leads to the emergence of feature-selective reverberating activity, and entails pattern completion in network responses.
4. Biophysical model for field potentials of networks of I&F neurons (beim Graben & Serafim 2013)
"... Starting from a reduced three-compartment model of a single pyramidal neuron, we derive an observation model for dendritic dipole currents in extracellular space and thereby for the dendritic field potential (DFP) that contributes to the local field potential (LFP) of a neural population. ... Our reduced three-compartment scheme allows to derive networks of leaky integrate-and-fire (LIF) models, which facilitates comparison with existing neural network and observation models. ..."
5. Dentate Gyrus model including Granule cells with dendritic compartments (Chavlis et al 2017)
Here we investigate the role of dentate granule cell dendrites in pattern separation. The model consists of point (integrate-and-fire) neurons; in the principal neurons, the granule cells, we incorporated varying numbers of dendrites.
6. Diffusive homeostasis in a spiking network model (Sweeney et al. 2015)
In this paper we propose a new mechanism, diffusive homeostasis, in which neural excitability is modulated by nitric oxide, a gas which can flow freely across cell membranes. Our model simulates the activity-dependent synthesis and diffusion of nitric oxide in a recurrent network model of integrate-and-fire neurons. The concentration of nitric oxide is then used as a homeostatic readout that modulates the firing threshold of each neuron.
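The mechanism can be caricatured in a few lines; the sketch below is a rate-based toy with a well-mixed (rather than spatially diffusing) nitric-oxide-like signal, and all parameters are placeholders rather than values from the paper.

```python
import numpy as np

# Rate-based toy of diffusive homeostasis (placeholder parameters): each cell synthesizes an
# NO-like signal in proportion to its activity; because the signal spreads freely, every cell
# reads out the population average and adjusts its firing threshold toward a common target,
# so the mean rate is regulated while individual rate differences can persist.
rng = np.random.default_rng(0)
n, dt, steps = 100, 1e-3, 20000
drive = rng.uniform(0.5, 1.5, n)          # heterogeneous external drive (arbitrary units)
theta = np.ones(n)                        # firing thresholds
no, target, tau_no, eta = 0.0, 1.0, 0.5, 0.5

for _ in range(steps):
    rate = np.maximum(drive - theta, 0.0)        # threshold-linear firing rates
    no += dt * (rate.mean() - no / tau_no)       # synthesis ~ mean activity, first-order decay
    theta += dt * eta * (no - target)            # raise thresholds when the readout is too high
print("population mean rate:", np.maximum(drive - theta, 0.0).mean())
```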
7. Dynamical patterns underlying response properties of cortical circuits (Keane et al 2018)
"Recent experimental studies show cortical circuit responses to external stimuli display varied dynamical properties. These include stimulus strength-dependent population response patterns, a shift from synchronous to asynchronous states and a decline in neural variability. To elucidate the mechanisms underlying these response properties and explore how they are mechanistically related, we develop a neural circuit model that incorporates two essential features widely observed in the cerebral cortex. The first feature is a balance between excitatory and inhibitory inputs to individual neurons; the second feature is distance-dependent connectivity. We show that applying a weak external stimulus to the model evokes a wave pattern propagating along lateral connections, but a strong external stimulus triggers a localized pattern; these stimulus strength-dependent population response patterns are quantitatively comparable with those measured in experimental studies. ..."
8. Excitatory and inhibitory population activity (Bittner et al 2017) (Litwin-Kumar & Doiron 2017)
"Many studies use population analysis approaches, such as dimensionality reduction, to characterize the activity of large groups of neurons. To date, these methods have treated each neuron equally, without taking into account whether neurons are excitatory or inhibitory. We studied population activity structure as a function of neuron type by applying factor analysis to spontaneous activity from spiking networks with balanced excitation and inhibition. Throughout the study, we characterized population activity structure by measuring its dimensionality and the percentage of overall activity variance that is shared among neurons. First, by sampling only excitatory or only inhibitory neurons, we found that the activity structures of these two populations in balanced networks are measurably different. We also found that the population activity structure is dependent on the ratio of excitatory to inhibitory neurons sampled. Finally we classified neurons from extracellular recordings in the primary visual cortex of anesthetized macaques as putative excitatory or inhibitory using waveform classification, and found similarities with the neuron type-specific population activity structure of a balanced network with excitatory clustering. These results imply that knowledge of neuron type is important, and allows for stronger statistical tests, when interpreting population activity structure."
9. Functional balanced networks with synaptic plasticity (Sadeh et al, 2015)
The model investigates the impact of learning on functional sensory networks. It uses large-scale recurrent networks of excitatory and inhibitory spiking neurons equipped with synaptic plasticity. It explains enhancement of orientation selectivity and emergence of feature-specific connectivity in visual cortex of rodents during development, as reported in experiments.
10. Gap junction plasticity as a mechanism to regulate network-wide oscillations (Pernelle et al 2018)
11. Grid cell oscillatory interference with noisy network oscillators (Zilli and Hasselmo 2010)
To examine whether an oscillatory interference model of grid cell activity could work if the oscillators were noisy neurons, we implemented these simulations. Here the oscillators are networks (either synaptically or gap-junction coupled) of one or more noisy neurons (either Izhikevich's simple model or a Hodgkin-Huxley-type biophysical model) that drive a postsynaptic cell (integrate-and-fire, resonate-and-fire, or the simple model), which should fire spatially as a grid cell if the simulation is successful.
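For reference, the noise-free oscillatory interference scheme that these noisy-oscillator networks are meant to approximate can be written in the general form below, where f is the baseline frequency, v(t) the running velocity, d_i the preferred directions of the velocity-controlled oscillators, beta a gain that sets the grid scale, and Theta a Heaviside firing threshold (generic notation, not specific to this implementation):

```latex
\[
  \varphi_i(t) = 2\pi \int_0^t \big( f + \beta\, \mathbf{v}(\tau)\cdot\mathbf{d}_i \big)\, d\tau ,
  \qquad
  g(t) = \Theta\!\Big[ \prod_i \big( \cos(2\pi f t) + \cos\varphi_i(t) \big) - g_{\mathrm{thr}} \Big] .
\]
```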
12. Grid cell spatial firing models (Zilli 2012)
This package contains MATLAB implementations of most models (published from 2005 to 2011) of the hexagonal firing field arrangement of grid cells.
13. Hierarchical network model of perceptual decision making (Wimmer et al 2015)
Neuronal variability in sensory cortex predicts perceptual decisions. To investigate the interaction of bottom-up and top-down mechanisms during the decision process, we developed a hierarchical network model. The network consists of two circuits composed of leaky integrate-and-fire neurons: an integration circuit (e.g. LIP, FEF) and a sensory circuit (MT), recurrently coupled via bottom-up feedforward connections and top-down feedback connections. The integration circuit accumulates sensory evidence and produces a binary categorization due to winner-take-all competition between two decision-encoding populations (X.J. Wang, Neuron, 2002). The sensory circuit is a balanced, randomly connected E/I network that contains neural populations selective to opposite directions of motion. We used this model to simulate a standard two-alternative forced-choice motion discrimination task.
14. Hippocampal spiking model for context dependent behavior (Raudies & Hasselmo 2014)
Our model simulates context-dependent behavior using discrete inputs that drive spiking activity representing place and item, followed sequentially by a discrete representation of the motor actions: a response to an item (digging for food) or a movement to a different item (moving to a different pot for food). This simple network was able to consistently learn the context-dependent responses.
15. I&F recurrent networks with current- or conductance-based synapses (Cavallari et al. 2014)
Recurrent networks of two populations (excitatory and inhibitory) of randomly connected leaky integrate-and-fire (LIF) neurons with either current- or conductance-based synapses, from the paper by S. Cavallari, S. Panzeri and A. Mazzoni (2014).
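The distinction between the two synapse types can be stated in a couple of lines of update code; the numbers below are arbitrary illustrations, not the parameters used in the paper.

```python
# Illustrative contrast between the two synapse models (arbitrary units, placeholder values):
# a current-based synapse injects a fixed current independent of the membrane potential,
# whereas a conductance-based synapse injects g_syn*(E_syn - V), so its effect depends on the
# instantaneous driving force and vanishes as V approaches the reversal potential E_syn.
V, E_syn = -60.0, 0.0          # membrane potential and excitatory reversal potential (mV)
I_fixed, g_syn = 0.3, 0.01     # synaptic amplitude (current-based) and conductance (conductance-based)

I_current_based = I_fixed                  # same current whatever V is
I_conductance_based = g_syn * (E_syn - V)  # scales with driving force (here 0.01 * 60 = 0.6)
print(I_current_based, I_conductance_based)
```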
16. Inhibitory cells enable sparse coding in V1 model (King et al. 2013)
" ... Here we show that adding a separate population of inhibitory neurons to a spiking model of V1 provides conformance to Dale’s Law, proposes a computational role for at least one class of interneurons, and accounts for certain observed physiological properties in V1. ... "
17. MDD: the role of glutamate dysfunction on Cingulo-Frontal NN dynamics (Ramirez-Mahaluf et al 2017)
" ...Currently, no mechanistic framework describes how network dynamics, glutamate, and serotonin interact to explain MDD symptoms and treatments. Here, we built a biophysical computational model of 2 areas (vACC and dlPFC) that can switch between emotional and cognitive processing. (Major Depression Disease) MDD networks were simulated by slowing glutamate decay in vACC and demonstrated sustained vACC activation. ..."
18. Models for cortical UP-DOWN states in a bistable inhibitory-stabilized network (Jercog et al 2017)
In the idling brain, neuronal circuits transition between periods of sustained firing (UP state) and quiescence (DOWN state), a pattern the mechanisms of which remain unclear. We analyzed spontaneous cortical population activity from anesthetized rats and found that UP and DOWN durations were highly variable and that population rates showed no significant decay during UP periods. We built a network rate model with excitatory (E) and inhibitory (I) populations exhibiting a novel bistable regime between a quiescent and an inhibition-stabilized state of arbitrarily low rate, where fluctuations triggered state transitions. In addition, we implemented these mechanisms in a more biophysically realistic spiking network, where DOWN-to-UP transitions are caused by synchronous high-amplitude events impinging onto the network.
19. Networks of spiking neurons: a review of tools and strategies (Brette et al. 2007)
This package provides a series of codes that simulate networks of spiking neurons (excitatory and inhibitory, integrate-and-fire or Hodgkin-Huxley type, current-based or conductance-based synapses; some of them are event-based). The same networks are implemented in different simulators (NEURON, GENESIS, NEST, NCS, CSIM, XPP, SPLIT, MVAspike; there are also implementations in SciLab and C++). The codes included in this package are benchmark simulations; see the associated review paper (Brette et al. 2007). The main goal is to provide a series of benchmark simulations of networks of spiking neurons and to demonstrate how these are implemented in the different simulators reviewed in the paper. See also the enclosed file Appendix2.pdf, which describes these benchmarks in detail. Some of the benchmarks were based on the Vogels-Abbott model (Vogels TP and Abbott LF 2005).
20. Neural transformations on spike timing information (Tripp and Eliasmith 2007)
" ... Here we employ computational methods to show that an ensemble of neurons firing at a constant mean rate can induce arbitrarily chosen temporal current patterns in postsynaptic cells. ..."
21. Neuronify: An Educational Simulator for Neural Circuits (Dragly et al 2017)
"Neuronify, a new educational software application (app) providing an interactive way of learning about neural networks, is described. Neuronify allows students with no programming experience to easily build and explore networks in a plug-and-play manner picking network elements (neurons, stimulators, recording devices) from a menu. The app is based on the commonly used integrate-and-fire type model neuron and has adjustable neuronal and synaptic parameters. ..."
22. Norns - Neural Network Studio (Visser & Van Gils 2014)
The Norns - Neural Network Studio is a software package for designing, simulating and analyzing networks of spiking neurons. It consists of three parts: (1) "Urd", a Matlab frontend with high-level functions for quickly defining networks; (2) "Verdandi", an optimized C++ simulation environment which runs the simulations defined with Urd; and (3) "Skuld", an advanced Matlab graphical user interface (GUI) for visual inspection of simulated data.
23. Olfactory Bulb mitral-granule network generates beta oscillations (Osinski & Kay 2016)
This model of the dendrodendritic mitral-granule synaptic network generates gamma and beta oscillations as a function of the granule cell excitability, which is represented by the granule cell resting membrane potential.
24. Optimal Localist and Distributed Coding Through STDP (Masquelier & Kheradpisheh 2018)
We show how a LIF neuron equipped with STDP can become optimally selective, in an unsupervised manner, to one or several repeating spike patterns, even when those patterns are hidden in Poisson spike trains.
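A minimal sketch of the kind of pair-based, trace-implemented STDP rule such models build on is given below; the time constants, learning rates and Poisson inputs are illustrative placeholders.

```python
import numpy as np

# Illustrative pair-based STDP with exponential traces (placeholder parameters):
# pre-before-post potentiates, post-before-pre depresses.
tau_plus, tau_minus = 20.0, 20.0     # trace time constants (ms)
A_plus, A_minus = 0.01, 0.012        # learning rates (slightly depression-biased)
dt, T = 1.0, 1000.0                  # time step and duration (ms)

rng = np.random.default_rng(0)
w, x_pre, y_post = 0.5, 0.0, 0.0     # synaptic weight and pre/post traces
for t in np.arange(0.0, T, dt):
    pre = rng.random() < 0.02        # ~20 Hz Poisson pre spikes
    post = rng.random() < 0.02       # ~20 Hz Poisson post spikes (uncorrelated here)
    x_pre += -dt * x_pre / tau_plus + (1.0 if pre else 0.0)
    y_post += -dt * y_post / tau_minus + (1.0 if post else 0.0)
    if post:
        w += A_plus * x_pre          # potentiation proportional to recent pre activity
    if pre:
        w -= A_minus * y_post        # depression proportional to recent post activity
    w = min(max(w, 0.0), 1.0)        # keep the weight bounded
print("final weight:", w)
```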
25. Orientation selectivity in inhibition-dominated recurrent networks (Sadeh and Rotter, 2015)
Emergence of contrast-invariant orientation selectivity in large-scale networks of excitatory and inhibitory neurons using integrate-and-fire neuron models.
26. Oscillating neurons in the cochlear nucleus (Bahmer Langner 2006a, b, and 2007)
"Based on the physiological and anatomical data, we propose a model consisting of a minimum network of two choppers that are interconnected with a synaptic delay of 0.4 ms (Bahmer and Langner 2006a) . Such minimum delays have been found in different systems and in various animals (e.g. Hackett, Jackson, and Rubel 1982; Borst, Helmchen, and Sakmann 1995). The choppers receive input from both the auditory nerve and an onset neuron. This model can reproduce the mean, standard deviation, and coefficient of variation of the ISI and the dynamic features of AM coding of choppers."
27. Oscillations, phase-of-firing coding and STDP: an efficient learning scheme (Masquelier et al. 2009)
The model demonstrates how a common oscillatory drive for a group of neurons formats their spike times and makes them reliable - through an activation-to-phase conversion - so that repeating activation patterns can be easily detected and learned by a downstream neuron equipped with STDP, and then recognized in just one oscillation cycle.
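The activation-to-phase conversion can be illustrated with a toy threshold unit: adding a common subthreshold oscillation to a stimulus-dependent drive makes stronger inputs cross threshold earlier in the cycle, so input strength is encoded in spike phase. All values below are placeholders.

```python
import numpy as np

# Toy illustration of activation-to-phase conversion (placeholder values): stronger drive
# crosses the firing threshold earlier in the common oscillation cycle.
f, theta = 8.0, 1.0                              # oscillation frequency (Hz), firing threshold
t = np.arange(0.0, 1.0 / f, 1e-4)                # one oscillation cycle
for drive in (0.75, 0.85, 0.95):                 # three stimulus strengths (all subthreshold alone)
    signal = drive + 0.3 * np.sin(2 * np.pi * f * t)
    idx = np.argmax(signal >= theta)             # first threshold crossing in the cycle
    print(f"drive {drive:.2f} -> spike phase {2 * np.pi * f * t[idx]:.2f} rad")
```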
28. Population models of temporal differentiation (Tripp and Eliasmith 2010)
"Temporal derivatives are computed by a wide variety of neural circuits, but the problem of performing this computation accurately has received little theoretical study. Here we systematically compare the performance of diverse networks that calculate derivatives using cell-intrinsic adaptation and synaptic depression dynamics, feedforward network dynamics, and recurrent network dynamics. Examples of each type of network are compared by quantifying the errors they introduce into the calculation and their rejection of high-frequency input noise. ..."
29. Relative spike time coding and STDP-based orientation selectivity in V1 (Masquelier 2012)
Phenomenological spiking model of the cat early visual system. We show how natural vision can drive spike time correlations on sufficiently fast time scales to lead to the acquisition of orientation-selective V1 neurons through STDP. This is possible without reference times such as stimulus onsets, or saccade landing times. But even when such reference times are available, we demonstrate that the relative spike times encode the images more robustly than the absolute ones.
30. SHOT-CA3, RO-CA1 Training, & Simulation CODE in models of hippocampal replay (Nicola & Clopath 2019)
In this code, we model the interaction between the medial septum and hippocampus as a FORCE trained, dual oscillator model. One oscillator corresponds to the medial septum and serves as an input, while a FORCE trained network of LIF neurons acts as a model of the CA3. We refer to this entire model as the Septal Hippocampal Oscillator Theta (or SHOT) network. The code contained in this upload allows a user to train a SHOT network, train a population of reversion interneurons, and simulate the SHOT-CA3 and RO-CA1 networks after training. The code scripts are labeled to correspond to the figures from the manuscript.
31. Spike burst-pause dynamics of Purkinje cells regulate sensorimotor adaptation (Luque et al 2019)
"Cerebellar Purkinje cells mediate accurate eye movement coordination. However, it remains unclear how oculomotor adaptation depends on the interplay between the characteristic Purkinje cell response patterns, namely tonic, bursting, and spike pauses. Here, a spiking cerebellar model assesses the role of Purkinje cell firing patterns in vestibular ocular reflex (VOR) adaptation. The model captures the cerebellar microcircuit properties and it incorporates spike-based synaptic plasticity at multiple cerebellar sites. ..."
32. Spiking neuron model of the basal ganglia (Humphries et al 2006)
A spiking neuron model of the basal ganglia (BG) circuit (striatum, STN, GP, SNr). Includes: parallel anatomical channels; tonic dopamine; dopamine receptors in striatum, STN, and GP; burst-firing in STN; GABAa, AMPA, and NMDA currents; effects of synaptic location. Model demonstrates selection and switching of input signals. Replicates experimental data on changes in slow-wave (<1 Hz) and gamma-band oscillations within BG nuclei following lesions and pharmacological manipulations.
33. STDP allows fast rate-modulated coding with Poisson-like spike trains (Gilson et al. 2011)
The model demonstrates that a neuron equipped with STDP robustly detects repeating rate patterns among its afferents, from which spikes are generated on the fly using inhomogeneous Poisson sampling, provided those rates have narrow temporal peaks (10-20 ms) - a condition met by many experimental post-stimulus time histograms (PSTHs).
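Generating spikes "on the fly" from a time-varying rate can be done by thinning a homogeneous Poisson process, as sketched below; the Gaussian rate bump stands in for a narrow PSTH peak and is a placeholder, not a profile from the paper.

```python
import numpy as np

# Illustrative inhomogeneous Poisson spike generation by thinning: draw candidate spikes at a
# constant ceiling rate, then keep each with probability rate(t)/rate_max.
rng = np.random.default_rng(0)
T, rate_max = 1.0, 100.0                                                # duration (s), ceiling rate (Hz)
rate = lambda t: 10.0 + 90.0 * np.exp(-0.5 * ((t - 0.5) / 0.01) ** 2)   # ~20 ms wide placeholder peak

n = rng.poisson(rate_max * T)                                           # candidate count at ceiling rate
candidates = np.sort(rng.uniform(0.0, T, n))
spikes = candidates[rng.random(n) < rate(candidates) / rate_max]        # thinning step
print(len(spikes), "spikes; density peaks near t = 0.5 s")
```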
34. Structure-dynamics relationships in bursting neuronal networks revealed (Mäki-Marttunen et al. 2013)
This entry includes tools for generating and analyzing network structure, and for running the neuronal network simulations on them.
35. Supervised learning in spiking neural networks with FORCE training (Nicola & Clopath 2017)
The code contained in the zip file runs FORCE training for various examples from the paper: Figure 2 (Oscillators and Chaotic Attractor), Figure 3 (Ode to Joy), Figure 4 (Song Bird Example), Figure 5 (Movie Example), Supplementary Figures 10-12 (Classifier), the supplementary Ode to Joy example, Supplementary Figure 2 (Oscillator Panel), and Supplementary Figure 17 (Long Ode to Joy). Note that, due to file size limitations, the supervisors for Figures 4/5 are not included. See Nicola, W., & Clopath, C. (2016), Supervised Learning in Spiking Neural Networks with FORCE Training, arXiv preprint arXiv:1609.02545, for further details.
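The recursive least-squares update at the heart of FORCE training can be sketched compactly; the toy below applies it to a rate (tanh) network with a sinusoidal target rather than the paper's spiking networks and supervisors, and all parameters are illustrative.

```python
import numpy as np

# Minimal FORCE/RLS sketch on a rate (tanh) network; the readout weights w are adapted online
# so that z = w.r tracks the target. Sizes, gains and the target signal are placeholders.
rng = np.random.default_rng(0)
N, dt, tau, g = 300, 1e-3, 0.01, 1.5
J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # random recurrent weights (chaotic regime)
w_fb = rng.uniform(-1.0, 1.0, N)                   # fixed feedback weights
w = np.zeros(N)                                    # readout weights, trained online
P = np.eye(N)                                      # running estimate of the inverse correlation matrix
x = 0.5 * rng.standard_normal(N)

for step in range(10000):
    target = np.sin(2 * np.pi * 2.0 * step * dt)   # 2 Hz placeholder target
    r = np.tanh(x)
    z = w @ r                                      # network readout
    x += dt * (-x + J @ r + w_fb * z) / tau
    if step % 2 == 0:                              # RLS update of the readout every other step
        Pr = P @ r
        k = Pr / (1.0 + r @ Pr)
        P -= np.outer(k, Pr)
        w -= (z - target) * k
print(f"final |z - target| = {abs(z - target):.3f}")
```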
36. Universal feature of developing networks (Tabak et al 2010)
"Spontaneous episodic activity is a fundamental mode of operation of developing networks. Surprisingly, the duration of an episode of activity correlates with the length of the silent interval that precedes it, but not with the interval that follows. ... We thus developed simple models incorporating excitatory coupling between heterogeneous neurons and activity-dependent synaptic depression. These models robustly generated episodic activity with the correct correlation pattern. The correlation pattern resulted from episodes being triggered at random levels of recovery from depression while they terminated around the same level of depression. To explain this fundamental difference between episode onset and termination, we used a mean field model, where only average activity and average level of recovery from synaptic depression are considered. ... This work further shows that networks with widely different architectures, different cell types, and different functions all operate according to the same general mechanism early in their development."
