Models that contain the Modeling Application: Brian (Home Page)

(Brian is a new simulator for spiking neural networks available on almost all platforms. The motivation for this project is that a simulator should not only save the time of processors, but also the time of scientists. Brian is easy to learn and use, highly flexible and easily extensible. The Brian package itself and simulations using it are all written in the Python programming language, which is an easy, concise and highly developed language with many advanced features and development tools, excellent documentation and a large community of users providing support and extension packages.)
    Models   Description
1.  A threshold equation for action potential initiation (Platkiewicz & Brette 2010)
"We examined in models the influence of Na channel activation, inactivation, slow voltage-gated channels and synaptic conductances on spike threshold. We propose a threshold equation which quantifies the contribution of all these mechanisms. It provides an instantaneous time-varying value of the threshold, which applies to neurons with fluctuating inputs. ... We find that spike threshold depends logarithmically on Na channel density, and that Na channel inactivation and K channels can dynamically modulate it in an adaptive way: the threshold increases with membrane potential and after every action potential. " See paper for more.
2.  Adaptive exponential integrate-and-fire model (Brette & Gerstner 2005)
"We introduce a two-dimensional integrate-and-fire model that combines an exponential spike mechanism with an adaptation equation, based on recent theoretical findings. ... The model is especially reliable in high-conductance states, typical of cortical activity in vivo, in which intrinsic conductances were found to have a reduced role in shaping spike trains. These results are promising because this simple model has enough expressive power to reproduce qualitatively several electrophysiological classes described in vitro."
3.  An attractor network model of grid cells and theta-nested gamma oscillations (Pastoll et al 2013)
A two population spiking continuous attractor model of grid cells. This model combines the attractor dynamics with theta-nested gamma oscillatory activity. It reproduces the behavioural response of grid cells (grid fields) in medial entorhinal cortex, while at the same time allowing for nested gamma oscillations of post-synaptic currents.
4.  Biophysical model for field potentials of networks of I&F neurons (beim Graben & Serafim 2013)
"... Starting from a reduced three-compartment model of a single pyramidal neuron, we derive an observation model for dendritic dipole currents in extracellular space and thereby for the dendritic field potential (DFP) that contributes to the local field potential (LFP) of a neural population. ... Our reduced three-compartment scheme allows to derive networks of leaky integrate-and-fire (LIF) models, which facilitates comparison with existing neural network and observation models. ..."
5.  Brain networks simulators - a comparative study (Tikidji-Hamburyan et al 2017)
" ... In this article, we select the three most popular simulators, as determined by the number of models in the ModelDB database, such as NEURON, GENESIS, and BRIAN, and perform an independent evaluation of these simulators. In addition, we study NEST, one of the lead simulators of the Human Brain Project. First, we study them based on one of the most important characteristics, the range of supported models. Our investigation reveals that brain network simulators may be biased toward supporting a specific set of models. ... we carry out an evaluation using two case studies: a large network with simplified neural and synaptic models and a small network with detailed models. These two case studies allow us to avoid any bias toward a particular software package ..."
6.  Brette-Gerstner model (Touboul and Brette 2008)
Brian code to simulate the Brette-Gerstner model and reproduce the figures of Touboul and Brette, Biol Cyber (2008).
7.  CA1 network model for place cell dynamics (Turi et al 2019)
Biophysical model of CA1 hippocampal region. The model simulates place cells/fields and explores the place cell dynamics as a function of VIP+ interneurons.
8.  CA1 network model: interneuron contributions to epileptic deficits (Shuman et al 2019)
Temporal lobe epilepsy causes significant cognitive deficits in both humans and rodents, yet the specific circuit mechanisms underlying these deficits remain unknown. There is profound and selective interneuron death and axonal reorganization within the hippocampus of both humans and animal models of temporal lobe epilepsy. To assess the specific contribution of these mechanisms to spatial coding, we developed a biophysically constrained network model of the CA1 region that consists of different subtypes of interneurons. More specifically, our network consists of 150 cells: 130 excitatory pyramidal cells and 20 interneurons (Fig. 1A). To simulate place cell formation in the network model, we generated grid cell and place cell inputs from the Entorhinal Cortex (ECLIII) and CA3 regions, respectively, activated in a realistic manner as observed when an animal traverses a linear track. Realistic place fields emerged in a subpopulation of pyramidal cells (40-50%), in which similar EC and CA3 grid cell inputs converged onto distal/proximal apical and basal dendrites. The tuning properties of these cells are very similar to those observed experimentally in awake, behaving animals. To examine the role of interneuron death and axonal reorganization in the formation and/or tuning properties of place fields, we selectively varied the contribution of each interneuron type and desynchronized the two excitatory inputs. We found that desynchronized inputs were critical in reproducing the experimental data, namely the profound reduction in place cell numbers, stability and information content. These results demonstrate that the desynchronized firing of hippocampal neuronal populations contributes to poor spatial processing in epileptic mice during behavior.
Given the lack of experimental data on the selective contributions of interneuron death and axonal reorganization in spatial memory, our model findings predict the mechanistic effects of these alterations at the cellular and network levels.
9.  CA1 PV+ fast-firing hippocampal interneuron (Ferguson et al. 2013)
This two-variable simple model is derived based on patch-clamp recordings from the CA1 region of a whole hippocampus preparation of PV+ fast-firing cells. Since basket cells, axo-axonic cells and bistratified cells can be PV+ and fast-firing, this model could be representative of these cell types. The model code will also be made available on OSB.
10.  CA1 pyramidal neuron (Ferguson et al. 2014)
Izhikevich-based models of CA1 pyramidal cells, with parameters constrained based on a whole hippocampus preparation. Strongly and weakly adapting models based on the experimental data have been developed. Code produces example model output. The code will also be made available on OSB.
11.  CA1 pyramidal neuron network model (Ferguson et al 2015)
From the paper: running this Brian code reproduces Figure 4 (1000-cell network). The raster plot and the voltage trace of one excitatory cell are produced.
12.  CA1 SOM+ (OLM) hippocampal interneuron (Ferguson et al. 2015)
This two-variable simple model is derived based on patch-clamp recordings from the CA1 region of a whole hippocampus preparation of SOM+ inhibitory cells. The model code will also be made available on OSB.
13.  CN bushy, stellate neurons (Rothman, Manis 2003) (Brian 2)
This model is an updated version of Romain Brette's adaptation of Rothman & Manis (2003). The model now uses Brian 2 instead of Brian 1 and can be configured to use n cells instead of a single cell. The included figure shows that Brian 2 is more efficient than Brian 1 once the number of cells exceeds 1,000.
14.  CN bushy, stellate neurons (Rothman, Manis 2003) (Brian)
Cochlear neuron model of Rothman & Manis (2003). Adapted from the Neuron implementation.
15.  Computing with neural synchrony (Brette 2012)
"... In a heterogeneous neural population, it appears that synchrony patterns represent structure or sensory invariants in stimuli, which can then be detected by postsynaptic neurons. The required neural circuitry can spontaneously emerge with spike-timing-dependent plasticity. Using examples in different sensory modalities, I show that this allows simple neural circuits to extract relevant information from realistic sensory stimuli, for example to identify a fluctuating odor in the presence of distractors. ..."
16.  Cortical oscillations and the basal ganglia (Fountas & Shanahan 2017)
"Although brain oscillations involving the basal ganglia (BG) have been the target of extensive research, the main focus lies disproportionally on oscillations generated within the BG circuit rather than other sources, such as cortical areas. We remedy this here by investigating the influence of various cortical frequency bands on the intrinsic effective connectivity of the BG, as well as the role of the latter in regulating cortical behaviour. To do this, we construct a detailed neural model of the complete BG circuit based on fine-tuned spiking neurons, with both electrical and chemical synapses as well as short-term plasticity between structures. As a measure of effective connectivity, we estimate information transfer between nuclei by means of transfer entropy. Our model successfully reproduces firing and oscillatory behaviour found in both the healthy and Parkinsonian BG. We found that, indeed, effective connectivity changes dramatically for different cortical frequency bands and phase offsets, which are able to modulate (or even block) information flow in the three major BG pathways. ..."
17.  CRH modulates excitatory transmission and network physiology in hippocampus (Gunn et al. 2017)
This model simulates the effects of CRH on sharp waves in a rat CA1/CA3 model. It uses the frequency of the sharp waves as an output of the network.
18.  Dentate Gyrus model including Granule cells with dendritic compartments (Chavlis et al 2017)
Here we investigate the role of dentate granule cell dendrites in pattern separation. The model consists of point neurons (integrate-and-fire); in the principal neurons, the granule cells, we have incorporated varying numbers of dendrites.
19.  Diffusive homeostasis in a spiking network model (Sweeney et al. 2015)
In this paper we propose a new mechanism, diffusive homeostasis, in which neural excitability is modulated by nitric oxide, a gas which can flow freely across cell membranes. Our model simulates the activity-dependent synthesis and diffusion of nitric oxide in a recurrent network model of integrate-and-fire neurons. The concentration of nitric oxide is then used as a homeostatic readout which modulates the firing threshold of each neuron.
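A heavily simplified, rate-based caricature of this idea (our construction, not the paper's spiking model) treats fast diffusion as a single well-mixed nitric oxide pool whose concentration feeds back onto every neuron's firing threshold:

```python
# Toy rate-based sketch of diffusive homeostasis (our simplification):
# NO is synthesized in proportion to activity, diffuses (here: one shared
# well-mixed pool), and its concentration raises every firing threshold.
def run(drives, target=5.0, steps=50000, dt=0.01):
    n = len(drives)
    theta = [0.0] * n                  # per-neuron firing thresholds
    no = 0.0                           # shared NO pool (fast-diffusion limit)
    tau_no, eta = 1.0, 0.05            # NO decay time, threshold adaptation rate
    rates = [0.0] * n
    for _ in range(steps):
        # threshold-linear firing rate for each neuron
        rates = [max(0.0, d - th) for d, th in zip(drives, theta)]
        # synthesis proportional to mean activity, first-order decay
        no += dt * (sum(rates) / n - no / tau_no)
        # every neuron reads the same diffused concentration
        adj = dt * eta * (no - target)
        theta = [th + adj for th in theta]
    return rates
```

Because every threshold reads the same diffused signal, the population mean rate converges to the target while the heterogeneity of individual rates is preserved, the key difference from a purely local homeostatic readout.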
20.  Effect of polysynaptic facilitation between piriform-hippocampal network stages (Trieu et al 2015)
This is a model of a multistage network, with stages representing regions and synaptic contacts from the olfactory cortex to region CA1 of the hippocampus, implemented in the Brian2 spiking neural network simulator (Trieu et al 2015). It is primarily designed to assess how synaptic facilitation at multiple stages in response to theta firing changes the output of the network. Further developments will be posted at github.com/cdcox/multistage_network. This model was prepared by Conor D Cox, University of California, Irvine. For questions, please contact Conor at cdcox1@gmail.com.
21.  Fast global oscillations in networks of I&F neurons with low firing rates (Brunel and Hakim 1999)
Dynamics of a network of sparsely connected inhibitory current-based integrate-and-fire neurons. Individual neurons fire irregularly at low rate but the network is in an oscillatory global activity regime where neurons are weakly synchronized.
22.  Gamma-beta alternation in the olfactory bulb (David, Fourcaud-Trocmé et al., 2015)
This model, a simplified olfactory bulb network with mitral and granule cells, proposes a framework for two regimes of oscillation in the olfactory bulb: (1) a weak inhibition regime (with no granule spikes), in which the network oscillates in the gamma (40-90 Hz) band, and (2) a strong inhibition regime (with granule spikes), in which the network oscillates in the beta (15-30 Hz) band. Slow modulations of sensory and centrifugal inputs, phase-shifted by a quarter of a cycle, possibly combined with short-term depression of the mitral-to-granule AMPA synapse, allow the network to alternate between the two regimes as observed in anesthetized animals.
23.  Hierarchical network model of perceptual decision making (Wimmer et al 2015)
Neuronal variability in sensory cortex predicts perceptual decisions. To investigate the interaction of bottom-up and top-down mechanisms during the decision process, we developed a hierarchical network model. The network consists of two circuits composed of leaky integrate-and-fire neurons: an integration circuit (e.g. LIP, FEF) and a sensory circuit (MT), recurrently coupled via bottom-up feedforward connections and top-down feedback connections. The integration circuit accumulates sensory evidence and produces a binary categorization due to winner-take-all competition between two decision-encoding populations (X.J. Wang, Neuron, 2002). The sensory circuit is a balanced, randomly connected E-I network that contains neural populations selective to opposite directions of motion. We have used this model to simulate a standard two-alternative forced-choice motion discrimination task.
24.  High entrainment constrains synaptic depression in a globular bushy cell (Rudnicki & Hemmert 2017)
" ... Here we show how different levels of synaptic depression shape firing properties of GBCs in in vivo-like conditions using computer simulations. We analyzed how an interplay of synaptic depression (0 % to 70 %) and the number of auditory nerve fiber inputs (10 to 70) contributes to the variability of the experimental data from previous studies. ... Overall, this study helps to understand how synaptic properties shape temporal processing in the auditory system. It also integrates, compares, and reconciles results of various experimental studies."
25.  Impact of fast Na channel inact. on AP threshold & synaptic integration (Platkiewicz & Brette 2011)
Slope-threshold relationship with noisy inputs, in the adaptive threshold model.
26.  In vivo imaging of dentate gyrus mossy cells in behaving mice (Danielson et al 2017)
Mossy cells in the hilus of the dentate gyrus constitute a major excitatory principal cell type in the mammalian hippocampus; however, it remains unknown how these cells behave in vivo. Here, we have used two-photon Ca2+ imaging to monitor the activity of mossy cells in awake, behaving mice. We find that mossy cells are significantly more active than dentate granule cells in vivo, exhibit significant spatial tuning during head-fixed spatial navigation, and undergo robust remapping of their spatial representations in response to contextual manipulation. Our results provide the first characterization of mossy cells in the behaving animal and demonstrate their active participation in spatial coding and contextual representation.
27.  Inhibitory plasticity balances excitation and inhibition (Vogels et al. 2011)
"Cortical neurons receive balanced excitatory and inhibitory synaptic currents. Such a balance could be established and maintained in an experience-dependent manner by synaptic plasticity at inhibitory synapses. We show that this mechanism provides an explanation for the sparse firing patterns observed in response to natural stimuli and fits well with a recently observed interaction of excitatory and inhibitory receptive field plasticity. ... Our results suggest an essential role of inhibitory plasticity in the formation and maintenance of functional cortical circuitry."
28.  Input strength and time-varying oscillation peak frequency (Cohen MX 2014)
The purpose of this paper is to argue that a single neural functional principle—temporal fluctuations in oscillation peak frequency (“frequency sliding”)—can be used as a common analysis approach to bridge multiple scales within neuroscience. The code provided here recreates the network models used to demonstrate changes in peak oscillation frequency as a function of static and time-varying input strength, and also shows how correlated frequency sliding can be used to identify functional connectivity between two networks.
29.  Late emergence of the whisker direction selectivity map in rat barrel cortex (Kremer et al. 2011)
"... We discovered that the emergence of a direction map in rat barrel cortex occurs long after all known critical periods in the somatosensory system. This map is remarkably specific, taking a pinwheel-like form centered near the barrel center and aligned to the barrel cortex somatotopy. We suggest that this map may arise from intracortical mechanisms and demonstrate by simulation that the combination of spike-timing-dependent plasticity at synapses between layer 4 and layer 2/3 and realistic pad stimulation is sufficient to produce such a map. ..."
30.  Memory savings through unified pre- and postsynaptic STDP (Costa et al 2015)
Although it is well known that long-term synaptic plasticity can be expressed both pre- and postsynaptically, the functional consequences of this arrangement have remained elusive. We show that spike-timing-dependent plasticity with both pre- and postsynaptic expression develops receptive fields with reduced variability and improved discriminability compared to postsynaptic plasticity alone. These long-term modifications in receptive field statistics match recent sensory perception experiments. In these simulations we demonstrate that learning with this form of plasticity leaves a hidden postsynaptic memory trace that enables fast relearning of previously stored information, providing a cellular substrate for memory savings. Our results reveal essential roles for presynaptic plasticity that are missed when only postsynaptic expression of long-term plasticity is considered, and suggest an experience-dependent distribution of pre- and postsynaptic strength changes.
31.  Modeling epileptic seizure induced by depolarization block (Kim & Nykamp 2017)
"The inhibitory restraint necessary to suppress aberrant activity can fail when inhibitory neurons cease to generate action potentials as they enter depolarization block. We investigate possible bifurcation structures that arise at the onset of seizure-like activity resulting from depolarization block in inhibitory neurons. Networks of conductance based excitatory and inhibitory neurons are simulated to characterize different types of transitions to the seizure state, and a mean field model is developed to verify the generality of the observed phenomena of excitatory-inhibitory dynamics. ..."
32.  Network bursts in cultured NN result from different adaptive mechanisms (Masquelier & Deco 2013)
It is now well established that cultured neuron networks are spontaneously active, and tend to synchronize. Synchronous events typically involve the whole network, and have thus been termed “network spikes” (NS). Using experimental recordings and numerical simulations, we show here that the inter-NS interval statistics are complex, and allow us to infer the neural mechanisms at work, in particular the adaptive ones, and to estimate a number of parameters that we cannot access experimentally.
33.  Networks of spiking neurons: a review of tools and strategies (Brette et al. 2007)
This package provides a series of codes that simulate networks of spiking neurons (excitatory and inhibitory, integrate-and-fire or Hodgkin-Huxley type, current-based or conductance-based synapses; some of them are event-based). The same networks are implemented in different simulators (NEURON, GENESIS, NEST, NCS, CSIM, XPP, SPLIT, MVAspike; there are also a couple of implementations in SciLab and C++). The codes included in this package are benchmark simulations; see the associated review paper (Brette et al. 2007). The main goal is to provide a series of benchmark simulations of networks of spiking neurons, and demonstrate how these are implemented in the different simulators overviewed in the paper. See also details in the enclosed file Appendix2.pdf, which describes these different benchmarks. Some of these benchmarks were based on the Vogels-Abbott model (Vogels TP and Abbott LF 2005).
34.  Neural mass model based on single cell dynamics to model pathophysiology (Zandt et al 2014)
The model code as described in "A neural mass model based on single cell dynamics to model pathophysiology" (Zandt et al. 2014, Journal of Computational Neuroscience). A neural mass model (NMM) is derived from single-cell dynamics in a bottom-up approach. The mean and standard deviation of the firing rates in the populations are calculated. The sigmoid is derived from the single-cell FI curve, allowing for easy implementation of pathological conditions. The NMM is compared with a detailed spiking network model consisting of HH neurons. The NMM code is in MATLAB. The network model is simulated using Norns (ModelDB #154739).
35.  Oscillations, phase-of-firing coding and STDP: an efficient learning scheme (Masquelier et al. 2009)
The model demonstrates how a common oscillatory drive for a group of neurons formats their spike times and makes them reliable - through an activation-to-phase conversion - so that repeating activation patterns can be easily detected and learned by a downstream neuron equipped with STDP, and then recognized in just one oscillation cycle.
36.  Phase locking in leaky integrate-and-fire model (Brette 2004)
"This shows the phase-locking structure of a LIF driven by a sinusoidal current. When the current crosses the threshold (a<3), the model almost always phase locks (in a measure-theoretical sense)."
37.  Phase response curves firing rate dependency of rat purkinje neurons in vitro (Couto et al 2015)
NEURON implementation of stochastic gating in the Khaliq-Raman Purkinje cell model. NEURON implementation of the De Schutter and Bower model of a Purkinje Cell. Matlab scripts to compute the Phase Response Curve (PRC). LCG configuration files to experimentally determine the PRC. Integrate and Fire models (leaky and non-leaky) implemented in BRIAN to see the influence of the PRC in a network of unconnected neurons receiving sparse common input.
38.  Reliability of spike timing is a general property of spiking model neurons (Brette & Guigon 2003)
"... Here we show, through simulations and theoretical considerations, that for a general class of spiking neuron models, which includes, in particular, the leaky integrate-and-fire model as well as nonlinear spiking models, aperiodic currents, contrary to periodic currents, induce reproducible responses, which are stable under noise, change in initial conditions and deterministic perturbations of the input. We provide a theoretical explanation for aperiodic currents that cross the threshold."
39.  Robust modulation of integrate-and-fire models (Van Pottelbergh et al 2018)
"By controlling the state of neuronal populations, neuromodulators ultimately affect behavior. A key neuromodulation mechanism is the alteration of neuronal excitability via the modulation of ion channel expression. This type of neuromodulation is normally studied with conductance-based models, but those models are computationally challenging for large-scale network simulations needed in population studies. This article studies the modulation properties of the multiquadratic integrate-and-fire model, a generalization of the classical quadratic integrate-and-fire model. The model is shown to combine the computational economy of integrate-and-fire modeling and the physiological interpretability of conductance-based modeling. It is therefore a good candidate for affordable computational studies of neuromodulation in large networks."
40.  Sensitivity of noisy neurons to coincident inputs (Rossant et al. 2011)
"Two distant or coincident spikes are injected into a noisy balanced leaky integrate-and-fire neuron. The PSTH of the neuron in response to these inputs is calculated along with the extra number of spikes in the two cases. This number is higher for the coincident spikes, showing the sensitivity of a noisy neuron to coincident inputs."
41.  Spike-Timing-Based Computation in Sound Localization (Goodman and Brette 2010)
" ... In neuron models consisting of spectro-temporal filtering and spiking nonlinearity, we found that the binaural structure induced by spatialized sounds is mapped to synchrony patterns that depend on source location rather than on source signal. Location-specific synchrony patterns would then result in the activation of location-specific assemblies of postsynaptic neurons. We designed a spiking neuron model which exploited this principle to locate a variety of sound sources in a virtual acoustic environment using measured human head-related transfer functions. ..."
42.  Spontaneous weakly correlated excitation and inhibition (Tan et al. 2013)
Brian code for Tan et al. 2013.
43.  Stable propagation of synchronous spiking in cortical neural networks (Diesmann et al 1999)
"... Here we show that precisely synchronized action potentials can propagate within a model of cortical network activity that recapitulates many of the features of biological systems. An attractor, yielding a stable spiking precision in the (sub)millisecond range, governs the dynamics of synchronization. Our results indicate that a combinatorial neural code, based on rapid associations of groups of neurons co-ordinating their activity at the single spike level, is possible within a cortical-like network."
44.  STDP allows fast rate-modulated coding with Poisson-like spike trains (Gilson et al. 2011)
The model demonstrates that a neuron equipped with STDP robustly detects repeating rate patterns among its afferents, from which the spikes are generated on the fly using inhomogeneous Poisson sampling, provided those rates have narrow temporal peaks (10-20 ms) - a condition met by many experimental post-stimulus time histograms (PSTHs).
45.  STDP and oscillations produce phase-locking (Muller et al. 2011)
"... In this note, we investigate a simple mechanism for learning precise LFP-to-spike coupling in feed-forward networks – the reliable, periodic modulation of presynaptic firing rates during oscillations, coupled with spike-timing dependent plasticity. When oscillations are within the biological range (2–150 Hz), firing rates of the inputs change on a timescale highly relevant to spike-timing dependent plasticity (STDP). Through analytic and computational methods, we find points of stable phase-locking for a neuron with plastic input synapses. These points correspond to precise phase-locking behavior in the feed-forward network. The location of these points depends on the oscillation frequency of the inputs, the STDP time constants, and the balance of potentiation and de-potentiation in the STDP rule. ..."
46.  Theory of arachnid prey localization (Sturzl et al. 2000)
"Sand scorpions and many other arachnids locate their prey through highly sensitive slit sensilla at the tips (tarsi) of their eight legs. This sensor array responds to vibrations with stimulus-locked action potentials encoding the target direction. We present a neuronal model to account for stimulus angle determination using a population of second-order neurons, each receiving excitatory input from one tarsus and inhibition from a triad opposite to it. ..."
47.  Time-warp-invariant neuronal processing (Gutig & Sompolinsky 2009)
" ... Here, we report that time-warp-invariant neuronal processing can be subserved by the shunting action of synaptic conductances that automatically rescales the effective integration time of postsynaptic neurons. We propose a novel spike-based learning rule for synaptic conductances that adjusts the degree of synaptic shunting to the temporal processing requirements of a given task. Applying this general biophysical mechanism to the example of speech processing, we propose a neuronal network model for time-warp-invariant word discrimination and demonstrate its excellent performance on a standard benchmark speech-recognition task. ..."
48.  Vectorized algorithms for spiking neural network simulation (Brette and Goodman 2011)
"... We describe a set of algorithms to simulate large spiking neural networks efficiently with high-level languages using vector-based operations. These algorithms constitute the core of Brian, a spiking neural network simulator written in the Python language. Vectorized simulation makes it possible to combine the flexibility of high-level languages with the computational efficiency usually associated with compiled languages."
