Circuits that contain the Cell: Abstract integrate-and-fire adaptive exponential (AdEx) neuron

(The adaptive exponential integrate-and-fire (AdEx) neuron was introduced in Brette R, Gerstner W (2005).)
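For orientation, the AdEx model couples the membrane potential V to an adaptation current w. In the notation of Brette and Gerstner (2005):

    C \frac{dV}{dt} = -g_L (V - E_L) + g_L \Delta_T \exp\!\left(\frac{V - V_T}{\Delta_T}\right) - w + I
    \tau_w \frac{dw}{dt} = a (V - E_L) - w

Here C is the membrane capacitance, g_L and E_L the leak conductance and reversal potential, V_T the threshold, and \Delta_T the slope factor. When V diverges past a numerical cutoff a spike is recorded, V is reset to V_r, and w is incremented by b. The subthreshold coupling a, the spike-triggered increment b, and the time constant \tau_w set the adaptation; different choices of these parameters give the regular-spiking, fast-spiking, or bursting cell types used in several of the circuits below.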
Models and descriptions:
1. A detailed data-driven network model of prefrontal cortex (Hass et al 2016)
Data-based PFC-like circuit with layers 2/3 and 5, synaptic clustering, four types of interneurons, and cell-type-specific short-term synaptic plasticity; neuron parameters are fitted to in vitro data, and all other parameters are constrained by the experimental literature. Reproduces key features of in vivo resting-state activity without specific tuning.
2. A spatial model of the intermediate superior colliculus (Moren et al. 2013)
A spatial model of the intermediate superior colliculus. It reproduces the collicular saccade-generating output profile from NMDA receptor-driven burst neurons, shaped by integrative inhibitory feedback from spreading buildup neuron activity. The model is consistent with the view that collicular activity directly shapes the temporal profile of saccadic eye movements. We use the adaptive exponential integrate-and-fire neuron model, augmented with an NMDA-like membrane potential-dependent receptor. In addition, we use a synthetic spike-integrator model as a stand-in for a spike-integrator circuit in the reticular formation. NOTE: We use a couple of custom neuron models, so the supplied model file includes an entire version of NEST. We also include a patch that applies to a clean version of the simulator (see the doc/README).
3. An electrophysiological model of GABAergic double bouquet cells (Chrysanthidis et al. 2019)
We present an electrophysiological model of double bouquet cells (DBCs) and integrate them into an established cortical columnar microcircuit model that implements a BCPNN (Bayesian Confidence Propagation Neural Network) learning rule. The proposed architecture effectively solves the problem of duplexed learning of inhibition and excitation by replacing recurrent inhibition between pyramidal cells in functional columns of different stimulus selectivity with a plastic disynaptic pathway. The introduction of DBCs improves the biological plausibility of our model, without affecting the model's spiking activity, basic operation, and learning abilities.
4. Asynchronous irregular and up/down states in excitatory and inhibitory NNs (Destexhe 2009)
"Randomly-connected networks of integrate-and-fire (IF) neurons are known to display asynchronous irregular (AI) activity states, which resemble the discharge activity recorded in the cerebral cortex of awake animals. ... Here, we investigate the occurrence of AI states in networks of nonlinear IF neurons, such as the adaptive exponential IF (Brette-Gerstner-Izhikevich) model. This model can display intrinsic properties such as low-threshold spike (LTS), regular spiking (RS) or fast-spiking (FS). We successively investigate the oscillatory and AI dynamics of thalamic, cortical and thalamocortical networks using such models. ..."
5. Dentate Gyrus model including Granule cells with dendritic compartments (Chavlis et al 2017)
Here we investigate the role of dentate granule cell dendrites in pattern separation. The model consists of point neurons (integrate-and-fire); in the principal neurons, the granule cells, we have incorporated varying numbers of dendrites.
6. Input strength and time-varying oscillation peak frequency (Cohen MX 2014)
The purpose of this paper is to argue that a single neural functional principle—temporal fluctuations in oscillation peak frequency (“frequency sliding”)—can be used as a common analysis approach to bridge multiple scales within neuroscience. The code provided here recreates the network models used to demonstrate changes in peak oscillation frequency as a function of static and time-varying input strength, and also shows how correlated frequency sliding can be used to identify functional connectivity between two networks.
7. Large-scale model of neocortical slice in vitro exhibiting persistent gamma (Tomsett et al. 2014)
This model contains 15 neuron populations (8 excitatory, 7 inhibitory) arranged into 4 cortical layers (layer 1 empty, layers 2/3 combined). It produces a persistent gamma oscillation driven by layer 2/3. It runs using the VERTEX simulator (available from http://www.vertexsimulator.org), which is written in MATLAB.
8. Linking dynamics of the inhibitory network to the input structure (Komarov & Bazhenov 2016)
Code to model 10 all-to-all coupled inhibitory neurons. (A generic, illustrative sketch of such a network is given after this listing.)
9. Mean Field Equations for Two-Dimensional Integrate and Fire Models (Nicola and Campbell, 2013)
The zip file contains the files used to perform numerical simulation and bifurcation studies of large networks of two-dimensional integrate-and-fire neurons, and of the corresponding mean-field models derived in our paper. The neural models used are the Izhikevich model and the adaptive exponential (AdEx) model. (The standard form of the Izhikevich equations is reproduced after this listing for reference.)
10. Mesoscopic dynamics from AdEx recurrent networks (Zerlaut et al., JCNS 2017)
We present a mean-field model of networks of adaptive exponential (AdEx) integrate-and-fire neurons with conductance-based synaptic interactions. We study a network of regular-spiking (RS) excitatory neurons and fast-spiking (FS) inhibitory neurons. We use a Master Equation formalism, together with a semi-analytic approach to the transfer function of AdEx neurons, to describe the average dynamics of the coupled populations. We compare the predictions of this mean-field model to simulated networks of RS-FS cells, first at the level of the spontaneous activity of the network, which is well predicted by the analytical description. Second, we investigate the response of the network to time-varying external input, and show that the mean-field model predicts the response time course of the population. Finally, to model voltage-sensitive dye imaging (VSDi) signals, we consider a one-dimensional ring model made of interconnected RS-FS mean-field units.
11. Network bursts in cultured NN result from different adaptive mechanisms (Masquelier & Deco 2013)
It is now well established that cultured neuron networks are spontaneously active, and tend to synchronize. Synchronous events typically involve the whole network, and have thus been termed “network spikes” (NS). Using experimental recordings and numerical simulations, we show here that the inter-NS interval statistics are complex, and allow inferring the neural mechanisms at work, in particular the adaptive ones, and estimating a number of parameters that we cannot access experimentally.
12. Neuron-based control mechanisms for a robotic arm and hand (Singh et al 2017)
"A robotic arm and hand controlled by simulated neurons is presented. The robot makes use of a biological neuron simulator using a point neural model. ... The robot performs a simple pick-and-place task. ... As another benefit, it is hoped that further work will also lead to a better understanding of human and other animal neural processing, particularly for physical motion. This is a multidisciplinary approach combining cognitive neuroscience, robotics, and psychology."
13. Oscillations emerging from noise-driven NNs (Tchumatchenko & Clopath 2014)
" ... Here we show how the oscillation frequency is shaped by single neuron resonance, electrical and chemical synapses.The presence of both gap junctions and subthreshold resonance are necessary for the emergence of oscillations. Our results are in agreement with several experimental observations such as network responses to oscillatory inputs and offer a much-needed conceptual link connecting a collection of disparate effects observed in networks."
14. Synaptic scaling balances learning in a spiking model of neocortex (Rowan & Neymotin 2013)
Learning in the brain requires complementary mechanisms: potentiation and activity-dependent homeostatic scaling. We introduce synaptic scaling to a biologically realistic spiking model of neocortex which can learn changes in oscillatory rhythms using STDP, and show that scaling is necessary to balance both positive and negative changes in input from potentiation and atrophy. We discuss some of the issues that arise when considering synaptic scaling in such a model, and show that scaling regulates activity whilst allowing learning to remain unaltered. (A generic sketch of a multiplicative scaling step is given after this listing.)
15. Vibration-sensitive Honeybee interneurons (Ai et al 2017)
"Female honeybees use the “waggle dance” to communicate the location of nectar sources to their hive mates. Distance information is encoded in the duration of the waggle phase (von Frisch, 1967). During the waggle phase, the dancer produces trains of vibration pulses, which are detected by the follower bees via Johnston's organ located on the antennae. To uncover the neural mechanisms underlying the encoding of distance information in the waggle dance follower, we investigated morphology, physiology, and immunohistochemistry of interneurons arborizing in the primary auditory center of the honeybee (Apis mellifera). We identified major interneuron types, named DL-Int-1, DL-Int-2, and bilateral DL-dSEG-LP, that responded with different spiking patterns to vibration pulses applied to the antennae. Experimental and computational analyses suggest that inhibitory connection plays a role in encoding and processing the duration of vibration pulse trains in the primary auditory center of the honeybee."
