Models that contain the Cell: Abstract integrate-and-fire leaky neuron

(A simple electrical model of a neuron, first introduced by Lapicque in 1907)
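Most of the entries below build on this cell type. As a minimal sketch of leaky integrate-and-fire dynamics (forward-Euler integration; all parameter values here are illustrative and not taken from any particular model in this list):

```python
import numpy as np

def simulate_lif(I, dt=1e-4, tau=0.02, R=1e7, E_L=-0.07, V_th=-0.05, V_reset=-0.07):
    """Forward-Euler simulation of a leaky integrate-and-fire neuron.

    I : sequence of input currents (A), one value per time step of size dt (s).
    Returns the membrane-potential trace (V) and the spike-time step indices.
    """
    V = np.full(len(I), E_L)
    spikes = []
    for t in range(1, len(I)):
        dV = (-(V[t - 1] - E_L) + R * I[t - 1]) * dt / tau   # leak + driven input
        V[t] = V[t - 1] + dV
        if V[t] >= V_th:          # threshold crossing: emit a spike, then reset
            spikes.append(t)
            V[t] = V_reset
    return V, spikes
```

With a constant suprathreshold current the neuron fires at a fixed rate; a subthreshold current produces no spikes at all. This is the basic behavior that the models listed below elaborate on.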
    Models   Description
1.  A full-scale cortical microcircuit spiking network model (Shimoura et al 2018)
Reimplementation, in the BRIAN 2 simulator, of a full-scale cortical microcircuit containing two cell types (excitatory and inhibitory) distributed across four layers, representing the cortical network beneath 1 mm² of surface (Potjans & Diesmann, 2014).
2.  A spiking neural network model of model-free reinforcement learning (Nakano et al 2015)
"Spiking neural networks provide a theoretically grounded means to test computational hypotheses on neurally plausible algorithms of reinforcement learning through numerical simulation. ... In this work, we use a spiking neural network model to approximate the free energy of a restricted Boltzmann machine and apply it to the solution of PORL (partially observable reinforcement learning) problems with high-dimensional observations. ... The way spiking neural networks handle PORL problems may provide a glimpse into the underlying laws of neural information processing which can only be discovered through such a top-down approach. "
3.  A spiking NN for amplification of feature-selectivity with specific connectivity (Sadeh et al 2015)
The model simulates large-scale inhibition-dominated spiking networks with different degrees of recurrent specific connectivity. It shows how feature-specific connectivity leads to a linear amplification of feedforward tuning, as reported in recent electrophysiological single-neuron recordings in rodent neocortex. Moreover, feature-specific connectivity leads to the emergence of feature-selective reverberating activity, and entails pattern completion in network responses.
4.  A state-space model to quantify common input to motor neurons (Feeney et al 2017)
"... We introduce a state-space model in which the discharge activity of motor neurons is modeled as inhomogeneous Poisson processes and propose a method to quantify an abstract latent trajectory that represents the common input received by motor neurons. The approach also approximates the variation in synaptic noise in the common input signal. The model is validated with four data sets: a simulation of 120 motor units, a pair of integrate-and-fire neurons with a Renshaw cell providing inhibitory feedback, the discharge activity of 10 integrate-and-fire neurons, and the discharge times of concurrently active motor units during an isometric voluntary contraction. The simulations revealed that a latent state-space model is able to quantify the trajectory and variability of the common input signal across all four conditions. When compared with the cumulative spike train method of characterizing common input, the state-space approach was more sensitive to the details of the common input current and was less influenced by the duration of the signal. The state-space approach appears to be capable of detecting rather modest changes in common input signals across conditions."
5.  Active dendritic integration in robust and precise grid cell firing (Schmidt-Hieber et al 2017)
"... Whether active dendrites contribute to the generation of the dual temporal and rate codes characteristic of grid cell output is unknown. We show that dendrites of medial entorhinal cortex neurons are highly excitable and exhibit a supralinear input–output function in vitro, while in vivo recordings reveal membrane potential signatures consistent with recruitment of active dendritic conductances. By incorporating these nonlinear dynamics into grid cell models, we show that they can sharpen the precision of the temporal code and enhance the robustness of the rate code, thereby supporting a stable, accurate representation of space under varying environmental conditions. Our results suggest that active dendrites may therefore constitute a key cellular mechanism for ensuring reliable spatial navigation."
6.  Biophysical model for field potentials of networks of I&F neurons (beim Graben & Serafim 2013)
"... Starting from a reduced three-compartment model of a single pyramidal neuron, we derive an observation model for dendritic dipole currents in extracellular space and thereby for the dendritic field potential (DFP) that contributes to the local field potential (LFP) of a neural population. ... Our reduced three-compartment scheme allows to derive networks of leaky integrate-and-fire (LIF) models, which facilitates comparison with existing neural network and observation models. ..."
7.  Code to calc. spike-trig. ave (STA) conduct. from Vm (Pospischil et al. 2007, Rudolph et al. 2007)
PYTHON code to calculate spike-triggered average (STA) conductances from intracellular recordings, according to the method published by Pospischil et al., J Neurophysiol, 2007. The method consists of a maximum likelihood estimate of the conductance STA, from the voltage STA (which is calculated from the data). The method was tested using models and dynamic-clamp experiments; for details, see the original publication (Pospischil et al., 2007). The first application of this method to experimental data was from intracellular recordings in awake cat cerebral cortex (Rudolph et al., 2007).
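The first step of that method, computing a spike-triggered average from the recorded trace, can be sketched as follows. Only the plain voltage STA is shown; the maximum likelihood conductance estimate of Pospischil et al. (2007) is not reproduced here, and the function name and `window` parameter are hypothetical:

```python
import numpy as np

def spike_triggered_average(signal, spike_indices, window):
    """Average of `signal` over the `window` samples preceding each spike.

    Spikes closer than `window` samples to the start of the recording are
    skipped, since a full pre-spike segment cannot be extracted for them.
    """
    segments = [signal[i - window:i] for i in spike_indices if i >= window]
    return np.mean(segments, axis=0)
```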
8.  Dentate Gyrus model including Granule cells with dendritic compartments (Chavlis et al 2017)
Here we investigate the role of dentate granule cell dendrites in pattern separation. The model consists of point neurons (integrate-and-fire); in the principal neurons, the granule cells, we have incorporated varying numbers of dendrites.
9.  Diffusive homeostasis in a spiking network model (Sweeney et al. 2015)
In this paper we propose a new mechanism, diffusive homeostasis, in which neural excitability is modulated by nitric oxide, a gas which can flow freely across cell membranes. Our model simulates the activity-dependent synthesis and diffusion of nitric oxide in a recurrent network model of integrate-and-fire neurons. The concentration of nitric oxide is then used as a homeostatic readout that modulates the firing threshold of each neuron.
10.  Double boundary value problem (A. Bose and J.E. Rubin, 2015)
For two neurons coupled with mutual inhibition, we investigate the strategies that each neuron should utilize in order to maximize the number of spikes it can fire (or, equivalently, the amount of time it is active) before the other neuron takes over. We derive a one-dimensional map whose fixed points correspond to periodic anti-phase bursting solutions. The model here solves a novel double boundary value problem that can be used to obtain the graph of this map.
11.  Dynamical patterns underlying response properties of cortical circuits (Keane et al 2018)
"Recent experimental studies show cortical circuit responses to external stimuli display varied dynamical properties. These include stimulus strength-dependent population response patterns, a shift from synchronous to asynchronous states and a decline in neural variability. To elucidate the mechanisms underlying these response properties and explore how they are mechanistically related, we develop a neural circuit model that incorporates two essential features widely observed in the cerebral cortex. The first feature is a balance between excitatory and inhibitory inputs to individual neurons; the second feature is distance-dependent connectivity. We show that applying a weak external stimulus to the model evokes a wave pattern propagating along lateral connections, but a strong external stimulus triggers a localized pattern; these stimulus strength-dependent population response patterns are quantitatively comparable with those measured in experimental studies. ..."
12.  Effects of the membrane AHP on the Lateral Superior Olive (LSO) (Zhou & Colburn 2010)
This simulation study investigated how membrane afterhyperpolarization (AHP) influences spiking activity of neurons in the Lateral Superior Olive (LSO). The model incorporates a general integrate-and-fire spiking mechanism with a first-order adaptation channel. Simulations focus on differentiating the effects of G_AHP, tau_AHP, and input strength on (1) spike interval statistics, such as negative serial correlation and chopper onset, and (2) neural sensitivity to interaural level difference (ILD) of LSO neurons. The model simulated electrophysiological data collected in cat LSO (Tsuchitani and Johnson, 1985).
13.  Excitatory and inhibitory population activity (Bittner et al 2017) (Litwin-Kumar & Doiron 2017)
"Many studies use population analysis approaches, such as dimensionality reduction, to characterize the activity of large groups of neurons. To date, these methods have treated each neuron equally, without taking into account whether neurons are excitatory or inhibitory. We studied population activity structure as a function of neuron type by applying factor analysis to spontaneous activity from spiking networks with balanced excitation and inhibition. Throughout the study, we characterized population activity structure by measuring its dimensionality and the percentage of overall activity variance that is shared among neurons. First, by sampling only excitatory or only inhibitory neurons, we found that the activity structures of these two populations in balanced networks are measurably different. We also found that the population activity structure is dependent on the ratio of excitatory to inhibitory neurons sampled. Finally we classified neurons from extracellular recordings in the primary visual cortex of anesthetized macaques as putative excitatory or inhibitory using waveform classification, and found similarities with the neuron type-specific population activity structure of a balanced network with excitatory clustering. These results imply that knowledge of neuron type is important, and allows for stronger statistical tests, when interpreting population activity structure."
14.  Feedforward network undergoing Up-state-mediated plasticity (Gonzalez-Rueda et al. 2018)
Using whole-cell recordings and optogenetic stimulation of presynaptic input in anaesthetized mice, we show that synaptic plasticity rules are gated by cortical dynamics. Up states are biased towards depression such that presynaptic stimulation alone leads to synaptic depression, while connections contributing to postsynaptic spiking are protected against this synaptic weakening. We find that this novel activity-dependent and input-specific downscaling mechanism has two important computational advantages: 1) improved signal-to-noise ratio, and 2) preservation of previously stored information. Thus, these synaptic plasticity rules provide an attractive mechanism for SWS-related synaptic downscaling and circuit refinement. We simulate a feedforward network of neurons undergoing Up-state-mediated plasticity. Under this plasticity rule, presynaptic spikes alone lead to synaptic depression, whereas those followed by postsynaptic spikes within 10 ms are not changed.
15.  Functional balanced networks with synaptic plasticity (Sadeh et al, 2015)
The model investigates the impact of learning on functional sensory networks. It uses large-scale recurrent networks of excitatory and inhibitory spiking neurons equipped with synaptic plasticity. It explains enhancement of orientation selectivity and emergence of feature-specific connectivity in visual cortex of rodents during development, as reported in experiments.
16.  Gap junction plasticity as a mechanism to regulate network-wide oscillations (Pernelle et al 2018)
17.  Generalized Carnevale-Hines algorithm (van Elburg and van Ooyen 2009)
Demo illustrating the behaviour of the integrate-and-fire model in the parameter regime relevant for the generalized event-based Carnevale-Hines integration scheme. The demo includes the improved implementation of the IntFire4 mechanism.
18.  Grid cell oscillatory interference with noisy network oscillators (Zilli and Hasselmo 2010)
To examine whether an oscillatory interference model of grid cell activity could work if the oscillators were noisy neurons, we implemented these simulations. Here the oscillators are networks (either synaptically or gap-junction coupled) of one or more noisy neurons (either Izhikevich's simple model or a Hodgkin-Huxley-type biophysical model) that drive a postsynaptic cell (integrate-and-fire, resonate-and-fire, or the simple model), which should fire spatially as a grid cell if the simulation is successful.
19.  Grid cell spatial firing models (Zilli 2012)
This package contains MATLAB implementations of most models (published from 2005 to 2011) of the hexagonal firing field arrangement of grid cells.
20.  Hierarchical network model of perceptual decision making (Wimmer et al 2015)
Neuronal variability in sensory cortex predicts perceptual decisions. To investigate the interaction of bottom-up and top-down mechanisms during the decision process, we developed a hierarchical network model. The network consists of two circuits composed of leaky integrate-and-fire neurons: an integration circuit (e.g. LIP, FEF) and a sensory circuit (MT), recurrently coupled via bottom-up feedforward connections and top-down feedback connections. The integration circuit accumulates sensory evidence and produces a binary categorization due to winner-take-all competition between two decision-encoding populations (X.J. Wang, Neuron, 2002). The sensory circuit is a balanced, randomly connected E-I network that contains neural populations selective to opposite directions of motion. We have used this model to simulate a standard two-alternative forced-choice motion discrimination task.
21.  Hippocampal spiking model for context dependent behavior (Raudies & Hasselmo 2014)
Our model simulates the effect of context dependent behavior using discrete inputs to drive spiking activity representing place and item followed sequentially by a discrete representation of the motor actions involving a response to an item (digging for food) or the movement to a different item (movement to a different pot for food). This simple network was able to consistently learn the context-dependent responses.
22.  I&F recurrent networks with current- or conductance-based synapses (Cavallari et al. 2014)
Recurrent networks of two populations (excitatory and inhibitory) of randomly connected Leaky Integrate-and-Fire (LIF) neurons with either current- or conductance-based synapses, from the paper by S. Cavallari, S. Panzeri and A. Mazzoni (2014).
23.  Inhibitory cells enable sparse coding in V1 model (King et al. 2013)
" ... Here we show that adding a separate population of inhibitory neurons to a spiking model of V1 provides conformance to Dale’s Law, proposes a computational role for at least one class of interneurons, and accounts for certain observed physiological properties in V1. ... "
24.  Inverse stochastic resonance of cerebellar Purkinje cell (Buchin et al. 2016)
This code shows simulations of the adaptive exponential integrate-and-fire model under different stimulus conditions. The parameters of the model were tuned to cerebellar Purkinje cells to reproduce the inhibition of these cells by noisy current injections. Similar experimental protocols were also applied to a detailed biophysical model of Purkinje cells, the De Schutter & Bower (1994) model. The repository also includes the XPPAUT version of the model with the corresponding bifurcation analysis.
25.  Leaky integrate-and-fire model of spike frequency adaptation in the LGMD (Gabbiani and Krapp 2006)
This will reproduce Figure 9 of Gabbiani and Krapp (2006) J Neurophysiol 96:2951-2962. The figure simply shows that a leaky-integrate-and-fire model cannot reproduce spike frequency adaptation as it is seen experimentally in the LGMD neuron.
26.  MDD: the role of glutamate dysfunction on Cingulo-Frontal NN dynamics (Ramirez-Mahaluf et al 2017)
" ...Currently, no mechanistic framework describes how network dynamics, glutamate, and serotonin interact to explain MDD symptoms and treatments. Here, we built a biophysical computational model of 2 areas (vACC and dlPFC) that can switch between emotional and cognitive processing. MDD (major depressive disorder) networks were simulated by slowing glutamate decay in vACC and demonstrated sustained vACC activation. ..."
27.  Modelling the effects of short and random proto-neural elongations (de Wiljes et al 2017)
"To understand how neurons and nervous systems first evolved, we need an account of the origins of neural elongations: why did neural elongations (axons and dendrites) first originate, such that they could become the central component of both neurons and nervous systems? Two contrasting conceptual accounts provide different answers to this question. Braitenberg's vehicles provide the iconic illustration of the dominant input-output (IO) view. Here, the basic role of neural elongations is to connect sensors to effectors, both situated at different positions within the body. For this function, neural elongations are thought of as comparatively long and specific connections, which require an articulated body involving substantial developmental processes to build. Internal coordination (IC) models stress a different function for early nervous systems. Here, the coordination of activity across extended parts of a multicellular body is held central, in particular, for the contractions of (muscle) tissue. An IC perspective allows the hypothesis that the earliest proto-neural elongations could have been functional even when they were initially simple, short and random connections, as long as they enhanced the patterning of contractile activity across a multicellular surface. The present computational study provides a proof of concept that such short and random neural elongations can play this role. ..."
28.  Models for cortical UP-DOWN states in a bistable inhibitory-stabilized network (Jercog et al 2017)
In the idling brain, neuronal circuits transition between periods of sustained firing (UP state) and quiescence (DOWN state), a pattern the mechanisms of which remain unclear. We analyzed spontaneous cortical population activity from anesthetized rats and found that UP and DOWN durations were highly variable and that population rates showed no significant decay during UP periods. We built a network rate model with excitatory (E) and inhibitory (I) populations exhibiting a novel bistable regime between a quiescent and an inhibition-stabilized state of arbitrarily low rate, where fluctuations triggered state transitions. In addition, we implemented these mechanisms in a more biophysically realistic spiking network, where DOWN-to-UP transitions are caused by synchronous high-amplitude events impinging onto the network.
29.  Multi-timescale adaptive threshold model (Kobayashi et al 2009)
" ... In this study, we devised a simple, fast computational model that can be tailored to any cortical neuron not only for reproducing but also for predicting a variety of spike responses to greatly fluctuating currents. The key features of this model are a multi-timescale adaptive threshold predictor and a nonresetting leaky integrator. This model is capable of reproducing a rich variety of neuronal spike responses, including regular spiking, intrinsic bursting, fast spiking, and chattering, by adjusting only three adaptive threshold parameters. ..."
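The two key features named in this abstract, a non-resetting leaky integrator and a threshold built from adaptive components with several timescales, can be sketched as follows (parameter values are illustrative, not the published ones, and refractoriness is omitted):

```python
import numpy as np

def mat_neuron(I, dt=1e-4, tau_m=0.005, R=5e7,
               alphas=(0.015, 0.003), taus=(0.01, 0.2), omega=0.01):
    """Sketch of a multi-timescale adaptive threshold (MAT-style) neuron.

    V follows a non-resetting leaky integrator; the threshold is `omega`
    plus a sum of exponentials (one per timescale in `taus`) that jump by
    `alphas` at each spike. Returns the spike-time step indices.
    """
    taus = np.asarray(taus)
    alphas = np.asarray(alphas)
    decay = np.exp(-dt / taus)       # per-step decay of threshold components
    V, h, spikes = 0.0, np.zeros(len(taus)), []
    for t in range(len(I)):
        V += (-V + R * I[t]) * dt / tau_m   # membrane is never reset
        h *= decay                          # threshold relaxes toward omega
        if V >= omega + h.sum():            # spike: only the threshold jumps
            spikes.append(t)
            h += alphas
    return spikes
```

Because the slow threshold component accumulates over successive spikes, a constant input produces interspike intervals that lengthen over time, i.e. spike-frequency adaptation without any reset of the membrane potential.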
30.  Multi-timescale adaptive threshold model (Kobayashi et al 2009) (NEURON)
NEURON implementation; the description is identical to entry 29 above.
31.  Networks of spiking neurons: a review of tools and strategies (Brette et al. 2007)
This package provides a series of codes that simulate networks of spiking neurons (excitatory and inhibitory, integrate-and-fire or Hodgkin-Huxley type, current-based or conductance-based synapses; some of them are event-based). The same networks are implemented in different simulators (NEURON, GENESIS, NEST, NCS, CSIM, XPP, SPLIT, MVAspike; there is also a couple of implementations in SciLab and C++). The codes included in this package are benchmark simulations; see the associated review paper (Brette et al. 2007). The main goal is to provide a series of benchmark simulations of networks of spiking neurons, and demonstrate how these are implemented in the different simulators overviewed in the paper. See also details in the enclosed file Appendix2.pdf, which describes these different benchmarks. Some of these benchmarks were based on the Vogels-Abbott model (Vogels TP and Abbott LF 2005).
32.  Neural transformations on spike timing information (Tripp and Eliasmith 2007)
" ... Here we employ computational methods to show that an ensemble of neurons firing at a constant mean rate can induce arbitrarily chosen temporal current patterns in postsynaptic cells. ..."
33.  Neuronify: An Educational Simulator for Neural Circuits (Dragly et al 2017)
"Neuronify, a new educational software application (app) providing an interactive way of learning about neural networks, is described. Neuronify allows students with no programming experience to easily build and explore networks in a plug-and-play manner picking network elements (neurons, stimulators, recording devices) from a menu. The app is based on the commonly used integrate-and-fire type model neuron and has adjustable neuronal and synaptic parameters. ..."
34.  Norns - Neural Network Studio (Visser & Van Gils 2014)
The Norns - Neural Network Studio is a software package for designing, simulating, and analyzing networks of spiking neurons. It consists of three parts: 1. "Urd": a Matlab frontend with high-level functions for quickly defining networks; 2. "Verdandi": an optimized C++ simulation environment that runs the simulation defined by Urd; 3. "Skuld": an advanced Matlab graphical user interface (GUI) for visual inspection of simulated data.
35.  Olfactory Bulb mitral-granule network generates beta oscillations (Osinski & Kay 2016)
This model of the dendrodendritic mitral-granule synaptic network generates gamma and beta oscillations as a function of the granule cell excitability, which is represented by the granule cell resting membrane potential.
36.  Optimal Localist and Distributed Coding Through STDP (Masquelier & Kheradpisheh 2018)
We show how a LIF neuron equipped with STDP can become optimally selective, in an unsupervised manner, to one or several repeating spike patterns, even when those patterns are hidden in Poisson spike trains.
37.  Optimal spatiotemporal spike pattern detection by STDP (Masquelier 2017)
We simulate a LIF neuron equipped with STDP. A pattern repeats in its inputs. The LIF progressively becomes selective to the repeating pattern, in an optimal manner.
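Entries 36 and 37 both rely on a LIF neuron equipped with STDP. The pair-based additive STDP update is commonly written as an exponentially windowed function of the spike-time difference; a sketch with illustrative constants (not the published parameters of these models):

```python
import numpy as np

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=0.02, tau_minus=0.02):
    """Additive pairwise STDP weight change for delta_t = t_post - t_pre (s).

    Pre-before-post (delta_t > 0) potentiates the synapse; post-before-pre
    depresses it, each with its own exponential time window.
    """
    delta_t = np.asarray(delta_t, dtype=float)
    return np.where(delta_t > 0,
                    a_plus * np.exp(-delta_t / tau_plus),    # potentiation
                    -a_minus * np.exp(delta_t / tau_minus))  # depression
```

Repeated exposure to a spike pattern makes inputs that consistently precede the postsynaptic spike grow stronger, which is the mechanism behind the pattern selectivity described in these entries.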
38.  Orientation selectivity in inhibition-dominated recurrent networks (Sadeh and Rotter, 2015)
Emergence of contrast-invariant orientation selectivity in large-scale networks of excitatory and inhibitory neurons using integrate-and-fire neuron models.
39.  Oscillating neurons in the cochlear nucleus (Bahmer Langner 2006a, b, and 2007)
"Based on the physiological and anatomical data, we propose a model consisting of a minimum network of two choppers that are interconnected with a synaptic delay of 0.4 ms (Bahmer and Langner 2006a). Such minimum delays have been found in different systems and in various animals (e.g. Hackett, Jackson, and Rubel 1982; Borst, Helmchen, and Sakmann 1995). The choppers receive input from both the auditory nerve and an onset neuron. This model can reproduce the mean, standard deviation, and coefficient of variation of the ISI and the dynamic features of AM coding of choppers."
40.  Oscillations, phase-of-firing coding and STDP: an efficient learning scheme (Masquelier et al. 2009)
The model demonstrates how a common oscillatory drive for a group of neurons formats and reliabilizes their spike times - through an activation-to-phase conversion - so that repeating activation patterns can be easily detected and learned by a downstream neuron equipped with STDP, and then recognized in just one oscillation cycle.
41.  Perfect Integrate and fire with noisy adaptation or fractional noise (Richard et al 2018)
"Here we show that a purely Markovian integrate-and-fire (IF) model, with a noisy slow adaptation term, can generate interspike intervals (ISIs) that appear to have long-range dependency (LRD). However, a proper analysis shows that this is not the case asymptotically. For comparison, we also consider a new model of an individual IF neuron with fractional (non-Markovian) noise. The correlations of its spike trains are studied and proven to have LRD, unlike classical IF models."
42.  Population models of temporal differentiation (Tripp and Eliasmith 2010)
"Temporal derivatives are computed by a wide variety of neural circuits, but the problem of performing this computation accurately has received little theoretical study. Here we systematically compare the performance of diverse networks that calculate derivatives using cell-intrinsic adaptation and synaptic depression dynamics, feedforward network dynamics, and recurrent network dynamics. Examples of each type of network are compared by quantifying the errors they introduce into the calculation and their rejection of high-frequency input noise. ..."
43.  PyRhO: A multiscale optogenetics simulation platform (Evans et al 2016)
"... we present an integrated suite of open-source, multi-scale computational tools called PyRhO. The purpose of developing PyRhO is three-fold: (i) to characterize new (and existing) opsins by automatically fitting a minimal set of experimental data to three-, four-, or six-state kinetic models, (ii) to simulate these models at the channel, neuron and network levels, and (iii) provide functional insights through model selection and virtual experiments in silico. The module is written in Python with an additional IPython/Jupyter notebook based GUI, allowing models to be fit, simulations to be run and results to be shared through simply interacting with a webpage. The seamless integration of model fitting algorithms with simulation environments (including NEURON and Brian2) for these virtual opsins will enable neuroscientists to gain a comprehensive understanding of their behavior and rapidly identify the most suitable variant for application in a particular biological system. ..."
44.  Relative spike time coding and STDP-based orientation selectivity in V1 (Masquelier 2012)
Phenomenological spiking model of the cat early visual system. We show how natural vision can drive spike time correlations on sufficiently fast time scales to lead to the acquisition of orientation-selective V1 neurons through STDP. This is possible without reference times such as stimulus onsets, or saccade landing times. But even when such reference times are available, we demonstrate that the relative spike times encode the images more robustly than the absolute ones.
45.  SHOT-CA3, RO-CA1 Training, & Simulation CODE in models of hippocampal replay (Nicola & Clopath 2019)
In this code, we model the interaction between the medial septum and hippocampus as a FORCE trained, dual oscillator model. One oscillator corresponds to the medial septum and serves as an input, while a FORCE trained network of LIF neurons acts as a model of the CA3. We refer to this entire model as the Septal Hippocampal Oscillator Theta (or SHOT) network. The code contained in this upload allows a user to train a SHOT network, train a population of reversion interneurons, and simulate the SHOT-CA3 and RO-CA1 networks after training. The code scripts are labeled to correspond to the figure from the manuscript.
46.  Spike burst-pause dynamics of Purkinje cells regulate sensorimotor adaptation (Luque et al 2019)
"Cerebellar Purkinje cells mediate accurate eye movement coordination. However, it remains unclear how oculomotor adaptation depends on the interplay between the characteristic Purkinje cell response patterns, namely tonic, bursting, and spike pauses. Here, a spiking cerebellar model assesses the role of Purkinje cell firing patterns in vestibular ocular reflex (VOR) adaptation. The model captures the cerebellar microcircuit properties and it incorporates spike-based synaptic plasticity at multiple cerebellar sites. ..."
47.  Spiking neuron model of the basal ganglia (Humphries et al 2006)
A spiking neuron model of the basal ganglia (BG) circuit (striatum, STN, GP, SNr). Includes: parallel anatomical channels; tonic dopamine; dopamine receptors in striatum, STN, and GP; burst-firing in STN; GABAa, AMPA, and NMDA currents; effects of synaptic location. Model demonstrates selection and switching of input signals. Replicates experimental data on changes in slow-wave (<1 Hz) and gamma-band oscillations within BG nuclei following lesions and pharmacological manipulations.
48.  STDP allows fast rate-modulated coding with Poisson-like spike trains (Gilson et al. 2011)
The model demonstrates that a neuron equipped with STDP robustly detects repeating rate patterns among its afferents, from which the spikes are generated on the fly using inhomogeneous Poisson sampling, provided those rates have narrow temporal peaks (10-20 ms) - a condition met by many experimental post-stimulus time histograms (PSTHs).
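Generating spikes "on the fly" from time-varying rates, as described above, is commonly done with inhomogeneous Poisson sampling by thinning; a sketch assuming a known upper bound `rate_max` on the rate function (names and parameters here are hypothetical, not from the model's code):

```python
import numpy as np

def inhomogeneous_poisson(rate_fn, t_max, rate_max, rng=None):
    """Spike times from an inhomogeneous Poisson process via thinning.

    rate_fn  : callable t -> instantaneous rate (Hz), bounded by rate_max.
    t_max    : duration of the spike train (s).
    Candidates are drawn from a homogeneous process at rate_max and each
    is kept with probability rate_fn(t) / rate_max.
    """
    rng = np.random.default_rng(rng)
    t, spikes = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)   # next candidate event
        if t >= t_max:
            return np.array(spikes)
        if rng.random() < rate_fn(t) / rate_max:   # accept with prob r(t)/r_max
            spikes.append(t)
```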
49.  Structure-dynamics relationships in bursting neuronal networks revealed (Mäki-Marttunen et al. 2013)
This entry includes tools for generating and analyzing network structure, and for running the neuronal network simulations on them.
50.  Supervised learning in spiking neural networks with FORCE training (Nicola & Clopath 2017)
The code contained in the zip file runs FORCE training for various examples from the paper: Figure 2 (Oscillators and Chaotic Attractor), Figure 3 (Ode to Joy), Figure 4 (Song Bird Example), Figure 5 (Movie Example), Supplementary Figures 10-12 (Classifier), the Supplementary Ode to Joy Example, Supplementary Figure 2 (Oscillator Panel), and Supplementary Figure 17 (Long Ode to Joy). Note that due to file size limitations, the supervisors for Figures 4/5 are not included. See Nicola, W., & Clopath, C. (2016), Supervised Learning in Spiking Neural Networks with FORCE Training, arXiv preprint arXiv:1609.02545, for further details.
51.  Theory and simulation of integrate-and-fire neurons driven by shot noise (Droste & Lindner 2017)
This archive contains source code for the paper "Exact analytical results for integrate-and-fire neurons driven by excitatory shot noise" by Droste and Lindner, 2017. Specifically, it contains a Python implementation of the analytical formulas derived in that paper (allowing one to calculate the firing rate, CV, and stationary voltage distribution of general integrate-and-fire neurons driven by excitatory shot noise, as well as the power spectrum and rate response of leaky integrate-and-fire neurons with such input) and C++ code implementing a Monte-Carlo simulation to estimate these quantities. A sample Jupyter notebook to play around with the analytics is included, as are scripts to reproduce the figures from the paper.
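A minimal Monte-Carlo estimate of the firing rate of an LIF neuron driven by excitatory shot noise might look like the following. This is a sketch with a fixed jump amplitude and illustrative parameters; the paper's C++ code (and its treatment of, e.g., exponentially distributed weights) is not reproduced here:

```python
import numpy as np

def lif_shot_noise(rate_in=600.0, a=0.1, tau=0.02, v_th=1.0, v_reset=0.0,
                   t_max=10.0, dt=1e-4, rng=None):
    """Monte-Carlo sketch: LIF neuron driven by excitatory shot noise.

    Presynaptic input arrives as a Poisson process at `rate_in` Hz; each
    arrival kicks the (dimensionless) voltage up by `a`. Returns the
    estimated firing rate in Hz.
    """
    rng = np.random.default_rng(rng)
    n_steps = int(t_max / dt)
    kicks = a * rng.poisson(rate_in * dt, n_steps)  # shot-noise input per step
    v, n_spikes = 0.0, 0
    for k in kicks:
        v += -v * dt / tau + k        # leak plus incoming kicks
        if v >= v_th:                 # fire and reset
            v = v_reset
            n_spikes += 1
    return n_spikes / t_max
```

Averaging such simulations over long durations is the brute-force counterpart to the closed-form rate, CV, and voltage-distribution formulas that the archive's Python module evaluates directly.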
52.  Universal feature of developing networks (Tabak et al 2010)
"Spontaneous episodic activity is a fundamental mode of operation of developing networks. Surprisingly, the duration of an episode of activity correlates with the length of the silent interval that precedes it, but not with the interval that follows. ... We thus developed simple models incorporating excitatory coupling between heterogeneous neurons and activity-dependent synaptic depression. These models robustly generated episodic activity with the correct correlation pattern. The correlation pattern resulted from episodes being triggered at random levels of recovery from depression while they terminated around the same level of depression. To explain this fundamental difference between episode onset and termination, we used a mean field model, where only average activity and average level of recovery from synaptic depression are considered. ... This work further shows that networks with widely different architectures, different cell types, and different functions all operate according to the same general mechanism early in their development."
