Models that contain the Cell: Abstract Izhikevich neuron

("We present a model that reproduces spiking and bursting behavior of known types of cortical neurons. The model combines the biological plausibility of Hodgkin-Huxley-type dynamics and the computational efficiency of integrate-and-fire neurons. Using this model, one can simulate tens of thousands of spiking cortical neurons in real time (1 ms resolution) using a desktop PC." (Izhikevich 2003))
    Models   Description
1.  A spiking neural network model of the Lateral Geniculate Nucleus (Sen-Bhattacharya et al 2017)
Izhikevich's spiking neuron models are used to build a network with a biologically informed synaptic layout emulating the Lateral Geniculate Nucleus.
2.  Artificial neuron model (Izhikevich 2003, 2004, 2007)
A set of models is presented based on two related parameterizations to reproduce spiking and bursting behavior of multiple types of cortical neurons and thalamic neurons. These models combine the biological plausibility of Hodgkin-Huxley-type dynamics and the computational efficiency of integrate-and-fire neurons. Using these models, one can simulate tens of thousands of spiking cortical neurons in real time (1 ms resolution) using a desktop PC.
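The two-variable Izhikevich model referenced throughout this list is compact enough to state directly: v' = 0.04v^2 + 5v + 140 - u + I and u' = a(bv - u), with the reset v <- c, u <- u + d whenever v reaches 30 mV. A minimal single-neuron sketch using the regular-spiking (RS) parameters a=0.02, b=0.2, c=-65, d=8 from Izhikevich (2003); the step-current amplitude and onset time are arbitrary illustrative choices:

```python
import numpy as np

def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, T=1000, I_amp=10.0):
    """Simulate one Izhikevich neuron for T ms at 1 ms resolution.

    Regular-spiking (RS) parameters from Izhikevich (2003).
    Returns the membrane-potential trace and the spike times (ms).
    """
    v, u = -65.0, b * -65.0          # initial conditions near rest
    vs, spikes = [], []
    for t in range(T):
        I = I_amp if t >= 100 else 0.0   # step current switched on at 100 ms
        if v >= 30.0:                    # spike: reset v, bump recovery u
            spikes.append(t)
            v, u = c, u + d
        for _ in range(2):               # two 0.5 ms Euler substeps (as in the paper)
            v += 0.5 * (0.04 * v * v + 5 * v + 140 - u + I)
        u += a * (b * v - u)
        vs.append(v)
    return np.array(vs), spikes

v_trace, spike_times = izhikevich()
```

The other firing regimes in the paper (bursting, fast-spiking, low-threshold spiking) follow from different (a, b, c, d) choices with the same equations.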
3.  CA1 pyramidal neuron (Ferguson et al. 2014)
Izhikevich-based models of CA1 pyramidal cells, with parameters constrained based on a whole hippocampus preparation. Strongly and weakly adapting models based on the experimental data have been developed. Code produces example model output. The code will also be made available on OSB.
4.  CA1 pyramidal neuron network model (Ferguson et al 2015)
From the paper: Figure 4 (1000-cell network) is reproduced by running this Brian code, producing the raster plot and the voltage trace of one of the excitatory cells.
5.  CA1 SOM+ (OLM) hippocampal interneuron (Ferguson et al. 2015)
This two-variable simple model is derived based on patch-clamp recordings from the CA1 region of a whole hippocampus preparation of SOM+ inhibitory cells. The model code will also be made available on OSB.
6.  Cortex-Basal Ganglia-Thalamus network model (Kumaravelu et al. 2016)
" ... We developed a biophysical network model comprising of the closed loop cortical-basal ganglia-thalamus circuit representing the healthy and parkinsonian rat brain. The network properties of the model were validated by comparing responses evoked in basal ganglia (BG) nuclei by cortical (CTX) stimulation to published experimental results. A key emergent property of the model was generation of low-frequency network oscillations. Consistent with their putative pathological role, low-frequency oscillations in model BG neurons were exaggerated in the parkinsonian state compared to the healthy condition. ..."
7.  Cortico-striatal plasticity in medium spiny neurons (Gurney et al 2015)
In the associated paper (Gurney et al, PLoS Biology, 2015) we presented a computational framework that addresses several issues in cortico-striatal plasticity including spike timing, reward timing, dopamine level, and dopamine receptor type. Thus, we derived a complete model of dopamine and spike-timing dependent cortico-striatal plasticity from in vitro data. We then showed this model produces the predicted activity changes necessary for learning and extinction in an operant task. Moreover, we showed the complex dependencies of cortico-striatal plasticity are not only sufficient but necessary for learning and extinction. The model was validated in a wider setting of action selection in basal ganglia, showing how it could account for behavioural data describing extinction, renewal, and reacquisition, and replicate in vitro experimental data on cortico-striatal plasticity. The code supplied here allows reproduction of the proposed process of learning in medium spiny neurons, giving the results of Figure 7 of the paper.
8.  Dynamics in random NNs with multiple neuron subtypes (Pena et al 2018, Tomov et al 2014, 2016)
"Spontaneous cortical population activity exhibits a multitude of oscillatory patterns, which often display synchrony during slow-wave sleep or under certain anesthetics and stay asynchronous during quiet wakefulness. The mechanisms behind these cortical states and transitions among them are not completely understood. Here we study spontaneous population activity patterns in random networks of spiking neurons of mixed types modeled by Izhikevich equations. Neurons are coupled by conductance-based synapses subject to synaptic noise. We localize the population activity patterns on the parameter diagram spanned by the relative inhibitory synaptic strength and the magnitude of synaptic noise. In absence of noise, networks display transient activity patterns, either oscillatory or at constant level. The effect of noise is to turn transient patterns into persistent ones: for weak noise, all activity patterns are asynchronous non-oscillatory independently of synaptic strengths; for stronger noise, patterns have oscillatory and synchrony characteristics that depend on the relative inhibitory synaptic strength. ..."
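Networks of mixed Izhikevich neuron types like the one studied here can be sketched compactly. The following is a Python adaptation of the well-known pulse-coupled network from Izhikevich (2003), with 800 heterogeneous excitatory and 200 inhibitory cells; note the Pena et al model instead uses conductance-based synapses with synaptic noise, which this sketch omits:

```python
import numpy as np

rng = np.random.default_rng(1)
Ne, Ni = 800, 200                          # excitatory / inhibitory counts
re, ri = rng.random(Ne), rng.random(Ni)    # per-cell heterogeneity

a = np.r_[0.02 * np.ones(Ne), 0.02 + 0.08 * ri]
b = np.r_[0.20 * np.ones(Ne), 0.25 - 0.05 * ri]
c = np.r_[-65 + 15 * re**2, -65 * np.ones(Ni)]
d = np.r_[8 - 6 * re**2, 2 * np.ones(Ni)]
S = np.hstack([0.5 * rng.random((Ne + Ni, Ne)),    # excitatory weights
               -rng.random((Ne + Ni, Ni))])        # inhibitory weights

v = -65.0 * np.ones(Ne + Ni)
u = b * v
firings = []                               # list of (time, neuron index)
for t in range(1000):                      # 1000 ms of simulated time
    # noisy thalamic drive, stronger to excitatory cells
    I = np.r_[5 * rng.standard_normal(Ne), 2 * rng.standard_normal(Ni)]
    fired = v >= 30
    firings.extend((t, i) for i in np.nonzero(fired)[0])
    v[fired] = c[fired]                    # reset fired cells
    u[fired] += d[fired]
    I += S[:, fired].sum(axis=1)           # pulse coupling from spikes
    for _ in range(2):                     # two 0.5 ms Euler substeps
        v += 0.5 * (0.04 * v**2 + 5 * v + 140 - u + I)
    u += a * (b * v - u)
```

Plotting `firings` as a raster reproduces the asynchronous background activity with intermittent population rhythms described in the original paper.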
9.  Evolving simple models of diverse dynamics in hippocampal neuron types (Venkadesh et al 2018)
" ... we present an automated pipeline based on evolutionary algorithms to quantitatively reproduce features of various classes of neuronal spike patterns using the Izhikevich model. Employing experimental data from a comprehensive knowledgebase of neuron types in the rodent hippocampus, we demonstrate that our approach reliably fits Izhikevich models to nine distinct classes of experimentally recorded spike patterns, including delayed spiking, spiking with adaptation, stuttering, and bursting. ..."
10.  Excitotoxic loss of dopaminergic cells in PD (Muddapu et al 2019)
"... A couple of the proposed mechanisms, however, show potential for the development of a novel line of PD (Parkinson's disease) therapeutics. One of these mechanisms is the peculiar metabolic vulnerability of SNc (Substantia Nigra pars compacta) cells compared to other dopaminergic clusters; the other is the SubThalamic Nucleus (STN)-induced excitotoxicity in SNc. To investigate the latter hypothesis computationally, we developed a spiking neuron network-model of SNc-STN-GPe system. In the model, prolonged stimulation of SNc cells by an overactive STN leads to an increase in ‘stress’ variable; when the stress in a SNc neuron exceeds a stress threshold, the neuron dies. The model shows that the interaction between SNc and STN involves a positive-feedback due to which, an initial loss of SNc cells that crosses a threshold causes a runaway-effect, leading to an inexorable loss of SNc cells, strongly resembling the process of neurodegeneration. The model further suggests a link between the two aforementioned mechanisms of SNc cell loss. Our simulation results show that the excitotoxic cause of SNc cell loss might initiate by weak-excitotoxicity mediated by energy deficit, followed by strong-excitotoxicity, mediated by a disinhibited STN. A variety of conventional therapies were simulated to test their efficacy in slowing down SNc cell loss. Among them, glutamate inhibition, dopamine restoration, subthalamotomy and deep brain stimulation showed superior neuroprotective-effects in the proposed model."
11.  Gap junction plasticity as a mechanism to regulate network-wide oscillations (Pernelle et al 2018)
"Oscillations of neural activity emerge when many neurons repeatedly activate together and are observed in many brain regions, particularly during sleep and attention. Their functional role is still debated, but could be associated with normal cognitive processes such as memory formation or with pathologies such as schizophrenia and autism. Powerful oscillations are also a hallmark of epileptic seizures. Therefore, we wondered what mechanism could regulate oscillations. A type of neuronal coupling, called gap junctions, has been shown to promote synchronization between inhibitory neurons. Computational models show that when gap junctions are strong, neurons synchronize together. Moreover recent investigations show that the gap junction coupling strength is not static but plastic and dependent on the firing properties of the neurons. Thus, we developed a model of gap junction plasticity in a network of inhibitory and excitatory neurons. We show that gap junction plasticity can maintain the right amount of oscillations to prevent pathologies from emerging. Finally, we show that gap junction plasticity serves an additional functional role and allows for efficient and robust information transfer."
12.  Hyperbolic model (Daneshzand et al 2017)
A modified Izhikevich neuron model to address the switching patterns of neuronal firing seen in Parkinson's Disease.
13.  Inhibitory network bistability explains increased activity prior to seizure onset (Rich et al 2020)
" ... the mechanisms predisposing an inhibitory network toward increased activity, specifically prior to ictogenesis, without a permanent change to inputs to the system remain unknown. We address this question by comparing simulated inhibitory networks containing control interneurons and networks containing hyperexcitable interneurons modeled to mimic treatment with 4-Aminopyridine (4-AP), an agent commonly used to model seizures in vivo and in vitro. Our in silico study demonstrates that model inhibitory networks with 4-AP interneurons are more prone than their control counterparts to exist in a bistable state in which asynchronously firing networks can abruptly transition into synchrony driven by a brief perturbation. This transition into synchrony brings about a corresponding increase in overall firing rate. We further show that perturbations driving this transition could arise in vivo from background excitatory synaptic activity in the cortex. Thus, we propose that bistability explains the increase in interneuron activity observed experimentally prior to seizure via a transition from incoherent to coherent dynamics. Moreover, bistability explains why inhibitory networks containing hyperexcitable interneurons are more vulnerable to this change in dynamics, and how such networks can undergo a transition without a permanent change in the drive. ..."
14.  Input strength and time-varying oscillation peak frequency (Cohen MX 2014)
The purpose of this paper is to argue that a single neural functional principle—temporal fluctuations in oscillation peak frequency (“frequency sliding”)—can be used as a common analysis approach to bridge multiple scales within neuroscience. The code provided here recreates the network models used to demonstrate changes in peak oscillation frequency as a function of static and time-varying input strength, and also shows how correlated frequency sliding can be used to identify functional connectivity between two networks.
15.  Locust olfactory network with GGN and full KC population in the mushroom body (Ray et al 2020)
We reconstructed the GGN (giant GABAergic neuron) morphology from a 3D confocal image stack, and built a passive model based on the morphology to study signal attenuation across this giant neuron. In order to study the effect of feedback inhibition from this cell on odor information processing, we created a model of the olfactory network in the locust mushroom body with 50,000 KCs (Kenyon cells) reciprocally connected to this neuron. Finally, we added a model of the IG (Inhibitor of GGN) to reproduce in vivo odor responses in GGN.
16.  Mean-field systems and small scale neural networks (Ferguson et al. 2015)
We explore adaptation-induced bursting as a mechanism for theta oscillations in hippocampal area CA1. To do this, we have developed a mean-field system for a network of fitted Izhikevich neurons with sparse coupling and heterogeneity. The code contained here runs the mean-field system pointwise or on a two-parameter mesh, in addition to networks of neurons that are smaller than those considered in the paper. The file README.pdf contains instructions on use. Note that the file peakfinder is required to compute burst frequencies in the mean-field system and must be downloaded and placed in the same root folder as MFSIMULATOR.mat.
17.  Motor system model with reinforcement learning drives virtual arm (Dura-Bernal et al 2017)
"We implemented a model of the motor system with the following components: dorsal premotor cortex (PMd), primary motor cortex (M1), spinal cord and musculoskeletal arm (Figure 1). PMd modulated M1 to select the target to reach, M1 excited the descending spinal cord neurons that drove the arm muscles, and received arm proprioceptive feedback (information about the arm position) via the ascending spinal cord neurons. The large-scale model of M1 consisted of 6,208 spiking Izhikevich model neurons [37] of four types: regular-firing and bursting pyramidal neurons, and fast-spiking and low-threshold-spiking interneurons. These were distributed across cortical layers 2/3, 5A, 5B and 6, with cell properties, proportions, locations, connectivity, weights and delays drawn primarily from mammalian experimental data [38], [39], and described in detail in previous work [29]. The network included 486,491 connections, with synapses modeling properties of four different receptors ..."
18.  Norns - Neural Network Studio (Visser & Van Gils 2014)
The Norns - Neural Network Studio is a software package for designing, simulating and analyzing networks of spiking neurons. It consists of three parts: 1. "Urd", a Matlab frontend with high-level functions for quickly defining networks; 2. "Verdandi", an optimized C++ simulation environment which runs the simulation defined by Urd; 3. "Skuld", an advanced Matlab graphical user interface (GUI) for visual inspection of simulated data.
19.  Parallelizing large networks in NEURON (Lytton et al. 2016)
"Large multiscale neuronal network simulations and innovative neurotechnologies are required for development of brain models, and development of these models requires new simulation technologies. We describe here the current use of the NEURON simulator with MPI (message passing interface) for simulation in the domain of moderately large networks on commonly available High Performance Computers (HPCs). We discuss the basic layout of such simulations, including the methods of simulation setup, the run-time spike passing paradigm and post-simulation data storage and data management approaches. We also compare three types of networks, ..."
20.  Reproducing infra-slow oscillations with dopaminergic modulation (Kobayashi et al 2017)
" ... In this paper, to reproduce ISO (Infra-Slow Oscillations) in neural networks, we show that dopaminergic modulation of STDP is essential. More specifically, we discovered a close relationship between two dopaminergic effects: modulation of the STDP function and generation of ISO. We therefore, numerically investigated the relationship in detail and proposed a possible mechanism by which ISO is generated."
21.  Robust modulation of integrate-and-fire models (Van Pottelbergh et al 2018)
"By controlling the state of neuronal populations, neuromodulators ultimately affect behavior. A key neuromodulation mechanism is the alteration of neuronal excitability via the modulation of ion channel expression. This type of neuromodulation is normally studied with conductance-based models, but those models are computationally challenging for large-scale network simulations needed in population studies. This article studies the modulation properties of the multiquadratic integrate-and-fire model, a generalization of the classical quadratic integrate-and-fire model. The model is shown to combine the computational economy of integrate-and-fire modeling and the physiological interpretability of conductance-based modeling. It is therefore a good candidate for affordable computational studies of neuromodulation in large networks."
22.  Role for short term plasticity and OLM cells in containing spread of excitation (Hummos et al 2014)
This hippocampus model was developed by matching experimental data, including neuronal behavior, synaptic current dynamics, network spatial connectivity patterns, and short-term synaptic plasticity. Furthermore, it was constrained to perform pattern completion and separation under the effects of acetylcholine. The model was then used to investigate the role of short-term synaptic depression at the recurrent synapses in CA3, and inhibition by basket cell (BC) interneurons and oriens lacunosum-moleculare (OLM) interneurons in containing the unstable spread of excitatory activity in the network.
23.  Supervised learning in spiking neural networks with FORCE training (Nicola & Clopath 2017)
The code contained in the zip file runs FORCE training for various examples from the paper: Figure 2 (Oscillators and Chaotic Attractor), Figure 3 (Ode to Joy), Figure 4 (Song Bird Example), Figure 5 (Movie Example), Supplementary Figures 10-12 (Classifier), Supplementary Ode to Joy Example, Supplementary Figure 2 (Oscillator Panel), and Supplementary Figure 17 (Long Ode to Joy). Note that due to file size limitations, the supervisors for Figures 4/5 are not included. See Nicola, W., & Clopath, C. (2016), Supervised Learning in Spiking Neural Networks with FORCE Training, arXiv preprint arXiv:1609.02545, for further details.
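FORCE training drives a recurrent network's readout to match a target signal by updating the readout weights online with recursive least squares (RLS) while feeding the readout back into the network. A minimal rate-based sketch of the idea (the paper applies FORCE to spiking networks; the network size, gain, and target here are illustrative choices, not the paper's settings):

```python
import numpy as np

rng = np.random.default_rng(0)
N, dt, g = 200, 0.05, 1.5
J = g * rng.standard_normal((N, N)) / np.sqrt(N)  # random recurrent weights
w = np.zeros(N)                                   # readout weights (trained)
wf = rng.uniform(-1.0, 1.0, N)                    # fixed feedback weights
P = np.eye(N)                                     # RLS inverse-correlation estimate
x = 0.5 * rng.standard_normal(N)
r, z = np.tanh(x), 0.0

T = 4000
target = np.sin(2 * np.pi * np.arange(T) * dt / 10.0)  # teach a sine wave

errors = []
for t in range(T):
    # rate dynamics with readout fed back into the network
    x += dt * (-x + J @ r + wf * z)
    r = np.tanh(x)
    z = w @ r
    err = z - target[t]
    errors.append(abs(err))
    # RLS (FORCE) update of the readout weights
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)
    P -= np.outer(k, Pr)
    w -= err * k
```

After a transient, `z` tracks the target and the weight updates shrink; the paper's contribution is showing that the same online scheme works when the rate units are replaced by spiking (e.g. Izhikevich-type) neurons.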
