Circuits that contain the Modeling Application: C or C++ program (Home Page)

(These models are written in C or C++.)
    Models and descriptions
1. A computational model of oxytocin modulation of olfactory recognition memory (Linster & Kelsch 2019)
Model of olfactory bulb (OB) and anterior olfactory nucleus (AON) pyramidal cells. Includes olfactory sensory neurons, mitral cells, periglomerular, external tufted, and granule interneurons, and pyramidal cells. Can be built to include a feedback loop between OB and AON. Output consists of voltage and spikes over time in all neurons. The model can be stimulated with simulated odorants. The code submitted here has served for a number of modeling explorations of olfactory bulb and cortex. The model architecture is defined in "bulb.dat", with synapses defined in "channels.dat". The main function to run the model is in "neuron.c". The model architecture is constructed in "set.c" from types defined in "sim.c". A makefile to create an executable is provided in "neuron.mak".
2. A computational model of systems memory consolidation and reconsolidation (Helfer & Shultz 2019)
A neural-network framework for modeling systems memory consolidation and reconsolidation.
3. A detailed data-driven network model of prefrontal cortex (Hass et al 2016)
Data-based PFC-like circuit with layers 2/3 and 5, synaptic clustering, four types of interneurons, and cell-type-specific short-term synaptic plasticity; neuron parameters were fitted to in vitro data, and all other parameters were constrained by the experimental literature. Reproduces key features of in vivo resting-state activity without specific tuning.
4. A dynamical model of the basal ganglia (Leblois et al 2006)
We propose a new model for the function and dysfunction of the basal ganglia (BG). The basal ganglia are a set of cerebral structures involved in motor control whose dysfunction causes high-incidence pathologies such as Parkinson's disease (PD). Their precise motor functions remain unknown. The classical model of the BG, which allowed for the discovery of new treatments for PD, seems today outdated in several respects. Based on experimental observations, our model proposes a simple dynamical framework for understanding how the BG may select motor programs to be executed. Moreover, we explain how this ability is lost and how tremor-related oscillations in neuronal activity may emerge in PD.
5. A model of antennal lobe of bee (Chen JY et al. 2015)
" ... Here we use calcium imaging to reveal how responses across antennal lobe projection neurons change after association of an input odor with appetitive reinforcement. After appetitive conditioning to 1-hexanol, the representation of an odor mixture containing 1-hexanol becomes more similar to this odor and less similar to the background odor acetophenone. We then apply computational modeling to investigate how changes in synaptic connectivity can account for the observed plasticity. Our study suggests that experience-dependent modulation of inhibitory interactions in the antennal lobe aids perception of salient odor components mixed with behaviorally irrelevant background odors."
6. A Moth MGC Model: A HH network with quantitative rate reduction (Buckley & Nowotny 2011)
We provide the model used in Buckley & Nowotny (2011). It consists of a network of Hodgkin-Huxley neurons coupled by slow GABA_B synapses, which is run alongside the quantitative reduction described in the associated paper.
7. A NN with synaptic depression for testing the effects of connectivity on dynamics (Jacob et al 2019)
Here we use a 10,000-neuron model. The neurons are a mixture of excitatory and inhibitory integrate-and-fire neurons connected by synapses that exhibit synaptic depression. Three different connectivity paradigms were tested to look for spontaneous transitions between interictal spiking and seizure: uniform, small-world, and scale-free. All three model types are included here.
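Depressing synapses in network models of this kind are often of the Tsodyks-Markram type. As a rough C sketch (an illustrative assumption, not necessarily the exact formulation used in Jacob et al. 2019), one Euler step of such a synapse could look like:

```c
/* Sketch of a depressing synapse in the Tsodyks-Markram style
   (illustrative assumption, not the paper's formulation):
   the available resource x recovers toward 1 with time constant
   tau_rec, and each presynaptic spike releases a fraction U of it. */
typedef struct {
    double x;        /* available synaptic resource, 0..1 */
    double U;        /* release fraction per spike */
    double tau_rec;  /* recovery time constant (ms) */
} dep_synapse;

/* One Euler step of length dt (ms); returns the released resource,
   which would scale the postsynaptic current. */
double dep_synapse_step(dep_synapse *s, double dt, int presyn_spike)
{
    double released = 0.0;
    s->x += dt * (1.0 - s->x) / s->tau_rec;  /* recovery */
    if (presyn_spike) {
        released = s->U * s->x;              /* resource used by this spike */
        s->x -= released;                    /* depression */
    }
    return released;
}
```

Repeated spikes deplete x faster than it recovers, so the response to a spike train decays, which is the signature of synaptic depression.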
8. A unified thalamic model of multiple distinct oscillations (Li, Henriquez and Fröhlich 2017)
We present a unified model of the thalamus that is capable of independently generating multiple distinct oscillations (delta, spindle, alpha and gamma oscillations) under different levels of acetylcholine (ACh) and norepinephrine (NE) modulation corresponding to different physiological conditions (deep sleep, light sleep, relaxed wakefulness and attention). The model also shows that entrainment of thalamic oscillations is state-dependent.
9. An oscillatory neural model of multiple object tracking (Kazanovich and Borisyuk 2006)
An oscillatory neural network model of multiple object tracking is described. The model works with a set of identical visual objects moving around the screen. At the initial stage, the model selects into the focus of attention a subset of objects initially marked as targets. Other objects are used as distractors. The model aims to preserve the initial separation between targets and distractors while objects are moving. This is achieved by a proper interplay of synchronizing and desynchronizing interactions in a multilayer network, where each layer is responsible for tracking a single target. The results of the model simulation are presented and compared with experimental data. In agreement with experimental evidence, simulations with a larger number of targets have shown higher error rates. Also, the functioning of the model in the case of temporarily overlapping objects is presented.
10. Basis for temporal filters in the cerebellar granular layer (Roessert et al. 2015)
This contains the models, functions and resulting data as used in: Roessert C, Dean P, Porrill J. At the Edge of Chaos: How Cerebellar Granular Layer Network Dynamics Can Provide the Basis for Temporal Filters. It is based on code used for Yamazaki T, Tanaka S (2005) Neural modeling of an internal clock. Neural Comput 17:1032-58
11. Biologically Constrained Basal Ganglia model (BCBG model) (Lienard, Girard 2014)
We studied the physiology and function of the basal ganglia through the design of mean-field models of the whole basal ganglia. The parameterizations are optimized with a multi-objective evolutionary algorithm to best respect a collection of numerous anatomical and electrophysiological data. The main outcomes of our study are:
• The strength of the GPe to GPi/SNr connection does not support opposed activities in the GPe and GPi/SNr.
• STN and MSN target the GPe more than the GPi/SNr.
• Selection arises from the structure of the basal ganglia, without properly segregated direct and indirect pathways and without specific inputs from pyramidal tract neurons of the cortex. Selection is enhanced when the projection from GPe to GPi/SNr has a diffuse pattern.
12. Cancelling redundant input in ELL pyramidal cells (Bol et al. 2011)
The paper investigates the property of the electrosensory lateral line lobe (ELL) of the brain of weakly electric fish to cancel predictable stimuli. Electroreceptors on the skin encode all signals in their firing activity, but superficial pyramidal (SP) cells in the ELL that receive this feedforward input do not respond to constant sinusoidal signals. This cancellation putatively occurs using a network of feedback delay lines and burst-induced synaptic plasticity between the delay lines and the SP cell that learns to cancel the redundant input. Biologically, the delay lines are parallel fibres from cerebellar-like granule cells in the eminentia granularis posterior. A model of this network (e.g. electroreceptors, SP cells, delay lines and burst-induced plasticity) was constructed to test whether the current knowledge of how the network operates is sufficient to cancel redundant stimuli.
13. Cerebellar gain and timing control model (Yamazaki & Tanaka 2007)(Yamazaki & Nagao 2012)
This paper proposes a hypothetical computational mechanism for unified gain and timing control in the cerebellum. The hypothesis is justified by computer simulations of a large-scale spiking network model of the cerebellum.
14. Cerebellar memory consolidation model (Yamazaki et al. 2015)
"Long-term depression (LTD) at parallel fiber-Purkinje cell (PF-PC) synapses is thought to underlie memory formation in cerebellar motor learning. Recent experimental results, however, suggest that multiple plasticity mechanisms in the cerebellar cortex and cerebellar/vestibular nuclei participate in memory formation. To examine this possibility, we formulated a simple model of the cerebellum with a minimal number of components based on its known anatomy and physiology, implementing both LTD and long-term potentiation (LTP) at PF-PC synapses and mossy fiber-vestibular nuclear neuron (MF-VN) synapses. With this model, we conducted a simulation study of the gain adaptation of optokinetic response (OKR) eye movement. Our model reproduced several important aspects of previously reported experimental results in wild-type and cerebellum-related gene-manipulated mice. ..."
15. Cerebellar Model for the Optokinetic Response (Kim and Lim 2021)
We consider a cerebellar spiking neural network for the optokinetic response (OKR). Individual granule (GR) cells exhibit diverse spiking patterns which are in-phase, anti-phase, or complex out-of-phase with respect to their population-averaged firing activity. These diversely recoded signals, carried via parallel fibers (PFs) from GR cells, are effectively depressed by the error-teaching signals carried via climbing fibers from the inferior olive, which are in-phase. Synaptic weights at in-phase PF-Purkinje cell (PC) synapses of active GR cells are strongly depressed via strong long-term depression (LTD), while those at anti-phase and complex out-of-phase PF-PC synapses are weakly depressed through weak LTD. This "effective" depression at the PF-PC synapses causes a large modulation in PC firing, which then exerts effective inhibitory coordination on the vestibular nucleus (VN) neuron (which evokes OKR). For the firing of the VN neuron, the learning gain degree, corresponding to the modulation gain ratio, increases with the learning cycle and then saturates.
16. Coding of stimulus frequency by latency in thalamic networks (Golomb et al 2005)
The paper presents models of the rat vibrissa processing system including the posterior medial (POm) thalamus, ventroposterior medial (VPm) thalamus, and GABAB-mediated feedback inhibition from the reticular thalamic (Rt) nucleus. A clear match between the experimentally measured spike rates and the numerically calculated rates for the full model occurs when VPm thalamus receives stronger brainstem input and weaker GABAB-mediated inhibition than POm thalamus.
17. Competition model of pheromone ratio detection (Zavada et al. 2011)
For some closely related sympatric moth species, recognizing a specific pheromone component concentration ratio is essential for mating success. We propose and test a minimalist competition-based feed-forward neuronal model capable of detecting a certain ratio of pheromone components independently of overall concentration. This model represents an elementary recognition unit for binary mixtures which we propose is entirely contained in the macroglomerular complex (MGC) of the male moth. A set of such units, along with projection neurons (PNs), can provide the input to higher brain centres. We found that (1) accuracy is mainly achieved by maintaining a certain ratio of connection strengths between olfactory receptor neurons (ORNs) and local neurons (LNs), much less by properties of the interconnections between the competing LNs proper; (2) successful ratio recognition is achieved using latency-to-first-spike in the LN populations; and (3) longer durations of the competition process between LNs did not result in higher recognition accuracy.
18. Conductance-based model of Layer-4 in the barrel cortex (Argaman & Golomb 2017)
Layer 4 in the mouse barrel cortex includes hundreds of inhibitory PV neurons and thousands of excitatory neurons. Despite this fact, its dynamical state is similar to a balanced state of large neuronal circuits.
19. Connection-set Algebra (CSA) for the representation of connectivity in NN models (Djurfeldt 2012)
"The connection-set algebra (CSA) is a novel and general formalism for the description of connectivity in neuronal network models, from small-scale to large-scale structure. ... The expressiveness of CSA makes prototyping of network structure easy. A C++ version of the algebra has been implemented and used in a large-scale neuronal network simulation (Djurfeldt et al., IBM J Res Dev 52(1/2):31–42, 2008b) and an implementation in Python has been publicly released."
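The core idea, connection sets as composable mathematical objects, can be sketched in C (a toy illustration of the concept only; the names below are made up, and the real CSA implementations in C++ and Python have a much richer operator set):

```c
#include <stdbool.h>

/* Toy version of the connection-set idea: a connection set is a
   predicate over (source, target) index pairs, and sets are combined
   with set operators such as union and intersection. */
typedef bool (*conn_set)(int i, int j);

static bool one_to_one(int i, int j) { return i == j; }                /* diagonal */
static bool all_to_all(int i, int j) { (void)i; (void)j; return true; }

/* Membership in the union of two connection sets. */
static bool in_union(conn_set a, conn_set b, int i, int j)
{
    return a(i, j) || b(i, j);
}

/* Membership in the intersection of two connection sets. */
static bool in_intersection(conn_set a, conn_set b, int i, int j)
{
    return a(i, j) && b(i, j);
}
```

Because a set is just a predicate, arbitrarily large populations need no explicit connectivity matrix; a simulator can query membership pair by pair, which is what makes the formalism scale.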
20. COREM: configurable retina simulator (Martínez-Cañada et al., 2016)
COREM is a configurable simulator for retina modeling that has been implemented within the framework of the Human Brain Project (HBP). The software platform can be interfaced with neural simulators (e.g., NEST) to connect with models of higher visual areas and with the Neurorobotics Platform of the HBP. The code is implemented in C++, and computations of spatiotemporal equations are optimized by means of recursive filtering techniques and multithreading. Most retina simulators are focused on fitting specific retina functions. By contrast, the versatility of COREM allows the configuration of different retina models using a set of basic retina computational primitives. We implemented a series of retina models by combining these primitives to characterize some of the best-known phenomena observed in the retina: adaptation to the mean light intensity and temporal contrast, and differential motion sensitivity. The code has been extensively tested in Linux. The software can also be adapted to Mac OS. Installation instructions as well as the user manual can be found in the GitHub repository.
21. Cortex learning models (Weber et al. 2006, Weber and Triesch 2006, Weber and Wermter 2006/7)
A simulator and the configuration files for three publications are provided. First, "A hybrid generative and predictive model of the motor cortex" (Weber et al. 2006), which uses reinforcement learning to set up a toy action scheme, then uses unsupervised learning to "copy" the learnt action, and an attractor network to predict the hidden code of the unsupervised network. Second, "A Self-Organizing Map of Sigma-Pi Units" (Weber and Wermter 2006/7) learns frame-of-reference transformations on population codes in an unsupervised manner. Third, "A possible representation of reward in the learning of saccades" (Weber and Triesch, 2006) implements saccade learning with two possible learning schemes for horizontal and vertical saccades, respectively.
22. Default mode network model (Matsui et al 2014)
Default mode network (DMN) shows intrinsic, high-level activity at rest. We tested a hypothesis proposed for its role in sensory information processing: intrinsic DMN activity facilitates neural responses to sensory input. A neural network model, consisting of a sensory network (Nsen) and a DMN, was simulated. The Nsen contained cell assemblies. Each cell assembly comprised principal cells, GABAergic interneurons (Ia, Ib), and glial cells. We let the Nsen carry out a perceptual task: detection of sensory stimuli. … This enabled the Nsen to reliably detect the stimulus. We suggest that intrinsic default mode network activity may accelerate the reaction speed of the sensory network by modulating its ongoing spontaneous activity in a subthreshold manner. Ambient GABA contributes to achieving an optimal ongoing spontaneous subthreshold neuronal state, in which GABAergic gliotransmission triggered by the intrinsic default mode network activity may play an important role.
23. Development of orientation-selective simple cell receptive fields (Rishikesh and Venkatesh, 2003)
Implementation of a computational model for the development of simple-cell receptive fields spanning the regimes before and after eye-opening. The before eye-opening period is governed by a correlation-based rule from Miller (Miller, J. Neurosci., 1994), and the post eye-opening period is governed by a self-organizing, experience-dependent dynamics derived in the reference below.
24. Distributed cerebellar plasticity implements adaptable gain control (Garrido et al., 2013)
We tested the role of plasticity distributed over multiple synaptic sites (Hansel et al., 2001; Gao et al., 2012) by generating an analog cerebellar model embedded into a control loop connected to a robotic simulator. The robot used a three-joint arm and performed repetitive fast manipulations with different masses along an 8-shape trajectory. In accordance with biological evidence, the cerebellum model was endowed with both LTD and LTP at the PF-PC, MF-DCN and PC-DCN synapses. This resulted in a network scheme whose effectiveness was extended considerably compared to one including just PF-PC synaptic plasticity. Indeed, the system including distributed plasticity reliably self-adapted to manipulate different masses and to learn the arm-object dynamics over a time course that included fast learning and consolidation, along the lines of what has been observed in behavioral tests. In particular, PF-PC plasticity operated as a time correlator between the actual input state and the system error, while MF-DCN and PC-DCN plasticity played a key role in generating the gain controller. This model suggests that distributed synaptic plasticity allows generation of the complex learning properties of the cerebellum.
25. Duration-tuned neurons from the inferior colliculus of the big brown bat (Aubie et al. 2009)
dtnet is a generalized neural network simulator written in C++ with an easy-to-use XML description language to generate arbitrary neural networks and then run simulations covering many different parameter values. For example, you can specify ranges of parameter values for several different connection weights and then automatically run simulations over all possible parameters. Graphing ability is built in as long as the free, open-source graphing application GLE is installed. Included in the examples folder are simulation descriptions that were used to generate the results in Aubie et al. (2009). Refer to the README file for instructions on compiling and running these examples. The most recent source code can be obtained from GitHub.
26. Dynamics in random NNs with multiple neuron subtypes (Pena et al 2018, Tomov et al 2014, 2016)
"Spontaneous cortical population activity exhibits a multitude of oscillatory patterns, which often display synchrony during slow-wave sleep or under certain anesthetics and stay asynchronous during quiet wakefulness. The mechanisms behind these cortical states and transitions among them are not completely understood. Here we study spontaneous population activity patterns in random networks of spiking neurons of mixed types modeled by Izhikevich equations. Neurons are coupled by conductance-based synapses subject to synaptic noise. We localize the population activity patterns on the parameter diagram spanned by the relative inhibitory synaptic strength and the magnitude of synaptic noise. In absence of noise, networks display transient activity patterns, either oscillatory or at constant level. The effect of noise is to turn transient patterns into persistent ones: for weak noise, all activity patterns are asynchronous non-oscillatory independently of synaptic strengths; for stronger noise, patterns have oscillatory and synchrony characteristics that depend on the relative inhibitory synaptic strength. ..."
27. Emergence of Connectivity Motifs in Networks of Model Neurons (Vasilaki, Giugliano 2014)
Recent evidence suggests that short-term dynamics of excitatory synaptic transmission is correlated to stereotypical connectivity motifs. We show that these connectivity motifs emerge in networks of model neurons, from the interactions between short-term synaptic dynamics (SD) and long-term spike-timing dependent plasticity (STDP).
28. Formation of synfire chains (Jun and Jin 2007)
"Temporally precise sequences of neuronal spikes that span hundreds of milliseconds are observed in many brain areas, including songbird premotor nucleus, cat visual cortex, and primary motor cortex. Synfire chains—networks in which groups of neurons are connected via excitatory synapses into a unidirectional chain—are thought to underlie the generation of such sequences. It is unknown, however, how synfire chains can form in local neural circuits, especially for long chains. Here, we show through computer simulation that long synfire chains can develop through spike-time dependent synaptic plasticity and axon remodeling—the pruning of prolific weak connections that follows the emergence of a finite number of strong connections. ..."
29. Generating oscillatory bursts from a network of regular spiking neurons (Shao et al. 2009)
Avian nucleus isthmi pars parvocellularis (Ipc) neurons are reciprocally connected with the tectal layer 10 (L10) neurons and respond with oscillatory bursts to visual stimulation. To elucidate mechanisms of oscillatory bursting in this network of regularly spiking neurons, we investigated an experimentally constrained model of coupled leaky integrate-and-fire neurons with spike-rate adaptation. The model reproduces the observed Ipc oscillatory bursting in response to simulated visual stimulation.
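As a minimal C sketch of the model class (a leaky integrate-and-fire neuron with spike-rate adaptation; the parameters and units are illustrative, not the experimentally constrained values of the paper):

```c
/* Leaky integrate-and-fire neuron with spike-rate adaptation
   (illustrative sketch, not the paper's constrained model).
   v decays toward rest (0) and is driven by input I minus the
   adaptation current w; each spike resets v and increments w by b,
   so sustained input produces progressively slower firing. */
typedef struct {
    double v;  /* membrane potential */
    double w;  /* adaptation current */
} lif_state;

/* One Euler step of length dt; returns 1 if the neuron spiked. */
int lif_step(lif_state *s, double dt, double I,
             double tau_m, double tau_w, double b,
             double v_th, double v_reset)
{
    s->v += dt * (-s->v - s->w + I) / tau_m;
    s->w += dt * (-s->w) / tau_w;
    if (s->v >= v_th) {
        s->v = v_reset;  /* reset after spike */
        s->w += b;       /* spike-triggered adaptation increment */
        return 1;
    }
    return 0;
}
```

Coupling many such units excitatorily in a loop, as in the Ipc-L10 network, lets adaptation terminate each volley of spikes while the reciprocal excitation restarts it, producing oscillatory bursts from regularly spiking cells.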
30. Hotspots of dendritic spine turnover facilitate new spines and NN sparsity (Frank et al 2018)
Model for the following publication: Adam C. Frank, Shan Huang, Miou Zhou, Amos Gdalyahu, George Kastellakis, Panayiota Poirazi, Tawnie K. Silva, Ximiao Wen, Joshua T. Trachtenberg, and Alcino J. Silva Hotspots of Dendritic Spine Turnover Facilitate Learning-related Clustered Spine Addition and Network Sparsity
31. Huntington's disease model (Gambazzi et al. 2010)
"Although previous studies of Huntington’s disease (HD) have addressed many potential mechanisms of striatal neuron dysfunction and death, it is also known based on clinical findings that cortical function is dramatically disrupted in HD. With respect to disease etiology, however, the specific molecular and neuronal circuit bases for the cortical effects of mutant huntingtin (htt) have remained largely unknown. In the present work we studied the relation between the molecular effects of mutant htt fragments in cortical cells and the corresponding behavior of cortical neuron microcircuits using a novel cellular model of HD. We observed that a transcript-selective diminution in activity-dependent BDNF expression preceded the onset of a synaptic connectivity deficit in ex vivo cortical networks, which manifested as decreased spontaneous collective burst-firing behavior measured by multi-electrode array substrates. Decreased BDNF expression was determined to be a significant contributor to network-level dysfunction, as shown by the ability of exogenous BDNF to ameliorate cortical microcircuit burst firing. The molecular determinants of the dysregulation of activity-dependent BDNF expression by mutant htt appear to be distinct from previously elucidated mechanisms, as they do not involve known NRSF/REST-regulated promoter sequences, but instead result from dysregulation of BDNF exon IV and VI transcription. These data elucidate a novel HD-related deficit in BDNF gene regulation as a plausible mechanism of cortical neuron hypoconnectivity and cortical function deficits in HD. Moreover, the novel model paradigm established here is well-suited to further mechanistic and drug screening research applications. A simple mathematical model is proposed to interpret the observations and to explore the impact of specific synaptic dysfunctions on network activity. 
Interestingly, the model predicts a decrease in synaptic connectivity to be an early effect of mutant huntingtin in cortical neurons, supporting the hypothesis of decreased, rather than increased, synchronized cortical firing in HD."
32. Inhibition and glial-K+ interaction leads to diverse seizure transition modes (Ho & Truccolo 2016)
"How focal seizures initiate and evolve in human neocortex remains a fundamental problem in neuroscience. Here, we use biophysical neuronal network models of neocortical patches to study how the interaction between inhibition and extracellular potassium ([K+]o) dynamics may contribute to different types of focal seizures. Three main types of propagated focal seizures observed in recent intracortical microelectrode recordings in humans were modelled ..."
33. L4 cortical barrel NN model receiving thalamic input during whisking or touch (Gutnisky et al. 2017)
Excitatory neurons in layer 4 (L4) of the barrel cortex respond relatively strongly to touch but not to whisker movement (Yu et al., Nat. Neurosci. 2016). The model explains the mechanism underlying this effect. The network is tuned to filter out most stationary inputs. Brief touch input passes through because it takes time until feed-forward inhibition silences the excitatory neurons that receive brief, strong thalamic excitation.
34. Large cortex model with map-based neurons (Rulkov et al 2004)
We develop a new computationally efficient approach for the analysis of complex large-scale neurobiological networks. Its key element is the use of a new phenomenological model of a neuron capable of replicating important spike pattern characteristics and designed in the form of a system of difference equations (a map). ... Interconnected with synaptic currents these model neurons demonstrated responses very similar to those found with Hodgkin-Huxley models and in experiments. We illustrate the efficacy of this approach in simulations of one- and two-dimensional cortical network models consisting of regular spiking neurons and fast spiking interneurons to model sleep and activated states of the thalamocortical system. See paper for more.
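The phenomenological map at the heart of this approach is the two-dimensional Rulkov map. A minimal C sketch of one common (chaotic) form is given below; the published network model uses a piecewise-defined variant, so treat this as an illustration of the idea, not the paper's exact equations:

```c
/* Two-dimensional Rulkov map (one common chaotic form):
     x[n+1] = alpha / (1 + x[n]^2) + y[n]   (fast, spiking variable)
     y[n+1] = y[n] - mu * (x[n] - sigma)    (slow variable; mu << 1)
   Depending on alpha and sigma, iterating the map yields silence,
   tonic spiking, or bursts of spikes. */
typedef struct { double x, y; } rulkov_state;

rulkov_state rulkov_step(rulkov_state s, double alpha, double mu, double sigma)
{
    rulkov_state next;
    next.x = alpha / (1.0 + s.x * s.x) + s.y;
    next.y = s.y - mu * (s.x - sigma);
    return next;
}
```

Because each update is a handful of arithmetic operations rather than the solution of stiff differential equations, thousands of such units can be iterated cheaply, which is the efficiency argument of the paper.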
35. Linking dynamics of the inhibitory network to the input structure (Komarov & Bazhenov 2016)
Code to model 10 all-to-all coupled inhibitory neurons.
36. Mechanisms of very fast oscillations in axon networks coupled by gap junctions (Munro, Borgers 2010)
Axons connected by gap junctions can produce very fast oscillations (VFOs, > 80 Hz) when stimulated randomly at a low rate. The models here explore the mechanisms of VFOs that can be seen in an axonal plexus (Munro & Borgers, 2009): a large network model of an axonal plexus, small network models of axons connected by gap junctions, and an implementation of the model underlying figure 12 in Traub et al. (1999). The large network model consists of 3,072 5-compartment axons connected in a random network. The 5-compartment axons are the 5 axonal compartments from the CA3 pyramidal cell model in Traub et al. (1994) with a fixed somatic voltage. The random network has the same parameters as the random network in Traub et al. (1999), and axons are stimulated randomly via a Poisson process with a rate of 2/s/axon. The small network models simulate waves propagating through small networks of axons connected by gap junctions to study how local connectivity affects the refractory period.
37. Microsaccades and synchrony coding in the retina (Masquelier et al. 2016)
We show that microsaccades (MS) enable efficient synchrony-based coding among the primate retinal ganglion cells (RGC). We find that each MS causes certain RGCs to fire synchronously, namely those whose receptive fields contain contrast edges after the MS. The emitted synchronous spike volley thus rapidly transmits the most salient edges of the stimulus. We demonstrate that the readout could be done rapidly by simple coincidence-detector neurons, and that the required connectivity could emerge spontaneously with spike timing-dependent plasticity.
38. Model of memory linking through memory allocation (Kastellakis et al. 2016)
Here, we present a simplified, biophysically inspired network model that incorporates multiple plasticity processes and explains linking of information at three different levels: (a) learning of a single associative memory, (b) rescuing of a weak memory when paired with a strong one, and (c) linking of multiple memories across time. By dissecting synaptic from intrinsic plasticity and neuron-wide from dendritically restricted protein capture, the model reveals a simple, unifying principle: linked memories share synaptic clusters within the dendrites of overlapping populations of neurons.
39. Models for cortical UP-DOWN states in a bistable inhibitory-stabilized network (Jercog et al 2017)
In the idling brain, neuronal circuits transition between periods of sustained firing (UP state) and quiescence (DOWN state), a pattern the mechanisms of which remain unclear. We analyzed spontaneous cortical population activity from anesthetized rats and found that UP and DOWN durations were highly variable and that population rates showed no significant decay during UP periods. We built a network rate model with excitatory (E) and inhibitory (I) populations exhibiting a novel bistable regime between a quiescent and an inhibition-stabilized state of arbitrarily low rate, where fluctuations triggered state transitions. In addition, we implemented these mechanisms in a more biophysically realistic spiking network, where DOWN-to-UP transitions are caused by synchronous high-amplitude events impinging onto the network.
40. NETMORPH: creates NNs with realistic neuron morphologies (Koene et al. 2009, van Ooyen et al. 2014)
NETMORPH is a simulation tool for building synaptically connected networks with realistic neuron morphologies. Axonal and dendritic morphologies are created by using stochastic rules for the behavior of individual growth cones, the structures at the tip of outgrowing axons and dendrites that mediate elongation and branching. Axons and dendrites are not guided by any extracellular cues. Synapses are formed when crossing axonal and dendritic segments come sufficiently close to each other. See the README in the archive for more information.
41. Network model with neocortical architecture (Anderson et al 2007,2012; Azhar et al 2012)
Architecturally realistic neocortical model using seven classes of excitatory and inhibitory single compartment Hodgkin-Huxley cells. This is an addendum to ModelDB Accession # 98902, Studies of stimulus parameters for seizure disruption (Anderson et al. 2007). Wiring is adapted from the minicolumn hypothesis and incorporates visual and neocortical wiring data. Simulation demonstrates spontaneous bursting onset and cessation. This activity can be induced by random fluctuations in the surrounding background input.
42. Networks of spiking neurons: a review of tools and strategies (Brette et al. 2007)
This package provides a series of codes that simulate networks of spiking neurons (excitatory and inhibitory, integrate-and-fire or Hodgkin-Huxley type, current-based or conductance-based synapses; some of them are event-based). The same networks are implemented in different simulators (NEURON, GENESIS, NEST, NCS, CSIM, XPP, SPLIT, MVAspike; there are also a couple of implementations in SciLab and C++). The codes included in this package are benchmark simulations; see the associated review paper (Brette et al. 2007). The main goal is to provide a series of benchmark simulations of networks of spiking neurons and to demonstrate how these are implemented in the different simulators reviewed in the paper. See also details in the enclosed file Appendix2.pdf, which describes these different benchmarks. Some of these benchmarks were based on the Vogels-Abbott model (Vogels TP and Abbott LF 2005).
43. Neural mass model based on single cell dynamics to model pathophysiology (Zandt et al 2014)
The model code as described in "A neural mass model based on single cell dynamics to model pathophysiology" (Zandt et al. 2014, Journal of Computational Neuroscience). The neural mass model (NMM) is derived from single-cell dynamics in a bottom-up approach. Mean and standard deviation of the firing rates in the populations are calculated. The sigmoid is derived from the single-cell FI curve, allowing for easy implementation of pathological conditions. The NMM is compared with a detailed spiking network model consisting of HH neurons. The NMM code is in Matlab. The network model is simulated using Norns (ModelDB #154739).
44. Neural modeling of an internal clock (Yamazaki and Tanaka 2008)
"We studied a simple random recurrent inhibitory network. Despite its simplicity, the dynamics was so rich that activity patterns of neurons evolved with time without recurrence due to random recurrent connections among neurons. The sequence of activity patterns was generated by the trigger of an external signal, and the generation was stable against noise.... Therefore, a time passage from the trigger of an external signal could be represented by the sequence of activity patterns, suggesting that this model could work as an internal clock. ..."
45. NMDAR & GABAB/KIR Give Bistable Dendrites: Working Memory & Sequence Readout (Sanders et al., 2013)
" ...Here, we show that the voltage dependence of the inwardly rectifying potassium (KIR) conductance activated by GABA(B) receptors adds substantial robustness to network simulations of bistability and the persistent firing that it underlies. The hyperpolarized state is robust because, at hyperpolarized potentials, the GABA(B)/KIR conductance is high and the NMDA conductance is low; the depolarized state is robust because, at depolarized potentials, the NMDA conductance is high and the GABA(B)/KIR conductance is low. Our results suggest that this complementary voltage dependence of GABA(B)/KIR and NMDA conductances makes them a "perfect couple" for producing voltage bistability."
46. Norns - Neural Network Studio (Visser & Van Gils 2014)
The Norns - Neural Network Studio is a software package for designing, simulating, and analyzing networks of spiking neurons. It consists of three parts: 1. "Urd", a Matlab frontend with high-level functions for quickly defining networks; 2. "Verdandi", an optimized C++ simulation environment that runs the simulation defined by Urd; 3. "Skuld", an advanced Matlab graphical user interface (GUI) for visual inspection of simulated data.
47. Numerical Integration of Izhikevich and HH model neurons (Stewart and Bair 2009)
The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration timestep. We apply the Parker-Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin-Huxley type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods.
48. Olfactory bulb network model of gamma oscillations (Bathellier et al. 2006; Lagier et al. 2007)
This model implements a network of 100 mitral cells connected with asynchronous inhibitory "synapses" that is meant to reproduce the GABAergic transmission of ensembles of connected granule cells. For appropriate parameters of this special synapse the model generates gamma oscillations with properties very similar to what is observed in olfactory bulb slices (See Bathellier et al. 2006, Lagier et al. 2007). Mitral cells are modeled as single compartment neurons with a small number of different voltage gated channels. Parameters were tuned to reproduce the fast subthreshold oscillation of the membrane potential observed experimentally (see Desmaisons et al. 1999).
49. Optimal deep brain stimulation of the subthalamic nucleus-a computational study (Feng et al. 2007)
Here, we use a biophysically-based model of spiking cells in the basal ganglia (Terman et al., Journal of Neuroscience, 22, 2963-2976, 2002; Rubin and Terman, Journal of Computational Neuroscience, 16, 211-235, 2004) to provide computational evidence that alternative temporal patterns of DBS inputs might be equally effective as the standard high-frequency waveforms, but require lower amplitudes. Within this model, DBS performance is assessed in two ways. First, we determine the extent to which DBS causes Gpi (globus pallidus pars interna) synaptic outputs, which are burstlike and synchronized in the unstimulated Parkinsonian state, to cease their pathological modulation of simulated thalamocortical cells. Second, we evaluate how DBS affects the GPi cells' auto- and cross-correlograms.
50. Perceptual judgments via sensory-motor interaction assisted by cortical GABA (Hoshino et al 2018)
"Recurrent input to sensory cortex, via long-range reciprocal projections between motor and sensory cortices, is essential for accurate perceptual judgments. GABA levels in sensory cortices correlate with perceptual performance. We simulated a neuron-astrocyte network model to investigate how top-down, feedback signaling from a motor network (Nmot) to a sensory network (Nsen) affects perceptual judgments in association with ambient (extracellular) GABA levels. In the Nsen, astrocytic transporters modulated ambient GABA levels around pyramidal cells. A simple perceptual task was implemented: detection of a feature stimulus presented to the Nsen. ..."
51. Persistent synchronized bursting activity in cortical tissues (Golomb et al 2005)
The program simulates a one-dimensional model of a cortical tissue with excitatory and inhibitory populations.
52. Perturbation sensitivity implies high noise and suggests rate coding in cortex (London et al. 2010)
"... The network simulations were also based on a previously published model(Latham et al. 2000), but with modifications to allow the addition and detection of extra spikes (see Supplementary Information, section 7)."
53. Rate model of a cortical RS-FS-LTS network (Hayut et al. 2011)
A rate model of cortical networks composed of RS, FS and LTS neurons. Synaptic depression is modelled according to the Tsodyks-Markram scheme.
54. Reducing variability in motor cortex activity by GABA (Hoshino et al. 2019)
Interaction between sensory and motor cortices is crucial for perceptual decision-making, in which intracortical inhibition might have an important role. We simulated a neural network model consisting of a sensory network (NS) and a motor network (NM) to elucidate the significance of their interaction in perceptual decision-making in association with the GABA concentration in extracellular space. Extracellular GABA molecules acted on extrasynaptic receptors embedded in membranes of pyramidal cells and suppressed their activity. A reduction in extracellular GABA concentration either in NS or NM increased the rate of errors in perceptual decision-making, for which an increase in ongoing-spontaneous fluctuations in subthreshold neuronal activity in NM prior to sensory stimulation was responsible. Feedback (NM-to-NS) signaling enhanced selective neuronal responses in NS, which in turn increased stimulus-evoked neuronal activity in NM. We suggest that GABA in extracellular space contributes to reducing variability in motor cortex activity at a resting state, so that the motor cortex can respond correctly to a subsequent sensory stimulus. Feedback signaling from the motor cortex improves the selective responsiveness of the sensory cortex, which ensures the fidelity of information transmission to the motor cortex, leading to reliable perceptual decision-making.
55. Relative spike time coding and STDP-based orientation selectivity in V1 (Masquelier 2012)
Phenomenological spiking model of the cat early visual system. We show how natural vision can drive spike time correlations on sufficiently fast time scales to lead to the acquisition of orientation-selective V1 neurons through STDP. This is possible without reference times such as stimulus onsets, or saccade landing times. But even when such reference times are available, we demonstrate that the relative spike times encode the images more robustly than the absolute ones.
56. Robust Reservoir Generation by Correlation-Based Learning (Yamazaki & Tanaka 2008)
"Reservoir computing (RC) is a new framework for neural computation. A reservoir is usually a recurrent neural network with fixed random connections. In this article, we propose an RC model in which the connections in the reservoir are modifiable. ... We apply our RC model to trace eyeblink conditioning. The reservoir bridged the gap of an interstimulus interval between the conditioned and unconditioned stimuli, and a readout neuron was able to learn and express the timed conditioned response."
57. Sleep-wake transitions in corticothalamic system (Bazhenov et al 2002)
The authors investigate the transition between sleep and awake states with intracellular recordings in cats and computational models. The model describes many essential features of slow wave sleep and activated states as well as the transition between them.
58. Spinal circuits controlling limb coordination and gaits in quadrupeds (Danner et al 2017)
Simulation of spinal neural networks involved in the central control of interlimb coordination and speed-dependent gait expression in quadrupeds.
59. STDP promotes synchrony of inhibitory networks in the presence of heterogeneity (Talathi et al 2008)
"Recently, Haas et al. (J Neurophysiol 96: 3305–3313, 2006) observed a novel form of spike timing dependent plasticity (iSTDP) in GABAergic synaptic couplings in layer II of the entorhinal cortex. Depending on the relative timing of the presynaptic input at time t(pre) and the postsynaptic excitation at time t(post), the synapse is strengthened (delta_t = t(post) - t(pre) > 0) or weakened (delta_t < 0). The temporal dynamic range of the observed STDP rule was found to lie in the higher gamma frequency band (> or = 40 Hz), a frequency range important for several vital neuronal tasks. In this paper we study the function of this novel form of iSTDP in the synchronization of the inhibitory neuronal network. In particular we consider a network of two unidirectionally coupled interneurons (UCI) and two mutually coupled interneurons (MCI), in the presence of heterogeneity in the intrinsic firing rates of each coupled neuron. ..."
60. Stochastic and periodic inputs tune ongoing oscillations (Hutt et al. 2016)
" ... We here analyze a network of recurrently connected spiking neurons with time delay displaying stable synchronous dynamics. Using mean-field and stability analyses, we investigate the influence of dynamic inputs on the frequency of firing rate oscillations. ..."
61. Studies of stimulus parameters for seizure disruption using NN simulations (Anderson et al. 2007)
Architecturally realistic neocortical model using seven classes of excitatory and inhibitory single compartment Hodgkin-Huxley cells. Wiring is adapted from the minicolumn hypothesis and incorporates visual and neocortical wiring data. Simulation demonstrates spontaneous bursting onset and cessation, and activity can be altered with an external electric field.
62. Temporal integration by stochastic recurrent network (Okamoto et al. 2007)
"Temporal integration of externally or internally driven information is required for a variety of cognitive processes. This computation is generally linked with graded rate changes in cortical neurons, which typically appear during a delay period of cognitive task in the prefrontal and other cortical areas. Here, we present a neural network model to produce graded (climbing or descending) neuronal activity. Model neurons are interconnected randomly by AMPA-receptor–mediated fast excitatory synapses and are subject to noisy background excitatory and inhibitory synaptic inputs. In each neuron, a prolonged afterdepolarizing potential follows every spike generation. Then, driven by an external input, the individual neurons display bimodal rate changes between a baseline state and an elevated firing state, with the latter being sustained by regenerated afterdepolarizing potentials. ..."
