Circuits that contain the Modeling Application: Python

("Python is a dynamic object-oriented programming language that can be used for many kinds of software development. It offers strong support for integration with other languages and tools, comes with extensive standard libraries, and can be learned in a few days. Many Python programmers report substantial productivity gains and feel the language encourages the development of higher quality, more maintainable code. ...")
1. 3D olfactory bulb: operators (Migliore et al, 2015)
"... Using a 3D model of mitral and granule cell interactions supported by experimental findings, combined with a matrix-based representation of glomerular operations, we identify the mechanisms for forming one or more glomerular units in response to a given odor, how and to what extent the glomerular units interfere or interact with each other during learning, their computational role within the olfactory bulb microcircuit, and how their actions can be formalized into a theoretical framework in which the olfactory bulb can be considered to contain "odor operators" unique to each individual. ..."
2. A contracting model of the basal ganglia (Girard et al. 2008)
Basal ganglia model: selection processes between channels, dynamics controlled by contraction analysis, rate-coding model of neurons based on locally projected dynamical systems (lPDS).
3. A network of AOB mitral cells that produces infra-slow bursting (Zylbertal et al. 2017)
Infra-slow rhythmic neuronal activity with very long (> 10 s) period duration has been described in many brain areas, but little is known about the role of this activity and the mechanisms that produce it. Here we combine experimental and computational methods to show that synchronous infra-slow bursting activity in mitral cells of the mouse accessory olfactory bulb (AOB) emerges from interplay between intracellular dynamics and network connectivity. In this novel mechanism, slow intracellular Na+ dynamics endow AOB mitral cells with a weak tendency to burst, which is further enhanced and stabilized by chemical and electrical synapses between them. Combined with the unique topology of the AOB network, infra-slow bursting enables integration and binding of multiple chemosensory stimuli over prolonged time scales. The example protocol simulates a two-glomeruli network with a single shared cell. Although each glomerulus is stimulated at a different time point, the activity of the entire population becomes synchronous (see paper Fig. 8).
4. A single-cell spiking model for the origin of grid-cell patterns (D'Albis & Kempter 2017)
A single-cell spiking model explaining the formation of grid-cell pattern in a feed-forward network. Patterns emerge via spatially-tuned feedforward inputs, synaptic plasticity, and spike-rate adaptation.
5. A spatial model of the intermediate superior colliculus (Moren et al. 2013)
A spatial model of the intermediate superior colliculus. It reproduces the collicular saccade-generating output profile from NMDA receptor-driven burst neurons, shaped by integrative inhibitory feedback from spreading buildup neuron activity. The model is consistent with the view that collicular activity directly shapes the temporal profile of saccadic eye movements. We use the Adaptive exponential integrate and fire neuron model, augmented with an NMDA-like membrane potential-dependent receptor. In addition, we use a synthetic spike integrator model as a stand-in for a spike-integrator circuit in the reticular formation. NOTE: We use a couple of custom neuron models, so the supplied model file includes an entire version of NEST. I also include a patch that applies to a clean version of the simulator (see the doc/README).
6. Activity patterns in a subthalamopallidal network of the basal ganglia model (Terman et al 2002)
"Based on recent experimental data, we have developed a conductance-based computational network model of the subthalamic nucleus and the external segment of the globus pallidus in the indirect pathway of the basal ganglia. Computer simulations and analysis of this model illuminate the roles of the coupling architecture of the network, and associated synaptic conductances, in modulating the activity patterns displayed by this network. Depending on the relationships of these coupling parameters, the network can support three general classes of sustained firing patterns: clustering, propagating waves, and repetitive spiking that may show little regularity or correlation. ...". Terman's XPP code and a partial implementation by Taylor Malone in NEURON and python are included.
7. An attractor network model of grid cells and theta-nested gamma oscillations (Pastoll et al 2013)
A two population spiking continuous attractor model of grid cells. This model combines the attractor dynamics with theta-nested gamma oscillatory activity. It reproduces the behavioural response of grid cells (grid fields) in medial entorhinal cortex, while at the same time allowing for nested gamma oscillations of post-synaptic currents.
8. Biophysical model for field potentials of networks of I&F neurons (beim Graben & Serafim 2013)
"... Starting from a reduced three-compartment model of a single pyramidal neuron, we derive an observation model for dendritic dipole currents in extracellular space and thereby for the dendritic field potential (DFP) that contributes to the local field potential (LFP) of a neural population. ... Our reduced three-compartment scheme allows to derive networks of leaky integrate-and-fire (LIF) models, which facilitates comparison with existing neural network and observation models. ..."
9. CA1 pyr cell: Inhibitory modulation of spatial selectivity+phase precession (Grienberger et al 2017)
Spatially uniform synaptic inhibition enhances spatial selectivity and temporal coding in CA1 place cells by suppressing broad out-of-field excitation.
10. CA1 pyramidal neuron network model (Ferguson et al 2015)
From the paper: Figure 4 (1000 cell network) is reproduced by running this Brian code. The raster plot and the voltage trace of one of the excitatory cells are produced.
11. CA3 Network Model of Epileptic Activity (Sanjay et al. 2015)
This computational study investigates how a CA3 neuronal network consisting of pyramidal cells, basket cells and OLM interneurons becomes epileptic when dendritic inhibition to pyramidal cells is impaired due to the dysfunction of OLM interneurons. After standardizing the baseline activity (theta-modulated gamma oscillations), systematic changes are made in the connectivities between the neurons, as a result of step-wise impairment of dendritic inhibition.
12. Composite spiking network/neural field model of Parkinsons (Kerr et al 2013)
This code implements a composite model of Parkinson's disease (PD). The composite model consists of a leaky integrate-and-fire spiking neuronal network model being driven by output from a neural field model (instead of the more usual white noise drive). Three different sets of parameters were used for the field model: one with basal ganglia parameters based on data from healthy individuals, one based on data from individuals with PD, and one purely thalamocortical model. The aim of this model is to explore how the different dynamical patterns in each of these field models affect the activity in the network model.
13. Computing with neural synchrony (Brette 2012)
"... In a heterogeneous neural population, it appears that synchrony patterns represent structure or sensory invariants in stimuli, which can then be detected by postsynaptic neurons. The required neural circuitry can spontaneously emerge with spike-timing-dependent plasticity. Using examples in different sensory modalities, I show that this allows simple neural circuits to extract relevant information from realistic sensory stimuli, for example to identify a fluctuating odor in the presence of distractors. ..."
14. Connection-set Algebra (CSA) for the representation of connectivity in NN models (Djurfeldt 2012)
"The connection-set algebra (CSA) is a novel and general formalism for the description of connectivity in neuronal network models, from small-scale to large-scale structure. ... The expressiveness of CSA makes prototyping of network structure easy. A C++ version of the algebra has been implemented and used in a large-scale neuronal network simulation (Djurfeldt et al., IBM J Res Dev 52(1/2):31–42, 2008b) and an implementation in Python has been publicly released."
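The released Python implementation of CSA has its own API; purely as an illustration of the algebra's core idea (composing a full connection set with a mask), and not the actual csa package interface, a hypothetical minimal sketch might look like this:

```python
import random

def cross(pre, post):
    # Full (all-to-all) connection set: Cartesian product of index sets
    return [(i, j) for i in pre for j in post]

def random_mask(p, seed=1):
    # Mask that keeps each candidate connection with probability p
    rng = random.Random(seed)
    return lambda conn: rng.random() < p

def masked(connections, mask):
    # Intersection of a connection set with a mask
    return [c for c in connections if mask(c)]

all_to_all = cross(range(4), range(4))         # 16 candidate connections
sparse = masked(all_to_all, random_mask(0.5))  # roughly half survive
print(len(all_to_all), len(sparse))
```

The point of the algebra is that operators like `cross` and masks compose declaratively, so the same expression can describe connectivity at any scale.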
15. Cortical Basal Ganglia Network Model during Closed-loop DBS (Fleming et al 2020)
We developed a computational model of the cortical basal ganglia network to investigate closed-loop control of deep brain stimulation (DBS) for Parkinson’s disease (PD). The cortical basal ganglia network model incorporates (i) the extracellular DBS electric field, (ii) antidromic and orthodromic activation of STN afferent fibers, (iii) the LFP detected at non-stimulating contacts on the DBS electrode and (iv) temporal variation of network beta-band activity within the thalamo-cortico-basal ganglia loop. The model facilitates investigation of clinically-viable closed-loop DBS control approaches, modulating either DBS amplitude or frequency, using an LFP derived measure of network beta-activity.
16. Cortical feedback alters visual response properties of dLGN relay cells (Martínez-Cañada et al 2018)
Network model that includes biophysically detailed, single-compartment and multicompartment neuron models of relay-cells and interneurons in the dLGN and a population of orientation-selective layer 6 simple cells, consisting of pyramidal cells (PY). We have considered two different arrangements of synaptic feedback from the ON and OFF zones in the visual cortex to the dLGN: phase-reversed (‘push-pull’) and phase-matched (‘push-push’), as well as different spatial extents of the corticothalamic projection pattern. The associated publication is Martínez-Cañada et al. 2018. Installation instructions as well as the latest version can be found in the GitHub repository: https://github.com/CINPLA/biophysical_thalamocortical_system
17. CRH modulates excitatory transmission and network physiology in hippocampus (Gunn et al. 2017)
This model simulates the effects of CRH on sharp waves in a rat CA1/CA3 model. It uses the frequency of the sharp waves as an output of the network.
18. Current Dipole in Laminar Neocortex (Lee et al. 2013)
Laminar neocortical model in NEURON/Python, adapted from Jones et al 2009. https://bitbucket.org/jonescompneurolab/corticaldipole
19. Dentate Gyrus model including Granule cells with dendritic compartments (Chavlis et al 2017)
Here we investigate the role of dentate granule cell dendrites in pattern separation. The model consists of point (integrate-and-fire) neurons; in the principal neurons, the granule cells, we have incorporated varying numbers of dendrites.
20. Diffusive homeostasis in a spiking network model (Sweeney et al. 2015)
In this paper we propose a new mechanism, diffusive homeostasis, in which neural excitability is modulated by nitric oxide, a gas which can flow freely across cell membranes. Our model simulates the activity-dependent synthesis and diffusion of nitric oxide in a recurrent network model of integrate-and-fire neurons. The concentration of nitric oxide is then used as homeostatic readout which modulates the firing threshold of each neuron.
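The homeostatic readout described above can be sketched as a simple control rule: the firing threshold drifts up when the local nitric-oxide concentration exceeds a target level and down when it falls below. This is a minimal illustrative sketch with made-up names and parameter values, not the paper's actual implementation:

```python
def update_threshold(theta, no_readout, no_target, dt=1.0, tau=500.0):
    # Homeostatic rule (illustrative): raise the firing threshold when the
    # nitric-oxide readout exceeds its target, lower it when below.
    return theta + (dt / tau) * (no_readout - no_target)

theta = -50.0          # firing threshold (mV)
for _ in range(1000):  # sustained NO above target slowly raises the threshold
    theta = update_threshold(theta, no_readout=1.2, no_target=1.0)
print(round(theta, 2))
```

Because the readout diffuses across cell membranes, each neuron's `no_readout` would in practice mix contributions from many neighbors, which is what distinguishes diffusive from cell-autonomous homeostasis.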
21. Distal inhibitory control of sensory-evoked excitation (Egger, Schmitt et al. 2015)
Model of a cortical layer (L) 2 pyramidal neuron embedded in an anatomically realistic network of two barrel columns in rat vibrissal cortex. This model is used to investigate the effects of spatially and temporally specific inhibition from L1 inhibitory interneurons on the sensory-evoked subthreshold responses of the L2 pyramidal neuron, and can be used to create simulation results underlying Figures 3D, 4B, 4C and 4E from (Egger, Schmitt et al. 2015).
22. Duration-tuned neurons from the inferior colliculus of vertebrates (Aubie et al. 2012)
These models reproduce the responses of duration-tuned neurons in the auditory midbrain of the big brown bat, the rat, the mouse and the frog (Aubie et al. 2012). They are written in the Python interface to NEURON and a subset of the figures from Aubie et al. (2012) are pre-set in run.py (raw data is generated and a separate graphing program must be used to visualize the results).
23. Effect of polysynaptic facilitation between piriform-hippocampal network stages (Trieu et al 2015)
This is a model of a multistage network, with stages representing regions and synaptic contacts from the olfactory cortex to region CA1 of the hippocampus, implemented in the Brian2 spiking neural network simulator (Trieu et al 2015). It is primarily designed to assess how synaptic facilitation at multiple stages in response to theta firing changes the output of the network. Further developments will be posted at: github.com/cdcox/multistage_network. This model was prepared by Conor D Cox, University of California, Irvine. For questions, please contact Conor at cdcox1@gmail.com
24. Electrostimulation to reduce synaptic scaling driven progression of Alzheimers (Rowan et al. 2014)
"... As cells die and synapses lose their drive, remaining cells suffer an initial decrease in activity. Neuronal homeostatic synaptic scaling then provides a feedback mechanism to restore activity. ... The scaling mechanism increases the firing rates of remaining cells in the network to compensate for decreases in network activity. However, this effect can itself become a pathology, ... Here, we present a mechanistic explanation of how directed brain stimulation might be expected to slow AD progression based on computational simulations in a 470-neuron biomimetic model of a neocortical column. ... "
25. Functional balanced networks with synaptic plasticity (Sadeh et al, 2015)
The model investigates the impact of learning on functional sensory networks. It uses large-scale recurrent networks of excitatory and inhibitory spiking neurons equipped with synaptic plasticity. It explains enhancement of orientation selectivity and emergence of feature-specific connectivity in visual cortex of rodents during development, as reported in experiments.
26. Gamma-beta alternation in the olfactory bulb (David, Fourcaud-Trocmé et al., 2015)
This model, a simplified olfactory bulb network with mitral and granule cells, proposes a framework for two regimes of oscillation in the olfactory bulb: (1) a weak inhibition regime (with no granule spikes), where the network oscillates in the gamma (40-90 Hz) band; and (2) a strong inhibition regime (with granule spikes), where the network oscillates in the beta (15-30 Hz) band. Slow modulations of sensory and centrifugal inputs, phase-shifted by a quarter cycle and possibly combined with short-term depression of the mitral-to-granule AMPA synapse, allow the network to alternate between the two regimes as observed in anesthetized animals.
27. Gap junction plasticity as a mechanism to regulate network-wide oscillations (Pernelle et al 2018)
"Oscillations of neural activity emerge when many neurons repeatedly activate together and are observed in many brain regions, particularly during sleep and attention. Their functional role is still debated, but could be associated with normal cognitive processes such as memory formation or with pathologies such as schizophrenia and autism. Powerful oscillations are also a hallmark of epileptic seizures. Therefore, we wondered what mechanism could regulate oscillations. A type of neuronal coupling, called gap junctions, has been shown to promote synchronization between inhibitory neurons. Computational models show that when gap junctions are strong, neurons synchronize together. Moreover recent investigations show that the gap junction coupling strength is not static but plastic and dependent on the firing properties of the neurons. Thus, we developed a model of gap junction plasticity in a network of inhibitory and excitatory neurons. We show that gap junction plasticity can maintain the right amount of oscillations to prevent pathologies from emerging. Finally, we show that gap junction plasticity serves an additional functional role and allows for efficient and robust information transfer."
28. Hierarchical network model of perceptual decision making (Wimmer et al 2015)
Neuronal variability in sensory cortex predicts perceptual decisions. To investigate the interaction of bottom-up and top-down mechanisms during the decision process, we developed a hierarchical network model. The network consists of two circuits composed of leaky integrate-and-fire neurons: an integration circuit (e.g. LIP, FEF) and a sensory circuit (MT), recurrently coupled via bottom-up feedforward connections and top-down feedback connections. The integration circuit accumulates sensory evidence and produces a binary categorization due to winner-take-all competition between two decision-encoding populations (X.J. Wang, Neuron, 2002). The sensory circuit is a balanced randomly connected EI-network, that contains neural populations selective to opposite directions of motion. We have used this model to simulate a standard two-alternative forced-choice motion discrimination task.
29. Hopfield and Brody model (Hopfield, Brody 2000) (NEURON+python)
Demonstration of Hopfield-Brody synchronization using artificial cells in NEURON+python.
30. Ih tunes oscillations in an In Silico CA3 model (Neymotin et al. 2013)
" ... We investigated oscillatory control using a multiscale computer model of hippocampal CA3, where each cell class (pyramidal, basket, and oriens-lacunosum moleculare cells), contained type-appropriate isoforms of Ih. Our model demonstrated that modulation of pyramidal and basket Ih allows tuning theta and gamma oscillation frequency and amplitude. Pyramidal Ih also controlled cross-frequency coupling (CFC) and allowed shifting gamma generation towards particular phases of the theta cycle, effected via Ih’s ability to set pyramidal excitability. ..."
31. Inhibitory neuron plasticity as a mechanism for ocular dominance plasticity (Bono & Clopath 2019)
"Ocular dominance plasticity is a well-documented phenomenon allowing us to study properties of cortical maturation. Understanding this maturation might be an important step towards unravelling how cortical circuits function. However, it is still not fully understood which mechanisms are responsible for the opening and closing of the critical period for ocular dominance and how changes in cortical responsiveness arise after visual deprivation. In this article, we present a theory of ocular dominance plasticity. Following recent experimental work, we propose a framework where a reduction in inhibition is necessary for ocular dominance plasticity in both juvenile and adult animals. In this framework, two ingredients are crucial to observe ocular dominance shifts: a sufficient level of inhibition as well as excitatory-to-inhibitory synaptic plasticity. In our model, the former is responsible for the opening of the critical period, while the latter limits the plasticity in adult animals. Finally, we also provide a possible explanation for the variability in ocular dominance shifts observed in individual neurons and for the counter-intuitive shifts towards the closed eye."
32. Input strength and time-varying oscillation peak frequency (Cohen MX 2014)
The purpose of this paper is to argue that a single neural functional principle—temporal fluctuations in oscillation peak frequency (“frequency sliding”)—can be used as a common analysis approach to bridge multiple scales within neuroscience. The code provided here recreates the network models used to demonstrate changes in peak oscillation frequency as a function of static and time-varying input strength, and also shows how correlated frequency sliding can be used to identify functional connectivity between two networks.
33. Interplay between somatic and dendritic inhibition promotes place fields (Pedrosa & Clopath 2020)
Hippocampal pyramidal neurons are thought to encode spatial information. A subset of these cells, named place cells, are active only when the animal traverses a specific region within the environment. Although vastly studied experimentally, the development and stabilization of place fields are not fully understood. Here, we propose a mechanistic model of place cell formation in the hippocampal CA1 region. Using our model, we reproduce place field dynamics observed experimentally and provide a mechanistic explanation for the stabilization of place fields. Finally, our model provides specific predictions on protocols to shift place field location.
34. Ketamine disrupts theta modulation of gamma in a computer model of hippocampus (Neymotin et al 2011)
"Abnormalities in oscillations have been suggested to play a role in schizophrenia. We studied theta-modulated gamma oscillations in a computer model of hippocampal CA3 in vivo with and without simulated application of ketamine, an NMDA receptor antagonist and psychotomimetic. Networks of 1200 multi-compartment neurons (pyramidal, basket and oriens-lacunosum moleculare, OLM, cells) generated theta and gamma oscillations from intrinsic network dynamics: basket cells primarily generated gamma and amplified theta, while OLM cells strongly contributed to theta. ..."
35. Large-scale neural model of visual short-term memory (Ulloa, Horwitz 2016; Horwitz, et al. 2005,...)
Large-scale neural model of visual short term memory embedded into a 998-node connectome. The model simulates electrical activity across neuronal populations of a number of brain regions and converts that activity into fMRI and MEG time-series. The model uses a neural simulator developed at the Brain Imaging and Modeling Section of the National Institutes of Health.
36. Mesoscopic dynamics from AdEx recurrent networks (Zerlaut et al JCNS 2018)
We present a mean-field model of networks of Adaptive Exponential (AdEx) integrate-and-fire neurons, with conductance-based synaptic interactions. We study a network of regular-spiking (RS) excitatory neurons and fast-spiking (FS) inhibitory neurons. We use a Master Equation formalism, together with a semi-analytic approach to the transfer function of AdEx neurons to describe the average dynamics of the coupled populations. We compare the predictions of this mean-field model to simulated networks of RS-FS cells, first at the level of the spontaneous activity of the network, which is well predicted by the analytical description. Second, we investigate the response of the network to time-varying external input, and show that the mean-field model predicts the response time course of the population. Finally, to model VSDi signals, we consider a one-dimensional ring model made of interconnected RS-FS mean-field units.
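The single-cell dynamics underlying the mean-field description can be probed numerically. Below is a minimal Euler integration of one AdEx neuron; the parameter values are standard textbook examples in the Brette-Gerstner style, not the values fitted in this paper:

```python
import math

def simulate_adex(I=0.8e-9, T=0.5, dt=1e-4):
    # AdEx neuron (SI units): C dV/dt = -gL(V-EL) + gL*DT*exp((V-VT)/DT) - w + I
    #                         tau_w dw/dt = a(V-EL) - w
    C, gL, EL = 281e-12, 30e-9, -70.6e-3
    VT, DT, Vcut, Vr = -50.4e-3, 2e-3, -40e-3, -70.6e-3
    a, b, tau_w = 4e-9, 80.5e-12, 144e-3
    V, w, spikes = EL, 0.0, 0
    for _ in range(int(T / dt)):
        dV = (-gL * (V - EL) + gL * DT * math.exp((V - VT) / DT) - w + I) / C
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V > Vcut:       # spike: reset voltage, increment adaptation
            V = Vr
            w += b
            spikes += 1
    return spikes
```

Sweeping `I` and recording the output rate yields the stationary transfer function that the semi-analytic mean-field approach approximates.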
37. Mesoscopic dynamics from AdEx recurrent networks (Zerlaut et al JCNS 2018) (PyNN)
PyNN simulations for Zerlaut et al. 2018.
38. Minimal model of interictal and ictal discharges “Epileptor-2” (Chizhov et al 2018)
"Seizures occur in a recurrent manner with intermittent states of interictal and ictal discharges (IIDs and IDs). The transitions to and from IDs are determined by a set of processes, including synaptic interaction and ionic dynamics. Although mathematical models of separate types of epileptic discharges have been developed, modeling the transitions between states remains a challenge. A simple generic mathematical model of seizure dynamics (Epileptor) has recently been proposed by Jirsa et al. (2014); however, it is formulated in terms of abstract variables. In this paper, a minimal population-type model of IIDs and IDs is proposed that is as simple to use as the Epileptor, but the suggested model attributes physical meaning to the variables. The model is expressed in ordinary differential equations for extracellular potassium and intracellular sodium concentrations, membrane potential, and short-term synaptic depression variables. A quadratic integrate-and-fire model driven by the population input current is used to reproduce spike trains in a representative neuron. ..."
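The representative-neuron component mentioned above is a quadratic integrate-and-fire (QIF) model. As a minimal illustration in dimensionless form (not the paper's fitted parameters or its population drive), a QIF neuron can be integrated as follows:

```python
def qif_spikes(I, T=200.0, dt=0.01, v_reset=-2.0, v_peak=10.0):
    # Quadratic integrate-and-fire (dimensionless): dv/dt = v**2 + I.
    # For I > 0 the neuron fires repetitively; for I < 0 the trajectory
    # settles at the stable fixed point v = -sqrt(-I) and never spikes.
    v, spikes = v_reset, []
    for step in range(int(T / dt)):
        v += dt * (v * v + I)
        if v >= v_peak:          # spike: record time and reset
            spikes.append(step * dt)
            v = v_reset
    return spikes
```

In Epileptor-2 the input current `I` would itself be a dynamical variable driven by the slow potassium, sodium, and synaptic-depression equations, which is what moves the neuron between silent, interictal, and ictal firing.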
39. Model of eupnea and sigh generation in respiratory network (Toporikova et al 2015)
Based on recent in vitro data obtained in the mouse embryo, we have built a computational model consisting of two compartments, interconnected through appropriate synapses. One compartment generates sighs and the other produces eupneic bursts. The model reproduces basic features of simultaneous sigh and eupnea generation (two types of bursts differing in terms of shape, amplitude, and frequency of occurrence) and mimics the effect of blocking glycinergic synapses.
40. Modeling and MEG evidence of early consonance processing in auditory cortex (Tabas et al 2019)
Pitch is a fundamental attribute of auditory perception. The interaction of concurrent pitches gives rise to a sensation that can be characterized by its degree of consonance or dissonance. In this work, we propose that human auditory cortex (AC) processes pitch and consonance through a common neural network mechanism operating at early cortical levels. First, we developed a new model of neural ensembles incorporating realistic neuronal and synaptic parameters to assess pitch processing mechanisms at early stages of AC. Next, we designed a magnetoencephalography (MEG) experiment to measure the neuromagnetic activity evoked by dyads with varying degrees of consonance or dissonance. MEG results show that dissonant dyads evoke a pitch onset response (POR) with a latency up to 36 ms longer than consonant dyads. Additionally, we used the model to predict the processing time of concurrent pitches; here, consonant pitch combinations were decoded faster than dissonant combinations, in line with the experimental observations. Specifically, we found a striking match between the predicted and the observed latency of the POR as elicited by the dyads. These novel results suggest that consonance processing starts early in human auditory cortex and may share the network mechanisms that are responsible for (single) pitch processing.
41. Modeling dendritic spikes and plasticity (Bono and Clopath 2017)
Biophysical model and reduced neuron model with voltage-dependent plasticity.
42. Modelling platform of the cochlear nucleus and other auditory circuits (Manis & Compagnola 2018)
"Models of the auditory brainstem have been an invaluable tool for testing hypotheses about auditory information processing and for highlighting the most important gaps in the experimental literature. Due to the complexity of the auditory brainstem, and indeed most brain circuits, the dynamic behavior of the system may be difficult to predict without a detailed, biologically realistic computational model. Despite the sensitivity of models to their exact construction and parameters, most prior models of the cochlear nucleus have incorporated only a small subset of the known biological properties. This confounds the interpretation of modelling results and also limits the potential future uses of these models, which require a large effort to develop. To address these issues, we have developed a general purpose, bio-physically detailed model of the cochlear nucleus for use both in testing hypotheses about cochlear nucleus function and also as an input to models of downstream auditory nuclei. The model implements conductance-based Hodgkin-Huxley representations of cells using a Python-based interface to the NEURON simulator. ..."
43. Modular grid cell responses as a basis for hippocampal remapping (Monaco and Abbott 2011)
"Hippocampal place fields, the local regions of activity recorded from place cells in exploring rodents, can undergo large changes in relative location during remapping. This process would appear to require some form of modulated global input. Grid-cell responses recorded from layer II of medial entorhinal cortex in rats have been observed to realign concurrently with hippocampal remapping, making them a candidate input source. However, this realignment occurs coherently across colocalized ensembles of grid cells (Fyhn et al., 2007). The hypothesized entorhinal contribution to remapping depends on whether this coherence extends to all grid cells, which is currently unknown. We study whether dividing grid cells into small numbers of independently realigning modules can both account for this localized coherence and allow for hippocampal remapping. ..."
44. Motion Clouds: Synthesis of random textures for motion perception (Leon et al. 2012)
We describe a framework to generate random texture movies with controlled information content. In particular, these stimuli can be made closer to naturalistic textures than usual stimuli such as gratings and random-dot kinetograms. We parametrically define these "Motion Clouds" around the most prevalent feature axes (mean and bandwidth): direction, spatial frequency, and orientation.
45. Motor system model with reinforcement learning drives virtual arm (Dura-Bernal et al 2017)
"We implemented a model of the motor system with the following components: dorsal premotor cortex (PMd), primary motor cortex (M1), spinal cord and musculoskeletal arm (Figure 1). PMd modulated M1 to select the target to reach, M1 excited the descending spinal cord neurons that drove the arm muscles, and received arm proprioceptive feedback (information about the arm position) via the ascending spinal cord neurons. The large-scale model of M1 consisted of 6,208 spiking Izhikevich model neurons [37] of four types: regular-firing and bursting pyramidal neurons, and fast-spiking and low-threshold-spiking interneurons. These were distributed across cortical layers 2/3, 5A, 5B and 6, with cell properties, proportions, locations, connectivity, weights and delays drawn primarily from mammalian experimental data [38], [39], and described in detail in previous work [29]. The network included 486,491 connections, with synapses modeling properties of four different receptors ..."
46. Multitarget pharmacology for Dystonia in M1 (Neymotin et al 2016)
" ... We developed a multiscale model of primary motor cortex, ranging from molecular, up to cellular, and network levels, containing 1715 compartmental model neurons with multiple ion channels and intracellular molecular dynamics. We wired the model based on electrophysiological data obtained from mouse motor cortex circuit mapping experiments. We used the model to reproduce patterns of heightened activity seen in dystonia by applying independent random variations in parameters to identify pathological parameter sets. ..."
47. Muscle spindle feedback circuit (Moraud et al, 2016)
Here, we developed a computational model of the muscle spindle feedback circuits of the rat ankle that predicts the interactions between Epidural Stimulation and spinal circuit dynamics during gait.
48. Network bursts in cultured NN result from different adaptive mechanisms (Masquelier & Deco 2013)
It is now well established that cultured neuron networks are spontaneously active, and tend to synchronize. Synchronous events typically involve the whole network, and have thus been termed “network spikes” (NS). Using experimental recordings and numerical simulations, we show here that the inter-NS interval statistics are complex, and allow inferring the neural mechanisms at work, in particular the adaptive ones, and estimating a number of parameters to which we cannot access experimentally.
49. Networks of spiking neurons: a review of tools and strategies (Brette et al. 2007)
This package provides a series of codes that simulate networks of spiking neurons (excitatory and inhibitory, integrate-and-fire or Hodgkin-Huxley type, current-based or conductance-based synapses; some of them are event-based). The same networks are implemented in different simulators (NEURON, GENESIS, NEST, NCS, CSIM, XPP, SPLIT, MVAspike; there is also a couple of implementations in SciLab and C++). The codes included in this package are benchmark simulations; see the associated review paper (Brette et al. 2007). The main goal is to provide a series of benchmark simulations of networks of spiking neurons, and demonstrate how these are implemented in the different simulators overviewed in the paper. See also details in the enclosed file Appendix2.pdf, which describes these different benchmarks. Some of these benchmarks were based on the Vogels-Abbott model (Vogels TP and Abbott LF 2005).
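To give a flavor of what these benchmarks compute, here is a toy current-based LIF network in plain Python, far smaller and simpler than the Vogels-Abbott benchmark itself; all parameter values are illustrative, not the published benchmark values:

```python
import random

def simulate_lif_network(n=100, p=0.1, T=200.0, dt=0.1, seed=42):
    # Toy current-based LIF network: 80% excitatory / 20% inhibitory,
    # random connectivity with probability p, constant external drive.
    rng = random.Random(seed)
    tau_m, v_rest, v_th, v_reset = 20.0, -60.0, -50.0, -60.0  # ms, mV
    i_ext = 12.0                      # external drive (mV), suprathreshold
    n_exc = int(0.8 * n)
    w = {}                            # (pre, post) -> synaptic kick (mV)
    for i in range(n):
        for j in range(n):
            if i != j and rng.random() < p:
                w[(i, j)] = 0.5 if i < n_exc else -2.0
    v = [v_rest + rng.uniform(0.0, 10.0) for _ in range(n)]
    spike_count = 0
    for _ in range(int(T / dt)):
        fired = []
        for i in range(n):
            v[i] += dt / tau_m * (v_rest - v[i] + i_ext)
            if v[i] >= v_th:
                fired.append(i)
                v[i] = v_reset
        spike_count += len(fired)
        for i in fired:               # deliver instantaneous synaptic kicks
            for j in range(n):
                if (i, j) in w:
                    v[j] += w[(i, j)]
    return spike_count
```

The benchmark codes in the package implement the same kind of dynamics, but with conductance-based synapses, delays, and event-driven variants, in each of the simulators listed above.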
50. Neural mass model based on single cell dynamics to model pathophysiology (Zandt et al 2014)
Model code for "A neural mass model based on single cell dynamics to model pathophysiology" (Zandt et al. 2014, Journal of Computational Neuroscience). A neural mass model (NMM) is derived from single-cell dynamics in a bottom-up approach. Mean and standard deviation of the firing rates in the populations are calculated. The sigmoid is derived from the single-cell FI curve, allowing easy implementation of pathological conditions. The NMM is compared with a detailed spiking network model consisting of HH neurons. NMM code is in Matlab; the network model is simulated using Norns (ModelDB #154739).
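The core idea — that the population sigmoid emerges from the single-cell FI curve plus parameter spread across the population — can be sketched as follows (an illustrative simplification, not the published Matlab code; the threshold-linear FI curve and Gaussian threshold spread are assumptions for the demo):

```python
import numpy as np

def fi_curve(I, I_thresh=5.0, gain=10.0):
    """Single-cell FI curve: firing rate (Hz) vs input current (a.u.)."""
    return np.maximum(0.0, gain * (I - I_thresh))

def population_rate(I, thresh_mean=5.0, thresh_sd=1.0, n=10000, seed=0):
    """Mean population rate: the FI curve averaged over a Gaussian
    spread of thresholds. Averaging smooths the hard threshold into
    a sigmoid-like population transfer function."""
    rng = np.random.default_rng(seed)
    thresholds = rng.normal(thresh_mean, thresh_sd, n)
    return fi_curve(I, I_thresh=thresholds).mean()

rates = [population_rate(I) for I in (3.0, 5.0, 7.0)]
print(rates)
```

Pathological conditions can then be modeled by deforming the single-cell FI curve, which propagates directly into the population transfer function.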
51. Olfactory bulb microcircuits model with dual-layer inhibition (Gilra & Bhalla 2015)
A detailed network model, developed by Aditya Gilra and Upinder S. Bhalla (2015), of the dual-layer dendro-dendritic inhibitory microcircuits in the rat olfactory bulb, comprising compartmental mitral, granule and PG cells. All cell morphologies and network connections are in NeuroML v1.8.0, as are the PG and granule cell channels and synapses. Mitral cell channels and synapses are in native Python.
52. Orientation selectivity in inhibition-dominated recurrent networks (Sadeh and Rotter, 2015)
Emergence of contrast-invariant orientation selectivity in large-scale networks of excitatory and inhibitory neurons using integrate-and-fire neuron models.
53. Oscillations, phase-of-firing coding and STDP: an efficient learning scheme (Masquelier et al. 2009)
The model demonstrates how a common oscillatory drive for a group of neurons formats and reliabilizes their spike times - through an activation-to-phase conversion - so that repeating activation patterns can be easily detected and learned by a downstream neuron equipped with STDP, and then recognized in just one oscillation cycle.
54. Phase response theory in sparsely + strongly connected inhibitory NNs (Tikidji-Hamburyan et al 2019)
55. PIR gamma oscillations in network of resonators (Tikidji-Hamburyan et al. 2015)
" ... The coupled oscillator model implemented with Wang–Buzsaki model neurons is not sufficiently robust to heterogeneity in excitatory drive, and therefore intrinsic frequency, to account for in vitro models of ING. Similarly, in a tightly synchronized regime, the stochastic population oscillator model is often characterized by sparse firing, whereas interneurons both in vivo and in vitro do not fire sparsely during gamma, but rather on average every other cycle. We substituted so-called resonator neural models, which exhibit class 2 excitability and postinhibitory rebound (PIR), for the integrators that are typically used. This results in much greater robustness to heterogeneity that actually increases as the average participation in spikes per cycle approximates physiological levels. Moreover, dynamic clamp experiments that show autapse-induced firing in entorhinal cortical interneurons support the idea that PIR can serve as a network gamma mechanism. ..."
56. Place and grid cells in a loop (Rennó-Costa & Tort 2017)
This model implements a loop circuit between place and grid cells. The model was used to explain place cell remapping and grid cell realignment. Grid cells are modeled as a continuous attractor network; place cells form a recurrent attractor network. Rate models are implemented with E%-MAX winner-take-all network dynamics, with a gamma-cycle time step.
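The E%-MAX winner-take-all step mentioned here can be sketched in a few lines (an illustrative reading of the rule, with the fraction E as an assumed free parameter): on each gamma-cycle time step, only units whose input is within E of the maximal input stay active.

```python
import numpy as np

def e_max_wta(inputs, E=0.1):
    """One E%-MAX winner-take-all step: units whose input exceeds
    (1 - E) times the maximum input remain active (rate proportional
    to the excess over the cutoff); all others are silenced."""
    inputs = np.asarray(inputs, dtype=float)
    cutoff = (1.0 - E) * inputs.max()
    return np.where(inputs >= cutoff, inputs - cutoff, 0.0)

r = e_max_wta([0.2, 0.95, 1.0, 0.5], E=0.1)
print(r)
```

Because the cutoff tracks the instantaneous maximum, the competition is self-normalizing: only the cells closest to the peak of the attractor bump fire in a given gamma cycle.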
57. Potjans-Diesmann cortical microcircuit model in NetPyNE (Romaro et al 2021)
The Potjans-Diesmann cortical microcircuit model is a widely used model originally implemented in NEST. Here, we re-implemented the model using NetPyNE, a high-level Python interface to the NEURON simulator, and reproduced the findings of the original publication. We also implemented a method for rescaling the network size which preserves first and second order statistics, building on existing work on network theory. The new implementation enables using more detailed neuron models with multicompartment morphologies and multiple biophysically realistic channels. This opens the model to new research, including the study of dendritic processing, the influence of individual channel parameters, and generally multiscale interactions in the network. The rescaling method provides flexibility to increase or decrease the network size if required when running these more realistic simulations. Finally, NetPyNE facilitates modifying or extending the model using its declarative language; optimizing model parameters; running efficient large-scale parallelized simulations; and analyzing the model through built-in methods, including local field potential calculation and information flow measures.
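The flavor of a statistics-preserving rescaling can be sketched as follows. This is an assumed simplification of the method (which builds on existing network-theory work such as van Albada et al.), not the NetPyNE implementation: when a population is scaled by a factor k < 1, in-degrees shrink by k, weights are scaled by 1/sqrt(k) to preserve input variance, and a DC current compensates the change in input mean.

```python
import numpy as np

def rescale(n_neurons, in_degree, weight, rate, scale):
    """Return (n, K, w, dc) for a population scaled by `scale` (< 1).
    Variance of synaptic input ~ K * w^2 is preserved by w -> w/sqrt(k);
    the shift in mean input ~ K * w * rate is offset by a DC drive."""
    n = int(round(n_neurons * scale))
    K = int(round(in_degree * scale))
    w = weight / np.sqrt(scale)
    dc = in_degree * weight * rate - K * w * rate   # mean-input deficit
    return n, K, w, dc

n, K, w, dc = rescale(n_neurons=10000, in_degree=1000, weight=0.1,
                      rate=5.0, scale=0.25)
print(n, K, w, dc)
```

Here K * w**2 equals the original 1000 * 0.1**2, so the input variance is unchanged, while `dc` makes up the reduced mean drive.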
58. Recurrent amplification of grid-cell activity (D'Albis and Kempter 2020)
59. Reward modulated STDP (Legenstein et al. 2008)
"... This article provides tools for an analytic treatment of reward-modulated STDP, which allows us to predict under which conditions reward-modulated STDP will achieve a desired learning effect. These analytical results imply that neurons can learn through reward-modulated STDP to classify not only spatial but also temporal firing patterns of presynaptic neurons. They also can learn to respond to specific presynaptic firing patterns with particular spike patterns. Finally, the resulting learning theory predicts that even difficult credit-assignment problems, where it is very hard to tell which synaptic weights should be modified in order to increase the global reward for the system, can be solved in a self-organizing manner through reward-modulated STDP. This yields an explanation for a fundamental experimental result on biofeedback in monkeys by Fetz and Baker. In this experiment monkeys were rewarded for increasing the firing rate of a particular neuron in the cortex and were able to solve this extremely difficult credit assignment problem. ... In addition our model demonstrates that reward-modulated STDP can be applied to all synapses in a large recurrent neural network without endangering the stability of the network dynamics."
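A minimal sketch of the reward-modulated STDP mechanism described above (illustrative constants, not the paper's analytic model): pre/post spike pairings update a slowly decaying eligibility trace via the usual STDP window, and the weight only changes when a reward signal arrives, gated by that trace.

```python
import numpy as np

tau_e = 500.0                 # eligibility trace time constant (ms)
A_plus, A_minus = 0.01, 0.012 # STDP amplitudes (assumed)
tau_stdp = 20.0               # STDP window time constant (ms)

def stdp_window(dt_spike):
    """Classic exponential STDP window; dt_spike = t_post - t_pre (ms)."""
    if dt_spike >= 0:
        return A_plus * np.exp(-dt_spike / tau_stdp)
    return -A_minus * np.exp(dt_spike / tau_stdp)

# one synapse: three pre/post pairings, then a reward at t = 300 ms
pairings = [(50.0, +5.0), (120.0, +3.0), (200.0, -8.0)]  # (time, t_post - t_pre)
reward_time, reward = 300.0, 1.0

elig, t_last, w = 0.0, 0.0, 0.5
for t, dts in pairings:
    elig *= np.exp(-(t - t_last) / tau_e)   # decay trace since last event
    elig += stdp_window(dts)                # accumulate STDP contribution
    t_last = t
elig *= np.exp(-(reward_time - t_last) / tau_e)
w += reward * elig                          # reward gates the weight change
print(w)
```

Without the reward factor the weight never moves, which is what lets the same rule be applied to every synapse in a large recurrent network without destabilizing it.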
60. Scaffold model of mouse CA1 hippocampus (Gandolfi et al 2022)
The model allows connecting point neurons based on probability clouds generated from morpho-anatomical landmarks.
61. SCN1A gain-of-function in early infantile encephalopathy (Berecki et al 2019)
"OBJECTIVE: To elucidate the biophysical basis underlying the distinct and severe clinical presentation in patients with the recurrent missense SCN1A variant, p.Thr226Met. Patients with this variant show a well-defined genotype-phenotype correlation and present with developmental and early infantile epileptic encephalopathy that is far more severe than typical SCN1A Dravet syndrome. METHODS: Whole cell patch clamp and dynamic action potential clamp were used to study T226M Nav 1.1 channels expressed in mammalian cells. Computational modeling was used to explore the neuronal scale mechanisms that account for altered action potential firing. RESULTS: T226M channels exhibited hyperpolarizing shifts of the activation and inactivation curves and enhanced fast inactivation. Dynamic action potential clamp hybrid simulation showed that model neurons containing T226M conductance displayed a left shift in rheobase relative to control. At current stimulation levels that produced repetitive action potential firing in control model neurons, depolarization block and cessation of action potential firing occurred in T226M model neurons. Fully computationally simulated neuron models recapitulated the findings from dynamic action potential clamp and showed that heterozygous T226M models were also more susceptible to depolarization block. ..."
62. Sensory feedback in an oscillatory interference model of place cell activity (Monaco et al. 2011)
Many animals use a form of dead reckoning known as 'path integration' to maintain a sense of their location as they explore the world. However, internal motion signals and the neural activity that integrates them can be noisy, leading inevitably to inaccurate position estimates. The rat hippocampus and entorhinal cortex support a flexible system of spatial representation that is critical to spatial learning and memory. The position signal encoded by this system is thought to rely on path integration, but it must be recalibrated by familiar landmarks to maintain accuracy. To explore the interaction between path integration and external landmarks, we present a model of hippocampal activity based on the interference of theta-frequency oscillations that are modulated by realistic animal movements around a track. We show that spatial activity degrades with noise, but introducing external cues based on direct sensory feedback can prevent this degradation. When these cues are put into conflict with each other, their interaction produces a diverse array of response changes that resembles experimental observations. Feedback driven by attending to landmarks may be critical to navigation and spatial memory in mammals.
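The oscillatory interference principle underlying this model can be sketched as follows (assumed parameters, constant running speed, one velocity-controlled oscillator): the oscillator's frequency deviates from baseline theta in proportion to speed, so the phase difference between the two integrates position, and the interference envelope peaks at regular spatial intervals.

```python
import numpy as np

f_theta = 8.0        # baseline theta frequency (Hz)
beta = 0.05          # spatial scaling (Hz per cm/s), assumed
dt = 0.001           # time step (s)
speed = 20.0         # cm/s, constant run along a linear track
T = 5.0              # s of simulated running (100 cm)

t = np.arange(0, T, dt)
phase_base = 2 * np.pi * f_theta * t
# integrate the velocity-modulated frequency (cumsum generalizes to
# time-varying speed; here speed is constant for simplicity)
phase_vco = 2 * np.pi * np.cumsum((f_theta + beta * speed) * dt * np.ones_like(t))
envelope = np.cos(phase_vco - phase_base)   # interference pattern

# the envelope repeats every 1/(beta*speed) seconds, i.e. every
# speed/(beta*speed) = 1/beta cm of travel
firing_field_spacing = 1.0 / beta           # cm
print(firing_field_spacing)
```

Noise in the velocity signal makes `phase_vco` drift, degrading the spatial pattern; the paper's sensory-feedback cues act by resetting that accumulated phase error at familiar landmarks.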
63. Sensory-evoked responses of L5 pyramidal tract neurons (Egger et al 2020)
This is the L5 pyramidal tract neuron (L5PT) model from Egger, Narayanan et al., Neuron 2020. It allows investigating how synaptic inputs evoked by different sensory stimuli are integrated by the complex intrinsic properties of L5PTs. The model is constrained by anatomical measurements of the subcellular synaptic input patterns to L5PT neurons, in vivo measurements of sensory-evoked responses of different populations of neurons providing these synaptic inputs, and in vitro measurements constraining the biophysical properties of the soma, dendrites and axon (note: the biophysical model is based on the work by Hay et al., Plos Comp Biol 2011). The model files provided here allow performing simulations and analyses presented in Figures 3, 4 and 5.
64. Simulated cortical color opponent receptive fields self-organize via STDP (Eguchi et al., 2014)
"... In this work, we address the problem of understanding the cortical processing of color information with a possible mechanism of the development of the patchy distribution of color selectivity via computational modeling. ... Our model of the early visual system consists of multiple topographically-arranged layers of excitatory and inhibitory neurons, with sparse intra-layer connectivity and feed-forward connectivity between layers. Layers are arranged based on anatomy of early visual pathways, and include a retina, lateral geniculate nucleus, and layered neocortex. ... After training with natural images, the neurons display heightened sensitivity to specific colors. ..."
65. Single Trial Sequence learning: a spiking neurons model based on hippocampus (Coppolino et al 2021)
In contrast with our everyday experience using brain circuits, it can take a prohibitively long time to train a computational system to produce the correct sequence of outputs in the presence of a series of inputs. This suggests that something important is missing in the way in which models are trying to reproduce basic cognitive functions. In this work, we introduce a new neuronal network architecture that is able to learn, in a single trial, an arbitrarily long sequence of any known objects. The key point of the model is the explicit use of mechanisms and circuitry observed in the hippocampus. By directly following the natural system's layout and circuitry, this type of implementation has the additional advantage that the results can be more easily compared to experimental data, allowing a deeper and more direct understanding of the mechanisms underlying cognitive functions and dysfunctions.
66. Sparse connectivity is required for decorrelation, pattern separation (Cayco-Gajic et al 2017)
" ... To investigate the structural and functional determinants of pattern separation we built models of the cerebellar input layer with spatially correlated input patterns, and systematically varied their synaptic connectivity. ..."
67. Spike-Timing-Based Computation in Sound Localization (Goodman and Brette 2010)
" ... In neuron models consisting of spectro-temporal filtering and spiking nonlinearity, we found that the binaural structure induced by spatialized sounds is mapped to synchrony patterns that depend on source location rather than on source signal. Location-specific synchrony patterns would then result in the activation of location-specific assemblies of postsynaptic neurons. We designed a spiking neuron model which exploited this principle to locate a variety of sound sources in a virtual acoustic environment using measured human head-related transfer functions. ..."
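As a non-spiking point of comparison for the synchrony-based readout described above, the interaural time difference (ITD) cue can be recovered by finding the delay that maximizes the cross-correlation of the two ear signals (an illustrative delay-line analogue with assumed parameters, not the paper's spiking model):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 44100                               # sample rate (Hz)
signal = rng.normal(size=fs // 10)       # 100 ms of broadband noise

itd_samples = 20                         # true delay: left leads by 20 samples
left = signal
right = np.concatenate([np.zeros(itd_samples), signal[:-itd_samples]])

# scan candidate lags and pick the one with maximal correlation
max_lag = 50
lags = np.arange(-max_lag, max_lag + 1)
corr = [np.dot(left[max_lag:-max_lag],
               right[max_lag + l:len(right) - max_lag + l]) for l in lags]
best_lag = lags[int(np.argmax(corr))]
print(best_lag)
```

The spiking model in the entry achieves the same location estimate through synchrony patterns across delay-tuned neurons rather than an explicit correlation scan.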
68. Spontaneous weakly correlated excitation and inhibition (Tan et al. 2013)
Brian code for Tan et al. 2013.
69. STDP allows fast rate-modulated coding with Poisson-like spike trains (Gilson et al. 2011)
The model demonstrates that a neuron equipped with STDP robustly detects repeating rate patterns among its afferents, from which the spikes are generated on the fly using inhomogeneous Poisson sampling, provided those rates have narrow temporal peaks (10-20 ms) - a condition met by many experimental Post-Stimulus Time Histograms (PSTHs).
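One standard way to implement the on-the-fly inhomogeneous Poisson sampling mentioned here is thinning (Lewis-Shedler): draw candidate spikes from a homogeneous process at the maximal rate, then accept each with probability rate(t)/rate_max. The narrow Gaussian rate peak below is an assumed stand-in for a sharp PSTH feature, not the paper's stimulus.

```python
import numpy as np

def inhomogeneous_poisson(rate_fn, rate_max, t_stop, rng):
    """Spike times on [0, t_stop) for instantaneous rate rate_fn(t) (Hz),
    generated by thinning a homogeneous Poisson process at rate_max."""
    spikes, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate_max)       # next candidate spike
        if t >= t_stop:
            break
        if rng.random() < rate_fn(t) / rate_max:   # accept with prob rate/max
            spikes.append(t)
    return np.array(spikes)

peak_t, width, peak_rate = 0.5, 0.01, 1000.0       # narrow ~10 ms peak
rate = lambda t: peak_rate * np.exp(-0.5 * ((t - peak_t) / width) ** 2)

rng = np.random.default_rng(1)
spikes = inhomogeneous_poisson(rate, peak_rate, 1.0, rng)
print(len(spikes))
```

Each call yields a fresh spike train with the same underlying rate profile, which is exactly the regime in which the STDP neuron in the model learns to detect the repeating pattern.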
70. Structure-dynamics relationships in bursting neuronal networks revealed (Mäki-Marttunen et al. 2013)
This entry includes tools for generating and analyzing network structure, and for running the neuronal network simulations on them.
71. Synaptic scaling balances learning in a spiking model of neocortex (Rowan & Neymotin 2013)
Learning in the brain requires complementary mechanisms: potentiation and activity-dependent homeostatic scaling. We introduce synaptic scaling to a biologically-realistic spiking model of neocortex which can learn changes in oscillatory rhythms using STDP, and show that scaling is necessary to balance both positive and negative changes in input from potentiation and atrophy. We discuss some of the issues that arise when considering synaptic scaling in such a model, and show that scaling regulates activity whilst allowing learning to remain unaltered.
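The scaling mechanism can be sketched as a multiplicative homeostatic rule (illustrative constants, not the model's actual implementation): each neuron compares its firing rate to a target and scales all incoming weights by a common factor, regulating activity without disturbing the relative weight pattern that STDP produced.

```python
import numpy as np

def scaling_step(weights, rate, target_rate, eta=0.01):
    """One homeostatic update: multiplicatively scale incoming weights
    toward the target rate. A rate above target shrinks all weights;
    a rate below target grows them, preserving their ratios."""
    factor = 1.0 + eta * (target_rate - rate) / target_rate
    return weights * factor

w = np.array([0.2, 0.5, 0.1])
ratios_before = w / w.sum()

rate = 12.0                       # Hz, above the 5 Hz target
for _ in range(100):
    w = scaling_step(w, rate, target_rate=5.0)

ratios_after = w / w.sum()
print(w, ratios_after)
```

Because the update is multiplicative, `ratios_after` matches `ratios_before` exactly: total drive is reduced while the learned weight structure is untouched, which is the balance the entry describes.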
72. Unsupervised learning of an efficient short-term memory network (Vertechi, Brendel & Machens 2014)
Learning in recurrent neural networks has been a topic fraught with difficulties. We here report substantial progress in the unsupervised learning of recurrent networks that can keep track of an input signal. Specifically, we show how these networks can learn to efficiently represent their present and past inputs, based on local learning rules only.
73. Vertical System (VS) tangential cells network model (Trousdale et al. 2014)
Network model of the VS tangential cell system, with 10 cells per hemisphere. Each cell is a two-compartment model with one compartment for dendrites and one for the axon. The cells are coupled through axonal gap junctions. The code allows simulating responses of the VS network to a variety of visual stimuli to investigate coding as a function of gap junction strength.
