Circuits that contain the Model Concept: Methods

(Research on numerical, mathematical, or computational neuroscience algorithms.)
1. A detailed data-driven network model of prefrontal cortex (Hass et al 2016)
Data-based PFC-like circuit with layers 2/3 and 5, synaptic clustering, four types of interneurons, and cell-type-specific short-term synaptic plasticity; neuron parameters were fitted to in vitro data, and all other parameters were constrained by the experimental literature. Reproduces key features of in vivo resting-state activity without specific tuning.
2. Activity constraints on stable neuronal or network parameters (Olypher and Calabrese 2007)
"In this study, we developed a general description of parameter combinations for which specified characteristics of neuronal or network activity are constant. Our approach is based on the implicit function theorem and is applicable to activity characteristics that smoothly depend on parameters. Such smoothness is often intrinsic to neuronal systems when they are in stable functional states. The conclusions about how parameters compensate each other, developed in this study, can thus be used even without regard to the specific mathematical model describing a particular neuron or neuronal network. ..."
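The compensation idea above can be sketched numerically: for a smooth activity characteristic F(p1, p2) held constant, the implicit function theorem gives the local trade-off dp2/dp1 = -(dF/dp1)/(dF/dp2). A minimal sketch, assuming only a toy characteristic (not the paper's neuron model):

```python
# Numerical sketch of the implicit-function-theorem argument: if an
# activity characteristic F(p1, p2) is held constant, parameters trade off
# along the level set dF = 0, with slope dp2/dp1 = -(dF/dp1)/(dF/dp2).
# The characteristic below is a toy function, not the paper's neuron model.

def partials(F, p1, p2, h=1e-6):
    """Central finite-difference estimates of dF/dp1 and dF/dp2."""
    dF_dp1 = (F(p1 + h, p2) - F(p1 - h, p2)) / (2 * h)
    dF_dp2 = (F(p1, p2 + h) - F(p1, p2 - h)) / (2 * h)
    return dF_dp1, dF_dp2

def compensation_slope(F, p1, p2):
    """Slope dp2/dp1 along the level set F(p1, p2) = const."""
    dF_dp1, dF_dp2 = partials(F, p1, p2)
    return -dF_dp1 / dF_dp2

# Toy characteristic: two "conductances" combining like resistors in series.
F = lambda g1, g2: g1 * g2 / (g1 + g2)
slope = compensation_slope(F, 2.0, 3.0)  # analytic value is -(g2/g1)**2
```

Because the slope depends only on the partial derivatives of the characteristic, the same recipe applies to any smooth model, which is the generality the authors emphasize.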
3. Cell splitting in neural networks extends strong scaling (Hines et al. 2008)
Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing.
4. Collection of simulated data from a thalamocortical network model (Glabska, Chintaluri, Wojcik 2017)
"A major challenge in experimental data analysis is the validation of analytical methods in a fully controlled scenario where the justification of the interpretation can be made directly and not just by plausibility. ... One solution is to use simulations of realistic models to generate ground truth data. In neuroscience, creating such data requires plausible models of neural activity, access to high performance computers, expertise and time to prepare and run the simulations, and to process the output. To facilitate such validation tests of analytical methods we provide rich data sets including intracellular voltage traces, transmembrane currents, morphologies, and spike times. ... The data were generated using the largest publicly available multicompartmental model of thalamocortical network (Traub et al. 2005), with activity evoked by different thalamic stimuli."
5. Composite spiking network/neural field model of Parkinsons (Kerr et al 2013)
This code implements a composite model of Parkinson's disease (PD). The composite model consists of a leaky integrate-and-fire spiking neuronal network model driven by output from a neural field model (instead of the more usual white-noise drive). Three different sets of parameters were used for the field model: one with basal ganglia parameters based on data from healthy individuals, one based on data from individuals with PD, and one purely thalamocortical model. The aim of this model is to explore how the different dynamical patterns in each of these field models affect the activity in the network model.
6. Connection-set Algebra (CSA) for the representation of connectivity in NN models (Djurfeldt 2012)
"The connection-set algebra (CSA) is a novel and general formalism for the description of connectivity in neuronal network models, from small-scale to large-scale structure. ... The expressiveness of CSA makes prototyping of network structure easy. A C++ version of the algebra has been implemented and used in a large-scale neuronal network simulation (Djurfeldt et al., IBM J Res Dev 52(1/2):31–42, 2008b) and an implementation in Python has been publicly released."
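The algebraic flavor of CSA can be conveyed in a few lines. This is an illustrative sketch of the idea only; the names echo CSA's vocabulary but this is not the released `csa` package API:

```python
# Minimal illustration of the connection-set idea (the operator names echo
# CSA's vocabulary, but this is not the released `csa` package API): a
# connection set is a predicate over index pairs (i, j), and algebraic
# operators combine predicates into new connection sets.

full     = lambda i, j: True             # all-to-all
oneToOne = lambda i, j: i == j           # diagonal mask

def minus(a, b):                         # set difference
    return lambda i, j: a(i, j) and not b(i, j)

def intersect(a, b):                     # set intersection
    return lambda i, j: a(i, j) and b(i, j)

def enumerate_pairs(cset, n_pre, n_post):
    """Materialize a finite connection set over given index ranges."""
    return [(i, j) for i in range(n_pre) for j in range(n_post) if cset(i, j)]

# All-to-all without self-connections, in the spirit of CSA's "full - oneToOne".
recurrent = minus(full, oneToOne)
pairs = enumerate_pairs(recurrent, 3, 3)
```

Keeping connectivity as a composable expression rather than an explicit list is what lets the formalism scale from small prototypes to large networks.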
7. Efficient simulation environment for modeling large-scale cortical processing (Richert et al. 2011)
"We have developed a spiking neural network simulator, which is both easy to use and computationally efficient, for the generation of large-scale computational neuroscience models. The simulator implements current or conductance based Izhikevich neuron networks, having spike-timing dependent plasticity and short-term plasticity. ..."
8. Fast population coding (Huys et al. 2007)
"Uncertainty coming from the noise in its neurons and the ill-posed nature of many tasks plagues neural computations. Maybe surprisingly, many studies show that the brain manipulates these forms of uncertainty in a probabilistically consistent and normative manner, and there is now a rich theoretical literature on the capabilities of populations of neurons to implement computations in the face of uncertainty. However, one major facet of uncertainty has received comparatively little attention: time. In a dynamic, rapidly changing world, data are only temporarily relevant. Here, we analyze the computational consequences of encoding stimulus trajectories in populations of neurons. ..."
9. Fully-Asynchronous Cache-Efficient Simulation of Detailed Neural Networks (Magalhaes et al 2019)
"Modern asynchronous runtime systems allow the re-thinking of large-scale scientific applications. With the example of a simulator of morphologically detailed neural networks, we show how detaching from the commonly used bulk-synchronous parallel (BSP) execution allows for the increase of prefetching capabilities, better cache locality, and an overlap of computation and communication, consequently leading to a lower time to solution. Our strategy removes the operation of collective synchronization of ODEs’ coupling information and takes advantage of the pairwise time dependency between equations, leading to a fully-asynchronous, exhaustive yet not speculative stepping model. Combined with fully linear data structures, communication reduction at the compute-node level, and an earliest-equation-steps-first scheduler, we perform an acceleration at the cache level that reduces communication and time to solution by maximizing the number of timesteps taken per neuron at each iteration. Our methods were implemented on the core kernel of the NEURON scientific application. Asynchronicity and distributed memory space are provided by the HPX runtime system for the ParalleX execution model. Benchmark results demonstrate a superlinear speed-up that leads to a reduced runtime compared to the bulk synchronous execution, yielding a speed-up between 25% and 65% across different compute architectures, and on the order of 15% to 40% for distributed executions."
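The "earliest equation steps first" scheduling idea can be sketched with a priority queue. This toy version is a simplified stand-in for the paper's HPX-based implementation (all names here are illustrative): the least-advanced neuron is stepped first, and each neuron may only run ahead of its presynaptic sources by the synaptic delay, mirroring the pairwise time dependency between equations.

```python
import heapq

# Toy "earliest equation steps first" scheduler (a simplified stand-in for
# the paper's HPX-based implementation). The least-advanced neuron is
# stepped first, and each neuron may only run ahead of its presynaptic
# sources by the synaptic delay.

def simulate(n_neurons, deps, delay, dt, t_stop):
    """deps[i]: presynaptic neurons of i. Requires delay >= dt for progress."""
    assert delay >= dt
    t = [0.0] * n_neurons                      # current time of each neuron
    heap = [(0.0, i) for i in range(n_neurons)]
    heapq.heapify(heap)                        # least-advanced neuron first
    while heap:
        _, i = heapq.heappop(heap)
        # pairwise dependency: i may not outrun any source by more than delay
        limit = min((t[p] for p in deps[i]), default=float("inf")) + delay
        while t[i] + dt <= limit and t[i] + dt <= t_stop:
            t[i] += dt                         # advance i's ODEs by one step
        if t[i] + dt <= t_stop:
            heapq.heappush(heap, (t[i], i))    # not finished: reschedule
    return t

# Two mutually coupled neurons, delay of one step.
times = simulate(2, [[1], [0]], delay=1.0, dt=1.0, t_stop=10.0)
```

Maximizing the run of consecutive steps per pop is what improves cache locality: each neuron's state stays hot while it is advanced as far as its dependencies allow.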
10. GLMCC validation neural network model (Kobayashi et al. 2019)
Network model of two populations of randomly connected inhibitory and excitatory neurons to validate method for reconstructing the neural circuitry developed in "Reconstructing Neuronal Circuitry from Parallel Spike Trains" by Ryota Kobayashi, Shuhei Kurita, Anno Kurth, Katsunori Kitano, Kenji Mizuseki, Markus Diesmann, Barry J. Richmond and Shigeru Shinomoto.
11. Graph-theoretical Derivation of Brain Structural Connectivity (Giacopelli et al 2020)
Brain connectivity at the single neuron level can provide fundamental insights into how information is integrated and propagated within and between brain regions. However, it is almost impossible to adequately study this problem experimentally and, despite intense efforts in the field, no mathematical description has been obtained so far. Here, we present a mathematical framework based on a graph-theoretical approach that, starting from experimental data obtained from a few small subsets of neurons, can quantitatively explain and predict the corresponding full network properties. This model also changes the paradigm with which large-scale model networks can be built, from using probabilistic/empiric connections or limited data, to a process that can algorithmically generate neuronal networks connected as in the real system.
12. Hippocampal CA1 NN with spontaneous theta, gamma: full scale & network clamp (Bezaire et al 2016)
This model is a full-scale, biologically constrained rodent hippocampal CA1 network model that includes 9 cell types (pyramidal cells and 8 interneurons) with realistic proportions of each and realistic connectivity between the cells. In addition, the model receives realistic numbers of afferents from artificial cells representing hippocampal CA3 and entorhinal cortical layer III. The model is fully scalable and parallelized so that it can be run at small scale on a personal computer or at large scale on a supercomputer. The model network exhibits spontaneous theta and gamma rhythms without any rhythmic input, and it can be perturbed in a variety of ways to better study the mechanisms of CA1 network dynamics.
13. KInNeSS : a modular framework for computational neuroscience (Versace et al. 2008)
The xml files provided here implement a network of excitatory and inhibitory spiking neurons, governed by either Hodgkin-Huxley or quadratic integrate-and-fire dynamical equations. The code is used to demonstrate the capabilities of the KInNeSS software package for simulation of networks of spiking neurons. The simulation protocol used here is meant to facilitate the comparison of KInNeSS with other simulators reviewed in Brette et al. (2007). See the associated paper "Versace et al. (2008) KInNeSS: a modular framework for computational neuroscience." for an extensive description of KInNeSS.
14. Large scale neocortical model for PGENESIS (Crone et al 2019)
This is model code for a large scale neocortical model based on Traub et al. (2005), modified to run on PGENESIS on supercomputing resources. "In this paper (Crone et al 2019), we evaluate the computational performance of the GEneral NEural SImulation System (GENESIS) for large scale simulations of neural networks. While many benchmark studies have been performed for large scale simulations with leaky integrate-and-fire neurons or neuronal models with only a few compartments, this work focuses on higher fidelity neuronal models represented by 50–74 compartments per neuron. ..."
15. Mean Field Equations for Two-Dimensional Integrate and Fire Models (Nicola and Campbell, 2013)
The zip file contains the files used to perform numerical simulation and bifurcation studies of large networks of two-dimensional integrate and fire neurons and of the corresponding mean field models derived in our paper. The neural models used are the Izhikevich model and the Adaptive Exponential model.
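For reference, the Izhikevich model is a two-dimensional integrate-and-fire system with a fast voltage variable and a slow recovery variable. A minimal forward-Euler sketch with the standard regular-spiking parameters from Izhikevich (2003); the constant input current is an arbitrary choice for illustration:

```python
# Minimal forward-Euler sketch of the two-dimensional Izhikevich model with
# standard regular-spiking parameters (Izhikevich 2003); the constant input
# current I is an arbitrary illustrative choice.

def izhikevich(I=10.0, dt=0.1, t_stop=200.0, a=0.02, b=0.2, c=-65.0, d=8.0):
    v, u = c, b * c                  # membrane potential and recovery variable
    spikes, t = [], 0.0
    while t < t_stop:
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                # spike: reset v, kick the adaptation u
            spikes.append(t)
            v, u = c, u + d
        t += dt
    return spikes

spikes = izhikevich()
```

The discontinuous reset of the two state variables at each spike is exactly what makes mean-field reductions of such networks nontrivial, which is the problem the paper addresses.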
16. Motion Clouds: Synthesis of random textures for motion perception (Leon et al. 2012)
We describe a framework to generate random texture movies with controlled information content. In particular, these stimuli can be made closer to naturalistic textures than usual stimuli such as gratings and random-dot kinetograms. We simplified the definition so that these "Motion Clouds" are parametrically defined around the most prevalent feature axes (mean and bandwidth): direction, spatial frequency, and orientation.
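The principle behind such stimuli can be sketched in one dimension: shape a Gaussian amplitude envelope around a mean frequency, randomize the phases, and invert the spectrum. This is an illustrative sketch only, not the released Motion Clouds toolbox, which builds full 3-D spatiotemporal spectra:

```python
import cmath
import math
import random

# One-dimensional sketch of the Motion Clouds idea (illustration only): a
# Gaussian amplitude envelope around a mean frequency f0 with a given
# bandwidth, random phases, and an inverse transform back to a random
# texture whose frequency content is controlled parametrically.

def motion_cloud_1d(n=64, f0=8.0, bandwidth=2.0, seed=0):
    rng = random.Random(seed)
    spec = [0j] * n
    for k in range(1, n // 2):
        amp = math.exp(-0.5 * ((k - f0) / bandwidth) ** 2)
        phase = rng.uniform(0.0, 2.0 * math.pi)
        spec[k] = amp * cmath.exp(1j * phase)
        spec[n - k] = spec[k].conjugate()      # Hermitian spectrum -> real signal
    # Naive inverse DFT; fine for a sketch at this size.
    return [sum(spec[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

texture = motion_cloud_1d()
```

Changing `f0` and `bandwidth` moves and widens the spectral envelope, which is the parametric control over information content that the entry describes.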
17. NETMORPH: creates NNs with realistic neuron morphologies (Koene et al. 2009, van Ooyen et al. 2014)
NETMORPH is a simulation tool for building synaptically connected networks with realistic neuron morphologies. Axonal and dendritic morphologies are created by using stochastic rules for the behavior of individual growth cones, the structures at the tip of outgrowing axons and dendrites that mediate elongation and branching. Axons and dendrites are not guided by any extracellular cues. Synapses are formed when crossing axonal and dendritic segments come sufficiently close to each other. See the README in the archive for more information.
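The growth-cone approach can be caricatured in a few lines: each cone elongates in a noisily drifting direction and occasionally branches. The rules and parameters below are toy choices for illustration, not NETMORPH's calibrated stochastic rules:

```python
import math
import random

# Caricature of NETMORPH-style stochastic neurite outgrowth (toy rules and
# parameters, not NETMORPH's calibrated ones): each growth cone elongates a
# fixed length per step in a noisily drifting direction, and branches into
# two daughter cones with a small probability.

def grow(steps=50, elong=1.0, wiggle=0.3, p_branch=0.05, seed=1):
    rng = random.Random(seed)
    cones = [(0.0, 0.0, 0.0)]                  # (x, y, heading) per cone
    segments = []                              # grown neurite segments
    for _ in range(steps):
        next_cones = []
        for x, y, h in cones:
            h += rng.gauss(0.0, wiggle)        # drifting outgrowth direction
            nx, ny = x + elong * math.cos(h), y + elong * math.sin(h)
            segments.append(((x, y), (nx, ny)))
            if rng.random() < p_branch:        # branching event
                next_cones += [(nx, ny, h + 0.5), (nx, ny, h - 0.5)]
            else:
                next_cones.append((nx, ny, h))
        cones = next_cones
    return segments

segments = grow()
```

In NETMORPH, connectivity then emerges geometrically: synapses form wherever axonal and dendritic segments grown this way pass sufficiently close to each other.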
18. Networks of spiking neurons: a review of tools and strategies (Brette et al. 2007)
This package provides a series of codes that simulate networks of spiking neurons (excitatory and inhibitory, integrate-and-fire or Hodgkin-Huxley type, current-based or conductance-based synapses; some of them are event-based). The same networks are implemented in different simulators (NEURON, GENESIS, NEST, NCS, CSIM, XPP, SPLIT, MVAspike; there is also a couple of implementations in SciLab and C++). The codes included in this package are benchmark simulations; see the associated review paper (Brette et al. 2007). The main goal is to provide a series of benchmark simulations of networks of spiking neurons, and demonstrate how these are implemented in the different simulators overviewed in the paper. See also details in the enclosed file Appendix2.pdf, which describes these different benchmarks. Some of these benchmarks were based on the Vogels-Abbott model (Vogels TP and Abbott LF 2005).
19. Neural Interactome: interactive simulation of a neuronal system (Kim et al 2019)
"Connectivity and biophysical processes determine the functionality of neuronal networks. We, therefore, developed a real-time framework, called Neural Interactome, to simultaneously visualize and interact with the structure and dynamics of such networks. Neural Interactome is a cross-platform framework, which combines graph visualization with the simulation of neural dynamics, or experimentally recorded multi neural time series, to allow application of stimuli to neurons to examine network responses. In addition, Neural Interactome supports structural changes, such as disconnection of neurons from the network (ablation feature). Neural dynamics can be explored on a single neuron level (using a zoom feature), back in time (using a review feature), and recorded (using presets feature). The development of the Neural Interactome was guided by generic concepts to be applicable to neuronal networks with different neural connectivity and dynamics. We implement the framework using a model of the nervous system of Caenorhabditis elegans (C. elegans) nematode, a model organism with resolved connectome and neural dynamics. We show that Neural Interactome assists in studying neural response patterns associated with locomotion and other stimuli. In particular, we demonstrate how stimulation and ablation help in identifying neurons that shape particular dynamics. We examine scenarios that were experimentally studied, such as touch response circuit, and explore new scenarios that did not undergo elaborate experimental studies."
20. Neural mass model based on single cell dynamics to model pathophysiology (Zandt et al 2014)
Model code as described in "A neural mass model based on single cell dynamics to model pathophysiology" (Zandt et al. 2014, Journal of Computational Neuroscience). A neural mass model (NMM) is derived from single-cell dynamics in a bottom-up approach. Mean and standard deviation of the firing rates in the populations are calculated. The sigmoid is derived from the single-cell FI curve, allowing for easy implementation of pathological conditions. The NMM is compared with a detailed spiking network model consisting of HH neurons. The NMM code is in Matlab; the network model is simulated using Norns (ModelDB #154739).
21. Neuron-based control mechanisms for a robotic arm and hand (Singh et al 2017)
"A robotic arm and hand controlled by simulated neurons is presented. The robot makes use of a biological neuron simulator using a point neural model. ... The robot performs a simple pick-and-place task. ... As another benefit, it is hoped that further work will also lead to a better understanding of human and other animal neural processing, particularly for physical motion. This is a multidisciplinary approach combining cognitive neuroscience, robotics, and psychology."
22. Norns - Neural Network Studio (Visser & Van Gils 2014)
The Norns - Neural Network Studio is a software package for designing, simulating, and analyzing networks of spiking neurons. It consists of three parts: 1. "Urd", a Matlab frontend with high-level functions for quickly defining networks; 2. "Verdandi", an optimized C++ simulation environment which runs the simulation defined by Urd; 3. "Skuld", an advanced Matlab graphical user interface (GUI) for visual inspection of simulated data.
23. Numerical Integration of Izhikevich and HH model neurons (Stewart and Bair 2009)
The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration timestep. We apply the Parker-Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin-Huxley type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods.
24. Parallel network simulations with NEURON (Migliore et al 2006)
The NEURON simulation environment has been extended to support parallel network simulations. The performance of three published network models with very different spike patterns exhibits superlinear speedup on Beowulf clusters.
25. Parallelizing large networks in NEURON (Lytton et al. 2016)
"Large multiscale neuronal network simulations and innovative neurotechnologies are of increasing importance for brain research, and development of these models requires development of new simulation technologies. We describe here the current use of the NEURON simulator with MPI (message passing interface) for simulation in the domain of moderately large networks on commonly available High Performance Computers (HPCs). We discuss the basic layout of such simulations, including the methods of simulation setup, the run-time spike passing paradigm and post-simulation data storage and data management approaches. We also compare three types of networks, ..."
26. Quantitative assessment of computational models for retinotopic map formation (Hjorth et al. 2015)
"Molecular and activity-based cues acting together are thought to guide retinal axons to their terminal sites in vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. ..."
27. Response properties of neocort. neurons to temporally modulated noisy inputs (Koendgen et al. 2008)
Neocortical neurons are classified by their current-frequency relationship. This is a static description and it may be inadequate to interpret neuronal responses to time-varying stimuli. Theoretical studies (Brunel et al., 2001; Fourcaud-Trocmé et al. 2003; Fourcaud-Trocmé and Brunel 2005; Naundorf et al. 2005) suggested that single-cell dynamical response properties are necessary to interpret ensemble responses to fast input transients. Further, it was shown that input noise linearizes and boosts the response bandwidth, and that the interplay between the barrage of noisy synaptic currents and the spike-initiation mechanisms determines the dynamical properties of the firing rate. In order to allow a reader to explore such simulations, we prepared a simple NEURON implementation of the experiments performed in Köndgen et al., 2008 (see also Fourcaud-Trocmé et al. 2003; Fourcaud-Trocmé and Brunel 2005). In addition, we provide sample MATLAB routines for exploring the sandwich model proposed in Köndgen et al., 2008, employing simple frequency-domain filtering. The simulations and the MATLAB routines are based on the linear response properties of layer 5 pyramidal cells estimated by injecting a superposition of a small-amplitude sinusoidal wave and a background noise, as in Köndgen et al., 2008.
28. Single neuron properties shape chaos and signal transmission in random NNs (Muscinelli et al 2019)
"While most models of randomly connected neural networks assume single-neuron models with simple dynamics, neurons in the brain exhibit complex intrinsic dynamics over multiple timescales. We analyze how the dynamical properties of single neurons and recurrent connections interact to shape the effective dynamics in large randomly connected networks. A novel dynamical mean-field theory for strongly connected networks of multi-dimensional rate neurons shows that the power spectrum of the network activity in the chaotic phase emerges from a nonlinear sharpening of the frequency response function of single neurons. For the case of two-dimensional rate neurons with strong adaptation, we find that the network exhibits a state of “resonant chaos”, characterized by robust, narrow-band stochastic oscillations. The coherence of stochastic oscillations is maximal at the onset of chaos and their correlation time scales with the adaptation timescale of single units. Surprisingly, the resonance frequency can be predicted from the properties of isolated neurons, even in the presence of heterogeneity in the adaptation parameters. In the presence of these internally-generated chaotic fluctuations, the transmission of weak, low-frequency signals is strongly enhanced by adaptation, whereas signal transmission is not influenced by adaptation in the non-chaotic regime. Our theoretical framework can be applied to other mechanisms at the level of single neurons, such as synaptic filtering, refractoriness or spike synchronization. These results advance our understanding of the interaction between the dynamics of single units and recurrent connectivity, which is a fundamental step toward the description of biologically realistic neural networks."
29. Spike exchange methods for a Blue Gene/P supercomputer (Hines et al., 2011)
Tests several spike exchange methods on a Blue Gene/P supercomputer on up to 64K cores.
30. Structure-dynamics relationships in bursting neuronal networks revealed (Mäki-Marttunen et al. 2013)
This entry includes tools for generating and analyzing network structure, and for running the neuronal network simulations on them.
31. The microcircuits of striatum in silico (Hjorth et al 2020)
"Our aim is to reconstruct a full-scale mouse striatal cellular level model to provide a framework to integrate and interpret striatal data. We represent the main striatal neuronal subtypes, the two types of projection neurons (dSPNs and iSPNs) giving rise to the direct and indirect pathways, the fast-spiking interneurons, the low threshold spiking interneurons, and the cholinergic interneurons as detailed compartmental models, with properties close to their biological counterparts. Both intrastriatal and afferent synaptic inputs (cortex, thalamus, dopamine system) are optimized against existing data, including short-term plasticity. This model platform will be used to generate new hypotheses on striatal function or network dynamic phenomena."
32. Translating network models to parallel hardware in NEURON (Hines and Carnevale 2008)
Shows how to move a working network model written in NEURON from a serial processor to a parallel machine in such a way that the final result will produce numerically identical results on either serial or parallel hardware.
