Models

1. 3D-printer visualization of NEURON models (McDougal and Shepherd, 2015)

"... We introduce the use of 3D printing as a technique for visualizing traced morphologies. Our method for generating printable versions of a cell or group of cells is to expand dendrite and axon diameters and then to transform the tracing into a 3D object with a neuronal surface generating algorithm like Constructive Tessellated Neuronal Geometry (CTNG). ..."

2. A comparative computer simulation of dendritic morphology (Donohue and Ascoli 2008)

Morphological aspects of dendritic branching, such as branch lengths, taper rates, ratios of daughter radii, and bifurcation probabilities, are measured from real cells. These morphometrics are then resampled to create virtual trees based on the current branch order, radius, path distance to the soma, or a combination of the three.
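As an illustration of the resampling idea, the sketch below grows a toy virtual tree by drawing branch properties from measured distributions conditioned on branch order; all names and numbers here are hypothetical, not values from Donohue and Ascoli.

```python
import random

random.seed(1)

# Hypothetical morphometric samples "measured from real cells",
# keyed by branch order (values are illustrative only).
measured_lengths = {0: [80.0, 95.0, 110.0], 1: [40.0, 55.0], 2: [20.0, 25.0]}
bifurcation_prob = {0: 0.9, 1: 0.5, 2: 0.1}

def grow_virtual_tree(order=0, max_order=2):
    """Recursively grow a virtual dendrite by resampling measured
    morphometrics conditioned on the current branch order."""
    length = random.choice(measured_lengths[order])
    node = {"order": order, "length": length, "children": []}
    if order < max_order and random.random() < bifurcation_prob[order]:
        node["children"] = [grow_virtual_tree(order + 1, max_order)
                            for _ in range(2)]  # binary bifurcation
    return node

tree = grow_virtual_tree()
```

Conditioning on radius or path distance instead of (or in addition to) branch order amounts to swapping the dictionary key for another measured covariate.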

3. A CORF computational model of a simple cell that relies on LGN input (Azzopardi & Petkov 2012)

"... We propose a computational model that uses as afferent inputs the responses of model LGN cells with center-surround receptive fields (RFs) and we refer to it as a Combination of Receptive Fields (CORF) model. We use shifted gratings as test stimuli and simulated reverse correlation to explore the nature of the proposed model. We study its behavior regarding the effect of contrast on its response and orientation bandwidth as well as the effect of an orthogonal mask on the response to an optimally oriented stimulus. We also evaluate and compare the performances of the CORF and GF (Gabor Filter) models regarding contour detection, using two public data sets of images of natural scenes with associated contour ground truths. ... The proposed CORF model is more realistic than the GF model and is more effective in contour detection, which is assumed to be the primary biological role of simple cells."

4. A detailed data-driven network model of prefrontal cortex (Hass et al 2016)

Data-based PFC-like circuit with layers 2/3 and 5, synaptic clustering, four types of interneurons, and cell-type-specific short-term synaptic plasticity; neuron parameters were fitted to in vitro data, and all other parameters were constrained by the experimental literature. Reproduces key features of in vivo resting-state activity without specific tuning.

5. A fast model of voltage-dependent NMDA receptors (Moradi et al. 2013)

These are double- or triple-exponential models of voltage-dependent NMDA receptors. The conductance of these receptors increases in a voltage-dependent manner, with a "Hodgkin and Huxley-type" gating style that also depends on glutamate binding. The time course of gating in response to glutamate also changes with voltage. Temperature sensitivity and desensitization of these receptors are also taken into account. Three previous kinetic models that can simulate the voltage dependence of NMDARs are also ported to NMODL; these models are not temperature sensitive. All models are compatible with the "event delivery system" of NEURON. The parameters reported in our paper are applicable to CA1 pyramidal cell dendrites.
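A minimal sketch of the kind of voltage-dependent NMDA conductance described here, combining a double-exponential time course with the widely used Jahr & Stevens (1990) magnesium-block factor; the parameter values are illustrative and are not the fitted values from this model.

```python
import math

def nmda_current(t, v, gmax=1.0, tau_rise=5.0, tau_decay=50.0,
                 mg=1.0, e_rev=0.0):
    """Double-exponential NMDA conductance (ms, mV, mM) gated by the
    classic Jahr & Stevens (1990) magnesium block; parameters are
    illustrative, not the fitted values of Moradi et al. 2013."""
    if t < 0:
        return 0.0
    gate = math.exp(-t / tau_decay) - math.exp(-t / tau_rise)
    block = 1.0 / (1.0 + mg * math.exp(-0.062 * v) / 3.57)
    return gmax * gate * block * (v - e_rev)

# The block is relieved at depolarized potentials, so the current is
# larger in magnitude at -20 mV than at -70 mV despite the smaller
# driving force:
i_hyper = nmda_current(10.0, -70.0)
i_depol = nmda_current(10.0, -20.0)
```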

6. A finite volume method for stochastic integrate-and-fire models (Marpeau et al. 2009)

"The stochastic integrate and fire neuron is one of the most commonly used stochastic models in neuroscience. Although some cases are analytically tractable, a full analysis typically calls for numerical simulations. We present a fast and accurate finite volume method to approximate the solution of the associated Fokker-Planck equation. ..."

7. A generic MAPK cascade model for random parameter sampling analysis (Mai and Liu 2013)

A generic three-tier MAPK cascade model constructed by comparing previous MAPK models covering a range of biosystems. Pseudo-parameters and random sampling were employed for qualitative analysis. A range of kinetic behaviors of MAPK activation, including ultrasensitivity, bistability, transient activation and oscillation, were successfully reproduced in this generic model. The mechanisms were revealed by statistical analysis of the parameter sets.

8. A set of reduced models of layer 5 pyramidal neurons (Bahl et al. 2012)

These are the NEURON files for 10 different models of a reduced L5 pyramidal neuron. The parameters were obtained by automatically fitting the models to experimental data using a multi-objective evolutionary search strategy. Details on the algorithm can be found at <a href="http://www.g-node.org/emoo">www.g-node.org/emoo</a> and in Bahl et al. (2012).

9. A simplified cerebellar Purkinje neuron (the PPR model) (Brown et al. 2011)

These models were implemented in NEURON by Sherry-Ann Brown in the laboratory of Leslie M. Loew. The files reproduce Figures 2c-f from Brown et al. 2011, "Virtual NEURON: a Strategy For Merged Biochemical and Electrophysiological Modeling".

10. Accurate and fast simulation of channel noise in conductance-based model neurons (Linaro et al 2011)

We introduce and demonstrate a general method to simulate channel noise in conductance-based model neurons with modest computational overhead. Our approach can be considered an accurate generalization of previously proposed methods to the case of voltage-, ion-, and ligand-gated channels of arbitrary complexity. We focus on discrete Markov process descriptions, routinely employed in experimental identification of voltage-gated channels and synaptic receptors.

11. Activity constraints on stable neuronal or network parameters (Olypher and Calabrese 2007)

"In this study, we developed a general description of parameter combinations for which specified characteristics of neuronal or network activity are constant. Our approach is based on the implicit function theorem and is applicable to activity characteristics that smoothly depend on parameters. Such smoothness is often intrinsic to neuronal systems when they are in stable functional states. The conclusions about how parameters compensate each other, developed in this study, can thus be used even without regard to the specific mathematical model describing a particular neuron or neuronal network. ..."
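The core relation can be made explicit. If an activity characteristic A depends smoothly on two parameters p1 and p2 and is held constant, the implicit function theorem gives the local compensation rate (generic notation, not necessarily the paper's):

```latex
A(p_1, p_2) = \text{const}
\quad\Longrightarrow\quad
\frac{dp_2}{dp_1} = -\,\frac{\partial A / \partial p_1}{\partial A / \partial p_2},
\qquad \frac{\partial A}{\partial p_2} \neq 0.
```

In words: along a level set of the activity characteristic, a change in one parameter is compensated by the other at a rate set by the ratio of the two sensitivities.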

12. Allosteric gating of K channels (Horrigan et al 1999)

The conductance of calcium-sensitive large-conductance K (BK) channels is controlled by both cytoplasmic calcium and membrane potential. Experimental data obtained with the inside-out patch method can be understood in terms of a gating scheme in which a central transition between a closed and an open conformation is allosterically regulated by the state of four independent and identical voltage sensors. See the paper for details.

13. Analytical modelling of temperature effects on an AMPA-type synapse (Kufel & Wojcik 2018)

This code was used in the construction of the model developed in the paper. It is a modified version of the simulation developed by Postlethwaite et al. 2007; for details of the modifications, refer to the main body of Kufel & Wojcik (2018).

14. Analyzing neural time series data theory and practice (Cohen 2014)

"This book offers a comprehensive guide to the theory and practice of analyzing electrical brain signals. It explains the conceptual, mathematical, and implementational (via Matlab programming) aspects of time-, time-frequency- and synchronization-based analyses of magnetoencephalography (MEG), electroencephalography (EEG), and local field potential (LFP) recordings from humans and nonhuman animals."

15. AP shape and parameter constraints in optimization of compartment models (Weaver and Wearne 2006)

"... We construct an objective function that includes both time-aligned action potential shape error and errors in firing rate and firing regularity. We then implement a variant of simulated annealing that introduces a recentering algorithm to handle infeasible points outside the boundary constraints. We show how our objective function captures essential features of neuronal firing patterns, and why our boundary management technique is superior to previous approaches."

16. Boolean network-based analysis of the apoptosis network (Mai and Liu 2009)

"To understand the design principles of the molecular interaction network associated with the irreversibility of cell apoptosis and the stability of cell surviving, we constructed a Boolean network integrating both the intrinsic and extrinsic pro-apoptotic pathways with pro-survival signal transduction pathways. We performed statistical analyses of the dependences of cell fate on initial states and on input signals. The analyses reproduced the well-known pro- and anti-apoptotic effects of key external signals and network components. We found that the external GF signal by itself did not change the apoptotic ratio from randomly chosen initial states when there is no external TNF signal, but can significantly offset apoptosis induced by the TNF signal. ..."

17. Brain Dynamics Toolbox (Heitmann & Breakspear 2016, 2017, 2018)

"The Brain Dynamics Toolbox is open-source software for simulating dynamical systems in neuroscience. It is for researchers and students who wish to explore mathematical models of brain function using Matlab. It includes a graphical tool for simulating dynamical systems in real-time as well as command-line tools for scripting large-scale simulations."

18. Brain networks simulators - a comparative study (Tikidji-Hamburyan et al 2017)

" ... In this article, we select the three most popular simulators, as determined by the number of models in the ModelDB database, such as NEURON, GENESIS, and BRIAN, and perform an independent evaluation of these simulators. In addition, we study NEST, one of the lead simulators of the Human Brain Project. First, we study them based on one of the most important characteristics, the range of supported models. Our investigation reveals that brain network simulators may be biased toward supporting a specific set of models. ... we carry out an evaluation using two case studies: a large network with simplified neural and synaptic models and a small network with detailed models. These two case studies allow us to avoid any bias toward a particular software package ..."

19. Cell splitting in neural networks extends strong scaling (Hines et al. 2008)

Neuron tree topology equations can be split into two subtrees and solved on different processors with no change in accuracy, stability, or computational effort; communication costs involve only sending and receiving two double precision values by each subtree at each time step. Application of the cell splitting method to two published network models exhibits good runtime scaling on twice as many processors as could be effectively used with whole-cell balancing.

20. Channel parameter estimation from current clamp and neuronal properties (Toth, Crunelli 2001)

In this paper, we present a method by which the activation and kinetic properties of INa and IK can be estimated from current-clamp data, more precisely from the time course of the action potential, provided some additional electrophysiological properties of the neurone are known a priori. See the reference for further details.

21. Code to calc. spike-trig. ave (STA) conduct. from Vm (Pospischil et al. 2007, Rudolph et al. 2007)

Python code to calculate spike-triggered average (STA) conductances from intracellular recordings, according to the method published by Pospischil et al., J Neurophysiol, 2007. The method consists of a maximum likelihood estimate of the conductance STA, from the voltage STA (which is calculated from the data). The method was tested using models and dynamic-clamp experiments; for details, see the original publication (Pospischil et al., 2007). The first application of this method to experimental data was from intracellular recordings in awake cat cerebral cortex (Rudolph et al., 2007).
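The voltage STA that feeds the maximum-likelihood step is itself just a windowed average of the membrane potential preceding each spike; a toy sketch with a hypothetical trace (not the authors' code):

```python
import numpy as np

def voltage_sta(vm, spike_indices, window):
    """Average the membrane potential in a window preceding each
    spike to obtain the voltage spike-triggered average."""
    segments = [vm[i - window:i] for i in spike_indices if i >= window]
    return np.mean(segments, axis=0)

# Toy trace: -65 mV baseline with a depolarizing ramp before each "spike".
vm = np.full(1000, -65.0)
spikes = [200, 500, 800]
for s in spikes:
    vm[s - 50:s] += np.linspace(0.0, 20.0, 50)  # ramp toward threshold

sta = voltage_sta(vm, spikes, window=50)  # rises from -65 to -45 mV
```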

22. Collection of simulated data from a thalamocortical network model (Glabska, Chintaluri, Wojcik 2017)

"A major challenge in experimental data analysis is the validation of analytical methods in a fully controlled scenario where the justification of the interpretation can be made directly and not just by plausibility. ... One solution is to use simulations of realistic models to generate ground truth data. In neuroscience, creating such data requires plausible models of neural activity, access to high performance computers, expertise and time to prepare and run the simulations, and to process the output. To facilitate such validation tests of analytical methods we provide rich data sets including intracellular voltage traces, transmembrane currents, morphologies, and spike times. ... The data were generated using the largest publicly available multicompartmental model of thalamocortical network (Traub et al. 2005), with activity evoked by different thalamic stimuli."

23. Comparison of full and reduced globus pallidus models (Hendrickson 2010)

In this paper, we studied what features of realistic full model activity patterns can and cannot be preserved by morphologically reduced models. To this end, we reduced the morphological complexity of a full globus pallidus neuron model possessing active dendrites and compared its spontaneous and driven responses to those of the reduced models.

24. Composite spiking network/neural field model of Parkinson's (Kerr et al 2013)

This code implements a composite model of Parkinson's disease (PD). The composite model consists of a leaky integrate-and-fire spiking neuronal network model being driven by output from a neural field model (instead of the more usual white noise drive). Three different sets of parameters were used for the field model: one with basal ganglia parameters based on data from healthy individuals, one based on data from individuals with PD, and one purely thalamocortical model. The aim of this model is to explore how the different dynamical patterns in each of these field models affect the activity in the network model.

25. Connection-set Algebra (CSA) for the representation of connectivity in NN models (Djurfeldt 2012)

"The connection-set algebra (CSA) is a novel and general formalism for the description of connectivity in neuronal network models, from small-scale to large-scale structure. ... The expressiveness of CSA makes prototyping of network structure easy. A C++ version of the algebra has been implemented and used in a large-scale neuronal network simulation (Djurfeldt et al., IBM J Res Dev 52(1/2):31–42, 2008b) and an implementation in Python has been publicly released."

26. Constructed Tessellated Neuronal Geometries (CTNG) (McDougal et al. 2013)

"We present an algorithm to form watertight 3D surfaces consistent with the point-and-diameter based neuronal morphology descriptions widely used with spatial electrophysiology simulators. ... This (point-and-diameter) representation is well-suited for electrophysiology simulations, where the space constants are larger than geometric ambiguities. However, the simple interpretations used for pure electrophysiological simulation produce geometries unsuitable for multi-scale models that also involve three-dimensional reaction–diffusion, as such models have smaller space constants. ... Although one cannot exactly reproduce an original neuron's full shape from point-and-diameter data, our new constructive tessellated neuronal geometry (CTNG) algorithm uses constructive solid geometry to define a plausible reconstruction without gaps or cul-de-sacs. CTNG then uses “constructive cubes” to produce a watertight triangular mesh of the neuron surface, suitable for use in reaction–diffusion simulations. ..."

27. Data-driven, HH-type model of the lateral pyloric (LP) cell in the STG (Nowotny et al. 2008)

This model was developed using voltage clamp data and existing LP models to assemble an initial set of currents which were then adjusted by extensive fitting to a long data set of an isolated LP neuron. The main points of the work are a) automatic fitting is difficult but works when the method is carefully adjusted to the problem (and the initial guess is good enough). b) The resulting model (in this case) made reasonable predictions for manipulations not included in the original data set, e.g., blocking some of the ionic currents. c) The model is reasonably robust against changes in parameters but the different parameters vary a lot in this respect. d) The model is suitable for use in a network and has been used for this purpose (Ivanchenko et al. 2008).

28. Detailed analysis of trajectories in the Morris water maze (Gehring et al. 2015)

MATLAB code that can be used for detailed behavioural analyses of the trajectories of animals by means of a semi-supervised clustering algorithm. The method is applied here to trajectories in the Morris Water Maze (see Gehring, T. V. et al., Scientific Reports, 2015), but the code can easily be adapted to other types of experiments. For more information and the latest version of the code please refer to https://bitbucket.org/tiagogehring/mwm_trajectories

29. Dipole Localization Kit (Mechler & Victor, 2012)

We localize a single neuron from the spatial sample of its extracellular action potential (EAP) amplitudes recorded with a multisite probe (with 6 or more independent measurement sites or channels, e.g., a silicon polytrode, a stepped tetrode, etc.). This is an inverse problem, and we solve it by fitting a model to the EAPs that consists of a volume conductor model of the neural tissue (known), a realistic model of the probe (known), and a single dipole current source of the model neuron (unknown). The dipole is free to change position, size, and orientation (a total of 6 parameters) at each moment during the action potential.
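The forward model for such a fit is typically the far-field potential of a current dipole in a homogeneous volume conductor of conductivity sigma; a standard form (generic notation, not necessarily the paper's exact parameterization) is:

```latex
\phi(\mathbf{r}) \;=\; \frac{\mathbf{p}\cdot(\mathbf{r}-\mathbf{r}_0)}{4\pi\sigma\,\lVert \mathbf{r}-\mathbf{r}_0 \rVert^{3}}
```

where \(\mathbf{p}\) is the dipole moment and \(\mathbf{r}_0\) its location; fitting then amounts to minimizing the mismatch between \(\phi\) evaluated at the known probe sites and the measured EAP amplitudes.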

30. Discrete event simulation in the NEURON environment (Hines and Carnevale 2004)

A short introduction to how "integrate and fire" cells are implemented in NEURON. Network simulations that use only artificial spiking cells are extremely efficient, with runtimes proportional to the total number of synaptic inputs received and independent of the number of cells or problem time.
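The efficiency claim follows because between input events the membrane state of an artificial cell evolves in closed form, so no time steps are needed; a minimal leaky integrate-and-fire sketch of that event-driven style (an illustration of the idea, not NEURON's implementation):

```python
import math

def simulate(events, tau=20.0, threshold=1.0):
    """Event-driven leaky integrate-and-fire: the state decays
    analytically between inputs, so cost scales with the number of
    synaptic events, not with simulated time."""
    v, t_last, spikes = 0.0, 0.0, []
    for t, w in sorted(events):          # (time, weight) input events
        v *= math.exp(-(t - t_last) / tau)  # closed-form decay
        v += w
        t_last = t
        if v >= threshold:
            spikes.append(t)
            v = 0.0                      # reset after the spike
    return spikes

# Two closely spaced inputs summate past threshold; a late one does not.
spikes = simulate([(1.0, 0.6), (2.0, 0.6), (50.0, 0.3)])
```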

31. Distinct current modules shape cellular dynamics in model neurons (Alturki et al 2016)

" ... We hypothesized that currents are grouped into distinct modules that shape specific neuronal characteristics or signatures, such as resting potential, sub-threshold oscillations, and spiking waveforms, for several classes of neurons. For such a grouping to occur, the currents within one module should have minimal functional interference with currents belonging to other modules. This condition is satisfied if the gating functions of currents in the same module are grouped together on the voltage axis; in contrast, such functions are segregated along the voltage axis for currents belonging to different modules. We tested this hypothesis using four published example case models and found it to be valid for these classes of neurons. ..."

32. Distributed computing tool for NEURON, NEURONPM (screensaver) (Calin-Jageman and Katz 2006)

"... To lower the barrier for large-scale model analysis, we have developed NeuronPM, a client/server application that creates a “screen-saver” cluster for running simulations in NEURON (Hines & Carnevale, 1997). ... The NeuronPM client is a Windows-based screen saver, and the NeuronPM server can be hosted on any Apache/PHP/MySQL server. ... Administrative panels make it simple to upload model files, define the parameters and conditions to vary, and then monitor client status and work progress. NeuronPM is open-source freeware and is available for download at http://neuronpm.homeip.net. ..."

33. DynaSim: a MATLAB toolbox for neural modeling and simulation (Sherfey et al 2018)

"DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. ..."

34. Efficient estimation of detailed single-neuron models (Huys et al. 2006)

"Biophysically accurate multicompartmental models of individual neurons ... depend on a large number of parameters that are difficult to estimate. ... We propose a statistical approach to the automatic estimation of various biologically relevant parameters, including 1) the distribution of channel densities, 2) the spatiotemporal pattern of synaptic input, and 3) axial resistances across extended dendrites. ... We demonstrate that the method leads to accurate estimations on a wide variety of challenging model data sets that include up to about 10,000 parameters (roughly two orders of magnitude more than previously feasible) and describe how the method gives insights into the functional interaction of groups of channels."

35. Efficient simulation environment for modeling large-scale cortical processing (Richert et al. 2011)

"We have developed a spiking neural network simulator, which is both easy to use and computationally efficient, for the generation of large-scale computational neuroscience models. The simulator implements current or conductance based Izhikevich neuron networks, having spike-timing dependent plasticity and short-term plasticity. ..."

36. Electrodiffusive astrocytic and extracellular ion concentration dynamics model (Halnes et al. 2013)

An electrodiffusive formalism was developed for computing the dynamics of the membrane potential and ion concentrations in the intra- and extracellular space in a one-dimensional geometry (cable). This (general) formalism was implemented in a model of astrocytes exchanging K+, Na+ and Cl- ions with the extracellular space (ECS). A limited region (0 < x < l/10, where l is the astrocyte length) of the ECS was exposed to an increase in the local K+ concentration. The model is used to explore how astrocytes contribute to transporting K+ out of high-concentration regions via a mechanism known as spatial buffering, which involves local uptake from high-concentration regions, intracellular transport, and release of K+ in regions with lower ECS concentrations.
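Electrodiffusive schemes of this kind are built on the Nernst-Planck flux for each ion species k, combining a diffusive term with field-driven drift (generic notation as a reminder, not the paper's exact equations):

```latex
j_k \;=\; -\,D_k \frac{\partial c_k}{\partial x} \;-\; \frac{D_k z_k F}{RT}\, c_k \frac{\partial \phi}{\partial x}
```

with \(D_k\) the diffusion constant, \(c_k\) the concentration, \(z_k\) the valence, \(\phi\) the electric potential, and \(F\), \(R\), \(T\) the Faraday constant, gas constant, and temperature. Solving for \(\phi\) self-consistently, rather than assuming pure diffusion, is what distinguishes the electrodiffusive formalism.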

37. Evaluation of stochastic diff. eq. approximation of ion channel gating models (Bruce 2009)

Fox and Lu derived an algorithm based on stochastic differential equations for approximating the kinetics of ion channel gating that is simpler and faster than "exact" algorithms for simulating Markov process models of channel gating. However, the approximation may not be sufficiently accurate to predict statistics of action potential generation in some cases. The objective of this study was to develop a framework for analyzing the inaccuracies and determining their origin. Simulations of a patch of membrane with voltage-gated sodium and potassium channels were performed using an exact algorithm for the kinetics of channel gating and the approximate algorithm of Fox & Lu. ... The results indicate that: (i) the source of the inaccuracy is that the Fox & Lu algorithm does not adequately describe the combined behavior of the multiple activation particles in each sodium and potassium channel, and (ii) the accuracy does not improve with increasing numbers of channels.
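The flavor of the Fox & Lu approximation can be seen in a single update: the deterministic Hodgkin-Huxley rate equation plus a Gaussian term whose variance shrinks with channel count (a generic Euler-Maruyama sketch for one gating particle, not the paper's simulation code; rate values are illustrative):

```python
import math
import random

random.seed(0)

def langevin_gate(n, alpha, beta, n_channels, dt):
    """One Euler-Maruyama step of a Fox & Lu-style Langevin
    approximation for a gating variable: drift from the HH rate
    equation plus channel noise scaling as 1/sqrt(N)."""
    drift = alpha * (1.0 - n) - beta * n
    var = (alpha * (1.0 - n) + beta * n) / n_channels
    n += drift * dt + math.sqrt(var * dt) * random.gauss(0.0, 1.0)
    return min(max(n, 0.0), 1.0)  # clip to the valid range [0, 1]

# With alpha == beta the gate fluctuates around the steady state 0.5.
n = 0.0
for _ in range(1000):
    n = langevin_gate(n, alpha=0.1, beta=0.1, n_channels=5000, dt=0.1)
```

Bruce's point is precisely that raising a single noisy particle like this to a power (e.g. n^4 for the potassium channel) does not reproduce the statistics of the exact multi-particle Markov model, and the mismatch persists as `n_channels` grows.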

38. Extracellular fields for a three-dimensional network of cells using NEURON (Appukuttan et al 2017)

" ... In the present work, we demonstrate a technique to couple the extracellular fields of individual cells within the NEURON simulation environment. The existing features of the simulator are extended by explicitly defining current balance equations, resulting in the coupling of the extracellular fields of adjacent cells. ..."

39. Fast population coding (Huys et al. 2007)

"Uncertainty coming from the noise in its neurons and the ill-posed nature of many tasks plagues neural computations. Maybe surprisingly, many studies show that the brain manipulates these forms of uncertainty in a probabilistically consistent and normative manner, and there is now a rich theoretical literature on the capabilities of populations of neurons to implement computations in the face of uncertainty. However, one major facet of uncertainty has received comparatively little attention: time. In a dynamic, rapidly changing world, data are only temporarily relevant. Here, we analyze the computational consequences of encoding stimulus trajectories in populations of neurons. ..."

40. Fully Implicit Parallel Simulation of Single Neurons (Hines et al. 2008)

A 3-d reconstructed neuron model can be simulated in parallel on a dozen or so processors and experience almost linear speedup. Network models can be simulated when there are more processors than cells.

41. GC model (Beining et al 2017)

A companion ModelDB entry (NEURON only) to ModelDB accession number 231862.

42. Generating neuron geometries for detailed 3D simulations using AnaMorph (Morschel et al 2017)

"Generating realistic and complex computational domains for numerical simulations is often a challenging task. In neuroscientific research, more and more one-dimensional morphology data is becoming publicly available through databases. This data, however, only contains point and diameter information not suitable for detailed three-dimensional simulations. In this paper, we present a novel framework, AnaMorph, that automatically generates water-tight surface meshes from one-dimensional point-diameter files. These surface triangulations can be used to simulate the electrical and biochemical behavior of the underlying cell. ..."

43. Generic Bi-directional Real-time Neural Interface (Zrenner et al. 2010)

Matlab/Simulink toolkit for generic multi-channel short-latency bi-directional neural-computer interactions. High-bandwidth (> 10 megabit per second) neural recording data can be analyzed in real-time while simultaneously generating specific complex electrical stimulation feedback with deterministically timed responses at sub-millisecond resolution. The commercially available 60-channel extracellular multi-electrode recording and stimulation set-up (Multichannelsystems GmbH MEA60) is used as an example hardware implementation.

44. Globus pallidus multi-compartmental model neuron with realistic morphology (Gunay et al. 2008)

"Globus pallidus (GP) neurons recorded in brain slices show significant variability in intrinsic electrophysiological properties. To investigate how this variability arises, we manipulated the biophysical properties of GP neurons using computer simulations. ... Our results indicated that most of the experimental variability could be matched by varying conductance densities, which we confirmed with additional partial block experiments. Further analysis resulted in two key observations: (1) each voltage-gated conductance had effects on multiple measures such as action potential waveform and spontaneous or stimulated spike rates; and (2) the effect of each conductance was highly dependent on the background context of other conductances present. In some cases, such interactions could reverse the effect of the density of one conductance on important excitability measures. ..."

45. High-Res. Recordings Using a Real-Time Computational Model of the Electrode (Brette et al. 2008)

"Intracellular recordings of neuronal membrane potential are a central tool in neurophysiology. ... We introduce a computer-aided technique, Active Electrode Compensation (AEC), based on a digital model of the electrode interfaced in real time with the electrophysiological setup. ... AEC should be particularly useful to characterize fast neuronal phenomena intracellularly in vivo."

46. Hippocampal CA1 NN with spontaneous theta, gamma: full scale & network clamp (Bezaire et al 2016)

This model is a full-scale, biologically constrained rodent hippocampal CA1 network model that includes 9 cell types (pyramidal cells and 8 interneurons) with realistic proportions of each and realistic connectivity between the cells. In addition, the model receives realistic numbers of afferents from artificial cells representing hippocampal CA3 and entorhinal cortical layer III. The model is fully scalable and parallelized so that it can be run at small scale on a personal computer or large scale on a supercomputer. The model network exhibits spontaneous theta and gamma rhythms without any rhythmic input. The model network can be perturbed in a variety of ways to better study the mechanisms of CA1 network dynamics. Also see online code at http://bitbucket.org/mbezaire/ca1 and further information at http://mariannebezaire.com/models/ca1

47. Impact of dendritic size and topology on pyramidal cell burst firing (van Elburg and van Ooyen 2010)

The code provided here was written to systematically investigate which of the physical parameters controlled by dendritic morphology underlies the differences in spiking behaviour observed in different realizations of the 'ping-pong' model. Structurally varying dendritic topology and length in a simplified model allows us to separate out the physical parameters, derived from morphology, that underlie burst firing. To perform the parameter scans we created a new NEURON tool, the MultipleRunControl, which can be used to easily set up a parameter scan and write the simulation results to file. Using this code we found that it is not input conductance but the arrival time of the return current, as measured provisionally by the average electrotonic path length, that determines whether the pyramidal cell (with ping-pong model dynamics) will burst or fire single spikes.

48. Impedance spectrum in cortical tissue: implications for LFP signal propagation (Miceli et al. 2017)

" ... Here, we performed a detailed investigation of the frequency dependence of the conductivity within cortical tissue at microscopic distances using small current amplitudes within the typical (neuro)physiological micrometer and sub-nanoampere range. We investigated the propagation of LFPs, induced by extracellular electrical current injections via patch-pipettes, in acute rat brain slice preparations containing the somatosensory cortex in vitro using multielectrode arrays. Based on our data, we determined the cortical tissue conductivity over a 100-fold increase in signal frequency (5-500 Hz). Our results imply at most very weak frequency-dependent effects within the frequency range of physiological LFPs. Using biophysical modeling, we estimated the impact of different putative impedance spectra. Our results indicate that frequency dependencies of the order measured here and in most other studies have negligible impact on the typical analysis and modeling of LFP signals from extracellular brain recordings."

49. | Implementation issues in approximate methods for stochastic Hodgkin-Huxley models (Bruce 2007) | |

Four different algorithms for implementing Hodgkin–Huxley models with stochastic sodium channels are compared: those of Strassberg and DeFelice (1993), Rubinstein (1995), Chow and White (1996), and Fox (1997). | ||
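Of the four, Chow and White (1996) use an exact (Gillespie-style) simulation of channel state transitions; the core idea can be sketched for a single two-state channel (rates and durations here are illustrative, not taken from the paper):

```python
import random

def gillespie_two_state(alpha, beta, t_end, seed=0):
    """Exact stochastic simulation of one channel with
    closed -> open rate alpha and open -> closed rate beta.
    Returns the fraction of time spent open."""
    rng = random.Random(seed)
    t, state, t_open = 0.0, 0, 0.0      # state 0 = closed, 1 = open
    while t < t_end:
        rate = alpha if state == 0 else beta
        dwell = rng.expovariate(rate)   # exponentially distributed dwell time
        dwell = min(dwell, t_end - t)   # clip the final interval
        if state == 1:
            t_open += dwell
        t += dwell
        state = 1 - state               # flip state
    return t_open / t_end

# With alpha = beta, the open probability should approach 0.5.
p_open = gillespie_two_state(alpha=1.0, beta=1.0, t_end=5000.0)
```

The approximate algorithms compared in the paper trade this exactness for speed, which is what makes their implementation details matter.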

50. | Increased computational accuracy in multi-compartmental cable models (Lindsay et al. 2005) | |

Compartmental models of dendrites are the most widely used tool for investigating their electrical behaviour. Traditional models assign a single potential to a compartment. This potential is associated with the membrane potential at the centre of the segment represented by the compartment. All input to that segment, independent of its location on the segment, is assumed to act at the centre of the segment with the potential of the compartment. By contrast, the compartmental model introduced in this article assigns a potential to each end of a segment, and takes into account the location of input to a segment on the model solution by partitioning the effect of this input between the axial currents at the proximal and distal boundaries of segments. For a given neuron, the new and traditional approaches to compartmental modelling use the same number of locations at which the membrane potential is to be determined, and lead to ordinary differential equations that are structurally identical. However, the solution achieved by the new approach gives an order of magnitude better accuracy and precision than that achieved by the latter in the presence of point process input. | ||

51. | Inferring connection proximity in electrically coupled networks (Cali et al. 2007) | |

In order to explore electrical coupling in the nervous system and its network-level organization, it is imperative to map the electrical synaptic microcircuits, in analogy with in vitro studies on monosynaptic and disynaptic chemical coupling. However, walking from cell to cell over large distances with a glass pipette is challenging, and microinjection of (fluorescent) dyes diffusing through gap-junctions remains so far the only method available to decipher such microcircuits even though technical limitations exist. Based on circuit theory, we derived analytical descriptions of the AC electrical coupling in networks of isopotential cells. We then proposed an operative electrophysiological protocol to distinguish between direct electrical connections and connections involving one or more intermediate cells. This method allows inferring the number of intermediate cells, generalizing the conventional coupling coefficient, which provides limited information. We provide here some of the analysis and simulation scripts that we used to test our method through computer simulations, in vitro recordings, and theoretical and numerical methods. Key words: Gap-Junctions; Electrical Coupling; Networks; ZAP current; Impedance. | ||
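At DC, the conventional coupling coefficient mentioned above is just the voltage-deflection ratio V2/V1 in a resistive two-cell circuit; a minimal sketch (illustrative resistance values, not from the paper) shows how it is computed and why it carries limited information: a weak direct junction and a strong indirect pathway can yield the same number.

```python
def coupling_coefficient(r1, r2, rj, i_inj=1.0):
    """Solve the two-node resistive circuit: node 1 (injected) and
    node 2, each with a membrane resistance to ground (r1, r2) and
    a gap-junction resistance rj between them.
    Returns V2 / V1, the conventional coupling coefficient."""
    g11 = 1.0 / r1 + 1.0 / rj
    g22 = 1.0 / r2 + 1.0 / rj
    g12 = -1.0 / rj
    det = g11 * g22 - g12 * g12
    v1 = (g22 * i_inj) / det     # voltage in the injected cell
    v2 = (-g12 * i_inj) / det    # voltage in the follower cell
    return v2 / v1

# A weak direct junction: V2/V1 = r2 / (r2 + rj) = ~0.1
direct_weak = coupling_coefficient(r1=100.0, r2=100.0, rj=900.0)
```

The paper's frequency-dependent (ZAP/impedance) protocol generalizes this DC measurement precisely to break such degeneracies.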

52. | Ion channel modeling with whole cell and a genetic algorithm (Gurkiewicz and Korngreen 2007) | |

"... Here we show that a genetic search algorithm in combination with a gradient descent algorithm can be used to fit whole-cell voltage-clamp data to kinetic models with a high degree of accuracy. Previously, ion channel stimulation traces were analyzed one at a time, the results of these analyses being combined to produce a picture of channel kinetics. Here the entire set of traces from all stimulation protocols is analysed simultaneously. The algorithm was initially tested on simulated current traces produced by several Hodgkin-Huxley–like and Markov chain models of voltage-gated potassium and sodium channels. ... Finally, the algorithm was used for finding the kinetic parameters of several voltage-gated sodium and potassium channel models by matching its results to data recorded from layer 5 pyramidal neurons of the rat cortex in the nucleated outside-out patch configuration. The minimization scheme gives electrophysiologists a tool for reproducing and simulating voltage-gated ion channel kinetics at the cellular level." | ||
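As a toy illustration of the genetic-search stage only (not the authors' algorithm: a single parameter, synthetic data, and no gradient-descent refinement):

```python
import math
import random

def target_trace(tau, ts):
    """Single-exponential decay, standing in for a current trace."""
    return [math.exp(-t / tau) for t in ts]

def fitness(tau_guess, ts, data):
    """Negative sum-of-squares error between model and data."""
    model = target_trace(tau_guess, ts)
    return -sum((m - d) ** 2 for m, d in zip(model, data))

def genetic_fit(ts, data, pop_size=40, generations=60, seed=1):
    """Minimal genetic search: keep the best half of the population,
    refill it by mutating the survivors with Gaussian noise."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.1, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda tau: fitness(tau, ts, data), reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [max(0.01, s + rng.gauss(0.0, 0.2))
                           for s in survivors]
    return max(pop, key=lambda tau: fitness(tau, ts, data))

ts = [0.1 * i for i in range(100)]
data = target_trace(2.5, ts)      # synthetic "recorded" trace, tau = 2.5
tau_hat = genetic_fit(ts, data)
```

The paper's key point, fitting all stimulation protocols simultaneously, corresponds to summing the error over every trace inside the fitness function rather than fitting traces one at a time.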

53. | KInNeSS: a modular framework for computational neuroscience (Versace et al. 2008) | |

The xml files provided here implement a network of excitatory and inhibitory spiking neurons, governed by either Hodgkin-Huxley or quadratic integrate-and-fire dynamical equations. The code is used to demonstrate the capabilities of the KInNeSS software package for simulation of networks of spiking neurons. The simulation protocol used here is meant to facilitate the comparison of KInNeSS with other simulators reviewed in <a href="http://dx.doi.org/10.1007/s10827-007-0038-6">Brette et al. (2007)</a>. See the associated paper "Versace et al. (2008) KInNeSS: a modular framework for computational neuroscience." for an extensive description of KInNeSS. | ||

54. | Local variable time step method (Lytton, Hines 2005) | |

The local variable time-step method utilizes separate variable step integrators for individual neurons in the network. It is most suitable for medium size networks in which average synaptic input intervals to a single cell are much greater than a fixed step dt. | ||

55. | Mapping function onto neuronal morphology (Stiefel and Sejnowski 2007) | |

"... We used an optimization procedure to find neuronal morphological structures for two computational tasks: First, neuronal morphologies were selected for linearly summing excitatory synaptic potentials (EPSPs); second, structures were selected that distinguished the temporal order of EPSPs. The solutions resembled the morphology of real neurons. In particular the neurons optimized for linear summation electrotonically separated their synapses, as found in avian nucleus laminaris neurons, and neurons optimized for spike-order detection had primary dendrites of significantly different diameter, as found in the basal and apical dendrites of cortical pyramidal neurons. ..." | ||

56. | Markov Chain-based Stochastic Shielding Hodgkin Huxley Model (Schmandt, Galan 2012) | |

57. | MATLAB for brain and cognitive scientists (Cohen 2017) | |

" ... MATLAB for Brain and Cognitive Scientists takes readers from beginning to intermediate and advanced levels of MATLAB programming, helping them gain real expertise in applications that they will use in their work. The book offers a mix of instructive text and rigorous explanations of MATLAB code along with programming tips and tricks. The goal is to teach the reader how to program data analyses in neuroscience and psychology. Readers will learn not only how to but also how not to program, with examples of bad code that they are invited to correct or improve. Chapters end with exercises that test and develop the skills taught in each chapter. Interviews with neuroscientists and cognitive scientists who have made significant contributions to their field using MATLAB appear throughout the book. ..." | ||

58. | Mature and young adult-born dentate granule cell models (T2N interface) (Beining et al. 2017) | |

"... Here, we present T2N, a powerful interface to control NEURON with Matlab and TREES toolbox, which supports generating models stable over a broad range of reconstructed and synthetic morphologies. We illustrate this for a novel, highly-detailed active model of dentate granule cells (GCs) replicating a wide palette of experiments from various labs. By implementing known differences in ion channel composition and morphology, our model reproduces data from mouse or rat, mature or adult-born GCs as well as pharmacological interventions and epileptic conditions. ... T2N is suitable for creating robust models useful for large-scale networks that could lead to novel predictions. ..." See modeldb accession number 231818 for NEURON-only code. | ||

59. | Mean Field Equations for Two-Dimensional Integrate and Fire Models (Nicola and Campbell, 2013) | |

The zip file contains the files used to perform numerical simulation and bifurcation studies of large networks of two-dimensional integrate and fire neurons and of the corresponding mean field models derived in our paper. The neural models used are the Izhikevich model and the Adaptive Exponential model. | ||

60. | Measuring neuronal identification quality in ensemble recordings (isoitools) (Neymotin et al. 2011) | |

"... Here we describe information theoretic measures of action potential waveform isolation applicable to any dataset, that have an intuitive, universal interpretation, and that are not dependent on the methods or choice of parameters for single unit isolation, and that have been validated using a dataset." | ||

61. | Method for counting motor units in mice (Major et al 2007) | |

"... Our goal was to develop an efficient method to determine the number of motor neurons making functional connections to muscle in a transgenic mouse model of amyotrophic lateral sclerosis (ALS). We developed a novel protocol for motor unit number estimation (MUNE) using incremental stimulation. The method involves analysis of twitch waveforms using a new software program, ITS-MUNE, designed for interactive calculation of motor unit number. The method was validated by testing simulated twitch data from a mathematical model of the neuromuscular system. Computer simulations followed the same stimulus-response protocol and produced waveform data that were indistinguishable from experiments. ... The ITS-MUNE analysis method has the potential to quantitatively measure the progression of motor neuron diseases and therefore the efficacy of treatments designed to alleviate pathologic processes of muscle denervation." The software is available for download under the "ITS-MUNE software" link (see below for links). | ||

62. | Method for deriving general HH neuron model's spiking input-output relation (Soudry & Meir 2014) | |

In the paper we derived a method to find semi-analytic input-output relations for general HH-like neuron models (firing rates, spectra, linear filters) under sparse spike stimulation. Here we demonstrate the applicability of this method to various HH-type models (HH with slow sodium inactivation, with slow potassium inactivation, with synaptic STD, and various other extensions). | ||

63. | Method of probabilistic principle surfaces (PPS) (Chang and Ghosh 2001) | |

Principal curves and surfaces are nonlinear generalizations of principal components and subspaces, respectively. They can provide an insightful summary of high-dimensional data not typically attainable by classical linear methods. See the paper for details. The MATLAB code supplied at the authors' website calculates probabilistic principal surfaces on benchmark data sets. | ||

64. | Model predictive control model for an isometric motor task (Ueyama 2017) | |

A model predictive control model for an isometric motor task. | ||

65. | Modeling single neuron LFPs and extracellular potentials with LFPsim (Parasuram et al. 2016) | |

LFPsim - Simulation scripts to compute Local Field Potentials (LFP) from cable compartmental models of neurons and networks implemented in the NEURON simulation environment. | ||

66. | ModelView: online structural analysis of computational models (McDougal et al. 2015) | |

" ... To aid users, we have developed ModelView, a web application for NEURON models in ModelDB that presents a graphical view of model structure augmented with contextual information. Web presentation provides a rich, simulator-independent environment for interacting with graphs. The necessary data is generated by combining manual curation, text-mining the source code, querying ModelDB, and simulator introspection. ... With this tool, researchers can examine the structure of hundreds of models in ModelDB in a standardized presentation without installing any software, downloading the model, or reading model source code." | ||

67. | ModFossa: a library for modeling ion channels using Python (Ferneyhough et al 2016) | |

68. | Moose/PyMOOSE: interoperable scripting in Python for MOOSE (Ray and Bhalla 2008) | |

" ... We report the integration of Python scripting with the Multi-scale Object Oriented Simulation Environment (MOOSE). MOOSE is a general-purpose simulation system for compartmental neuronal models and for models of signaling pathways based on chemical kinetics. We show how the Python-scripting version of MOOSE, PyMOOSE, combines the power of a compiled simulator with the versatility and ease of use of Python. ... " | ||

69. | Motion Clouds: Synthesis of random textures for motion perception (Leon et al. 2012) | |

We describe a framework to generate random texture movies with controlled information content. In particular, these stimuli can be made closer to naturalistic textures compared to usual stimuli such as gratings and random-dot kinetograms. We simplified the definition to parametrically define these "Motion Clouds" around the most prevalent feature axes (mean and bandwidth): direction, spatial frequency, and orientation. | ||

70. | Motoneuron simulations for counting motor units (Major and Jones 2005) | |

Simulations of clinical methods to count the number of motoneurons/motor units in human patients. Models include stimulation of motor axons or voluntary activation and responses are measured as muscle tension or EMG. | ||

71. | NETMORPH: creates NNs with realistic neuron morphologies (Koene et al. 2009, van Ooyen et al. 2014) | |

NETMORPH is a simulation tool for building synaptically connected networks with realistic neuron morphologies. Axonal and dendritic morphologies are created by using stochastic rules for the behavior of individual growth cones, the structures at the tip of outgrowing axons and dendrites that mediate elongation and branching. Axons and dendrites are not guided by any extracellular cues. Synapses are formed when crossing axonal and dendritic segments come sufficiently close to each other. See the README in the archive for more information. | ||

72. | Networks of spiking neurons: a review of tools and strategies (Brette et al. 2007) | |

This package provides a series of codes that simulate networks of spiking neurons (excitatory and inhibitory, integrate-and-fire or Hodgkin-Huxley type, current-based or conductance-based synapses; some of them are event-based). The same networks are implemented in different simulators (NEURON, GENESIS, NEST, NCS, CSIM, XPP, SPLIT, MVAspike; there are also a couple of implementations in SciLab and C++). The codes included in this package are benchmark simulations; see the associated review paper (Brette et al. 2007). The main goal is to provide a series of benchmark simulations of networks of spiking neurons, and demonstrate how these are implemented in the different simulators overviewed in the paper. See also details in the enclosed file Appendix2.pdf, which describes these different benchmarks. Some of these benchmarks were based on the Vogels-Abbott model (Vogels TP and Abbott LF 2005). | ||

73. | Neural mass model based on single cell dynamics to model pathophysiology (Zandt et al 2014) | |

The model code as described in "A neural mass model based on single cell dynamics to model pathophysiology" (Zandt et al. 2014, Journal of Computational Neuroscience). A neural mass model (NMM) is derived from single-cell dynamics in a bottom-up approach. Mean and standard deviation of the firing rates in the populations are calculated. The sigmoid is derived from the single-cell FI curve, allowing for easy implementation of pathological conditions. The NMM is compared with a detailed spiking network model consisting of HH neurons. The NMM code is in Matlab. The network model is simulated using Norns (ModelDB # 154739). | ||

74. | Neural Query System NQS Data-Mining From Within the NEURON Simulator (Lytton 2006) | |

NQS is a databasing program with a query command modeled loosely on the SQL select command. Please see the manual NQS.pdf for details of use. An NQS database must be populated with data to be used. This package includes MFP (model fingerprint) which provides an example of NQS use with the model provided in the modeldb folder (see readme for usage). | ||

75. | NEUROFIT: fitting HH models to voltage clamp data (Willms 2002) | |

Publicly available software for accurate fitting of Hodgkin-Huxley models to voltage-clamp data... The set of parameter values for the model determined by this software yields current traces that are substantially closer to the observed data than those determined from the usual fitting method. This improvement is due to the fact that the software fits all of the parameters simultaneously utilizing all of the data rather than fitting steady-state and time constant parameters disjointly using peak currents and portions of the rising and falling phases... The software also incorporates a linear pre-estimation procedure to help in determining reasonable initial values for the full non-linear algorithm. See the references for details. | ||

76. | NeuroManager: a workflow analysis based simulation management engine (Stockton & Santamaria 2015) | |

"We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. ..." | ||

77. | NEURON + Python (Hines et al. 2009) | |

The NEURON simulation program now allows Python to be used alone or in combination with NEURON's traditional Hoc interpreter. Adding Python to NEURON has the immediate benefit of making available a very extensive suite of analysis tools written for engineering and science. It also catalyzes NEURON software development by offering users a modern programming tool that is recognized for its flexibility and power to create and maintain complex programs. At the same time, nothing is lost because all existing models written in Hoc, including GUI tools, continue to work without change and are also available within the Python context. An example of the benefits of Python availability is the use of the xml module in implementing NEURON's Import3D and CellBuild tools to read MorphML and NeuroML model specifications. | ||

78. | NEURON interfaces to MySQL and the SPUD feature extraction algorithm (Neymotin et al. 2008) | |

See the readme.txt for information on setting up this interface to a MySQL server from the NEURON simulator. Note the SPUD feature extraction algorithm includes its own readme in the spud directory. | ||

79. | Neuron-based control mechanisms for a robotic arm and hand (Singh et al 2017) | |

"A robotic arm and hand controlled by simulated neurons is presented. The robot makes use of a biological neuron simulator using a point neural model. ... The robot performs a simple pick-and-place task. ... As another benefit, it is hoped that further work will also lead to a better understanding of human and other animal neural processing, particularly for physical motion. This is a multidisciplinary approach combining cognitive neuroscience, robotics, and psychology." | ||

80. | Neuronvisio: a gui with 3D capabilities for NEURON (Mattioni et al. 2012) | |

"The NEURON simulation environment is a commonly used tool to perform electrical simulation of neurons and neuronal networks. The NEURON User Interface, based on the now discontinued InterViews library, provides some limited facilities to explore models and to plot their simulation results. Other limitations include the inability to generate a three-dimensional visualization, no standard mean to save the results of simulations, or to store the model geometry within the results. Neuronvisio (http://neuronvisio.org) aims to address these deficiencies through a set of well designed python APIs and provides an improved UI, allowing users to explore and interact with the model. ..." | ||

81. | Norns - Neural Network Studio (Visser & Van Gils 2014) | |

The Norns - Neural Network Studio is a software package for designing, simulating, and analyzing networks of spiking neurons. It consists of three parts: 1. "Urd": a Matlab frontend with high-level functions for quickly defining networks; 2. "Verdandi": an optimized C++ simulation environment which runs the simulation defined by Urd; 3. "Skuld": an advanced Matlab graphical user interface (GUI) for visual inspection of simulated data. | ||

82. | Numerical Integration of Izhikevich and HH model neurons (Stewart and Bair 2009) | |

The Parker-Sochacki method is a new technique for the numerical integration of differential equations applicable to many neuronal models. Using this method, the solution order can be adapted according to the local conditions at each time step, enabling adaptive error control without changing the integration timestep. We apply the Parker-Sochacki method to the Izhikevich ‘simple’ model and a Hodgkin-Huxley type neuron, comparing the results with those obtained using the Runge-Kutta and Bulirsch-Stoer methods. | ||
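The essence of the Parker-Sochacki method is to generate Taylor-series coefficients of the solution iteratively and raise the order at each step until a tolerance is met; a sketch for the linear test equation y' = -y (not one of the paper's neuron models) looks like:

```python
def ps_step(y0, dt, tol=1e-12, max_order=30):
    """One Parker-Sochacki step for y' = -y.
    Taylor coefficients obey a[k+1] = -a[k] / (k + 1); the variable
    `a` holds a[k] * dt**k, and terms are summed until one drops
    below tol, giving adaptive order without changing dt."""
    a = y0
    y = a
    for k in range(max_order):
        a = -a * dt / (k + 1)   # next coefficient times dt**(k+1)
        y += a
        if abs(a) < tol:
            break
    return y

def integrate(y0, t_end, dt):
    """Advance from 0 to t_end in fixed steps of dt."""
    y, t = y0, 0.0
    while t < t_end - 1e-15:
        y = ps_step(y, min(dt, t_end - t))
        t += dt
    return y

y = integrate(1.0, 1.0, 0.25)   # approximates exp(-1)
```

For Izhikevich or Hodgkin-Huxley models the right-hand sides are first recast in polynomial form, after which the coefficient recurrence is generated the same way.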

83. | On stochastic diff. eq. models for ion channel noise in Hodgkin-Huxley neurons (Goldwyn et al. 2010) | |

" ... We analyze three SDE models that have been proposed as approximations to the Markov chain model: one that describes the states of the ion channels and two that describe the states of the ion channel subunits. We show that the former channel-based approach can capture the distribution of channel noise and its effect on spiking in a Hodgkin-Huxley neuron model to a degree not previously demonstrated, but the latter two subunit-based approaches cannot. ..." | ||
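A channel-based SDE of this general kind can be illustrated with Euler-Maruyama integration of the open fraction of a two-state channel population (the equation below is a generic Langevin approximation with illustrative parameters, not the paper's exact formulation):

```python
import math
import random

def channel_sde(alpha, beta, n_channels, t_end, dt, seed=0):
    """Euler-Maruyama integration of the open fraction x of a
    two-state channel population:
    dx = (alpha*(1-x) - beta*x) dt + sqrt((alpha*(1-x) + beta*x)/N) dW.
    The noise term shrinks as 1/sqrt(N), vanishing in the
    deterministic (infinite-channel) limit.
    Returns the time-averaged open fraction."""
    rng = random.Random(seed)
    x = alpha / (alpha + beta)          # start at the deterministic fixed point
    t, acc, n = 0.0, 0.0, 0
    while t < t_end:
        drift = alpha * (1.0 - x) - beta * x
        diff = math.sqrt(max(0.0, (alpha * (1.0 - x) + beta * x) / n_channels))
        x += drift * dt + diff * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        x = min(1.0, max(0.0, x))       # keep the fraction in [0, 1]
        acc += x
        n += 1
        t += dt
    return acc / n

mean_open = channel_sde(alpha=1.0, beta=1.0, n_channels=1000,
                        t_end=200.0, dt=0.01)
```

The paper's comparison concerns where such fluctuation terms are attached: to whole-channel states (which reproduces Markov-chain statistics well) versus to individual subunit variables (which does not).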

84. | Oversampling method to extract excitatory and inhibitory conductances (Bedard et al. 2012) | |

" ... We present here a new method that allows extracting estimates of the full time course of excitatory and inhibitory conductances from single-trial Vm recordings. This method is based on oversampling of the Vm . We test the method numerically using models of increasing complexity. Finally, the method is evaluated using controlled conductance injection in cortical neurons in vitro using the dynamic-clamp technique. ..." | ||

85. | Parallel network simulations with NEURON (Migliore et al 2006) | |

The NEURON simulation environment has been extended to support parallel network simulations. The performance of three published network models with very different spike patterns exhibits superlinear speedup on Beowulf clusters. | ||

86. | Parallel STEPS: Large scale stochastic spatial reaction-diffusion simulat. (Chen & De Schutter 2017) | |

" ... In this paper, we describe an MPI-based, parallel operator-splitting implementation for stochastic spatial reaction-diffusion simulations with irregular tetrahedral meshes. The performance of our implementation is first examined and analyzed with simulations of a simple model. We then demonstrate its application to real-world research by simulating the reaction-diffusion components of a published calcium burst model in both Purkinje neuron sub-branch and full dendrite morphologies..." | ||

87. | Parallelizing large networks in NEURON (Lytton et al. 2016) | |

"Large multiscale neuronal network simulations are required to interpret data from innovative neurotechnologies, and development of these models requires new simulation technologies. We describe here the current use of the NEURON simulator with MPI (message passing interface) for simulation in the domain of moderately large networks on commonly available High Performance Computers (HPCs). We discuss the basic layout of such simulations, including the methods of simulation setup, the run-time spike passing paradigm and post-simulation data storage and data management approaches. We also compare three types of networks, ..." | ||

88. | Phase-locking analysis with transcranial magneto-acoustical stimulation (Yuan et al 2017) | |

"Transcranial magneto-acoustical stimulation (TMAS) uses ultrasonic waves and a static magnetic field to generate electric current in nerve tissues for the purpose of modulating neuronal activities. It has the advantage of high spatial resolution and penetration depth. Neuronal firing rhythms carry and transmit nerve information in neural systems. In this study, we investigated the phase-locking characteristics of neuronal firing rhythms with TMAS based on the Hodgkin-Huxley neuron model. The simulation results indicate that the modulation frequency of ultrasound can affect the phase-locking behaviors. The results of this study may help us to explain the potential firing mechanism of TMAS." | ||

89. | Properties of aconitine-induced block of KDR current in NG108-15 neurons (Lin et al. 2008) | |

"The effects of aconitine (ACO), a highly toxic alkaloid, on ion currents in differentiated NG108-15 neuronal cells were investigated in this study. ACO (0.3-30 microM) suppressed the amplitude of delayed rectifier K+ current (IK(DR)) in a concentration-dependent manner with an IC50 value of 3.1 microM. The presence of ACO enhanced the rate and extent of IK(DR) inactivation, although it had no effect on the initial activation phase of IK(DR). ... A modeled cell was designed to duplicate its inhibitory effect on spontaneous pacemaking. ... Taken together, the experimental data and simulations show that ACO can block delayed rectifier K+ channels of neurons in a concentration- and state-dependent manner. Changes in action potentials induced by ACO in neurons in vivo can be explained mainly by its blocking actions on IK(DR) and INa." | ||

90. | PyRhO: A multiscale optogenetics simulation platform (Evans et al 2016) | |

"... we present an integrated suite of open-source, multi-scale computational tools called PyRhO. The purpose of developing PyRhO is three-fold: (i) to characterize new (and existing) opsins by automatically fitting a minimal set of experimental data to three-, four-, or six-state kinetic models, (ii) to simulate these models at the channel, neuron and network levels, and (iii) provide functional insights through model selection and virtual experiments in silico. The module is written in Python with an additional IPython/Jupyter notebook based GUI, allowing models to be fit, simulations to be run and results to be shared through simply interacting with a webpage. The seamless integration of model fitting algorithms with simulation environments (including NEURON and Brian2) for these virtual opsins will enable neuroscientists to gain a comprehensive understanding of their behavior and rapidly identify the most suitable variant for application in a particular biological system. ..." | ||

91. | Python demo of the VmT method to extract conductances from single Vm traces (Pospischil et al. 2009) | |

This python code implements a method to estimate synaptic conductances from single membrane potential traces (the "VmT method"), as described in Pospischil et al. (2009). The method uses a maximum likelihood procedure and was successfully tested using models and dynamic-clamp experiments in vitro (see paper for details). | ||

92. | Quantitative assessment of computational models for retinotopic map formation (Hjorth et al. 2015) | |

"Molecular and activity-based cues acting together are thought to guide retinal axons to their terminal sites in vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. ..." | ||

93. | Recording from rod bipolar axon terminals in situ (Oltedal et al 2007) | |

"... Whole cell recordings from axon terminals and cell bodies were used to investigate the passive membrane properties of rod bipolar cells and analyzed with a two-compartment equivalent electrical circuit model developed by Mennerick et al. For both terminal- and soma-end recordings, capacitive current decays were well fitted by biexponential functions. Computer simulations of simplified models of rod bipolar cells demonstrated that estimates of the capacitance of the axon terminal compartment can depend critically on the recording location, with terminal-end recordings giving the best estimates. Computer simulations and whole cell recordings demonstrated that terminal-end recordings can yield more accurate estimates of the peak amplitude and kinetic properties of postsynaptic currents generated at the axon terminals due to increased electrotonic filtering of these currents when recorded at the soma. ..." See the paper for details. | ||

94. | Reduction of nonlinear ODE systems possessing multiple scales (Clewley et al. 2005) | |

" ... We introduce a combined numerical and analytical technique that aids the identification of structure in a class of systems of nonlinear ordinary differential equations (ODEs) that are commonly applied in dynamical models of physical processes. ... These methods have been incorporated into a new software tool named Dssrt, which we demonstrate on a limit cycle of a synaptically driven Hodgkin–Huxley neuron model." | ||

95. | Response properties of neocort. neurons to temporally modulated noisy inputs (Koendgen et al. 2008) | |

Neocortical neurons are classified by their current–frequency relationship. This is a static description and it may be inadequate to interpret neuronal responses to time-varying stimuli. Theoretical studies (Brunel et al., 2001; Fourcaud-Trocmé et al. 2003; Fourcaud-Trocmé and Brunel 2005; Naundorf et al. 2005) suggested that single-cell dynamical response properties are necessary to interpret ensemble responses to fast input transients. Further, it was shown that input noise linearizes and boosts the response bandwidth, and that the interplay between the barrage of noisy synaptic currents and the spike-initiation mechanisms determines the dynamical properties of the firing rate. In order to allow a reader to explore such simulations, we prepared a simple NEURON implementation of the experiments performed in Köndgen et al., 2008 (see also Fourcaud-Trocmé et al. 2003; Fourcaud-Trocmé and Brunel 2005). In addition, we provide sample MATLAB routines for exploring the sandwich model proposed in Köndgen et al., 2008, employing simple frequency-domain filtering. The simulations and the MATLAB routines are based on the linear response properties of layer 5 pyramidal cells estimated by injecting a superposition of a small-amplitude sinusoidal wave and a background noise, as in Köndgen et al., 2008. | ||

96. | Reverse-time correlation analysis for idealized orientation tuning dynamics (Kovacic et al. 2008) | |

"A theoretical analysis is presented of a reverse-time correlation method used in experimentally investigating orientation tuning dynamics of neurons in the primary visual cortex. An exact mathematical characterization of the method is developed, and its connection with the Volterra–Wiener nonlinear systems theory is described. Various mathematical consequences and possible physiological implications of this analysis are illustrated using exactly solvable idealized models of orientation tuning." | ||

97. | Simulating ion channel noise in an auditory brainstem neuron model (Schmerl & McDonnell 2013) | |

" ... Here we demonstrate that biophysical models of channel noise can give rise to two kinds of recently discovered stochastic facilitation effects in a Hodgkin-Huxley-like model of auditory brainstem neurons. The first, known as slope-based stochastic resonance (SBSR), enables phasic neurons to emit action potentials that can encode the slope of inputs that vary slowly relative to key time constants in the model. The second, known as inverse stochastic resonance (ISR), occurs in tonically firing neurons when small levels of noise inhibit tonic firing and replace it with burstlike dynamics. ..." Preprint available at http://arxiv.org/abs/1311.2643 | ||

98. | Sloppy morphological tuning in identified neurons of the crustacean STG (Otopalik et al 2017) | |

" ...Theoretical studies suggest that morphology is tightly tuned to minimize wiring and conduction delay of synaptic events. We utilize high-resolution confocal microscopy and custom computational tools to characterize the morphologies of four neuron types in the stomatogastric ganglion (STG) of the crab Cancer borealis. Macroscopic branching patterns and fine cable properties are variable within and across neuron types. We compare these neuronal structures to synthetic minimal spanning neurite trees constrained by a wiring cost equation and find that STG neurons do not adhere to prevailing hypotheses regarding wiring optimization principles. In this highly-modulated and oscillating circuit, neuronal structures appear to be governed by a space-filling mechanism that outweighs the cost of inefficient wiring." | ||
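The minimal spanning trees that the quoted study compares against can be generated with Prim's algorithm, which grows a tree from the soma while greedily minimizing added cable length. The sketch below covers only the pure wire-length cost; the paper's cost equation also weighs conduction delay (path distance to the soma), which is omitted here:

```python
import numpy as np

def minimal_wiring_tree(points, root=0):
    """Prim's algorithm over 3-D points: returns parent[i], the node each
    point attaches to, spanning all points from `root` (the soma) with
    greedily minimal total cable length."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    in_tree = np.zeros(n, dtype=bool)
    in_tree[root] = True
    parent = np.full(n, -1)
    best = dist[root].copy()       # cheapest connection of each node to the tree
    best_from = np.full(n, root)   # which tree node that connection uses
    for _ in range(n - 1):
        j = int(np.argmin(np.where(in_tree, np.inf, best)))
        in_tree[j] = True
        parent[j] = best_from[j]
        closer = dist[j] < best    # does the new node offer a cheaper attachment?
        best = np.where(closer, dist[j], best)
        best_from = np.where(closer, j, best_from)
    return parent
```

Comparing total cable length of such synthetic trees against the traced neurites is the kind of test under which the STG neurons were found not to be wiring-optimized.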

99. | Smoothing of, and parameter estimation from, noisy biophysical recordings (Huys & Paninski 2009) | |

" ... Sequential Monte Carlo (“particle filtering”) methods, in combination with a detailed biophysical description of a cell, are used for principled, model-based smoothing of noisy recording data. We also provide an alternative formulation of smoothing where the neural nonlinearities are estimated in a non-parametric manner. Biophysically important parameters of detailed models (such as channel densities, intercompartmental conductances, input resistances, and observation noise) are inferred automatically from noisy data via expectation-maximisation. ..." | ||
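Huys and Paninski apply sequential Monte Carlo to detailed biophysical models; the machinery is easiest to see on a toy problem. Below is a minimal bootstrap particle filter for a scalar random-walk state observed in Gaussian noise (the model, parameter values, and function name are illustrative; full smoothing would add a backward pass over the stored particles):

```python
import numpy as np

def bootstrap_filter(obs, n_particles=500, proc_sd=0.1, obs_sd=0.5, seed=0):
    """Bootstrap particle filter for x_t = x_{t-1} + N(0, proc_sd^2),
    y_t = x_t + N(0, obs_sd^2). Returns the filtered posterior means."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 1.0, n_particles)  # prior sample for x_0
    means = []
    for y in obs:
        # propagate every particle through the random-walk dynamics
        particles = particles + rng.normal(0.0, proc_sd, n_particles)
        # weight by the Gaussian observation likelihood p(y | x)
        w = np.exp(-0.5 * ((y - particles) / obs_sd) ** 2)
        w /= w.sum()
        means.append(np.sum(w * particles))
        # multinomial resampling to avoid weight degeneracy
        particles = rng.choice(particles, size=n_particles, p=w)
    return np.array(means)
```

In the biophysical setting, the scalar state is replaced by the compartmental model's voltages and gating variables, and the unknown channel densities become parameters updated in the expectation-maximisation loop.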

100. | Software (called Optimizer) for fitting neuronal models (Friedrich et al. 2014) | |

" ... Our software, called Optimizer, implements a modular and extensible framework for the optimization of neuronal models, and also features a graphical interface which makes it easy for even non-expert users to handle many commonly occurring scenarios. ..." | ||

101. | Spatial gridding and temporal accuracy in NEURON (Hines and Carnevale 2001) | |

A heuristic for compartmentalization based on the space constant at 100 Hz (the d_lambda rule) is proposed. The paper also discusses spatial and temporal accuracy and the use of CVODE. | ||
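The heuristic sets each section's number of segments so that no segment is longer than a fraction d_lambda (typically 0.1) of the AC length constant at 100 Hz. A Python transcription of the formulas used in NEURON's standard library (diameter in µm, Ra in Ω·cm, cm in µF/cm²):

```python
import math

def lambda_f(f_hz, diam_um, ra_ohm_cm, cm_uf_cm2):
    """AC length constant in microns at frequency f_hz
    (NEURON's lambda_f from stdlib.hoc)."""
    return 1e5 * math.sqrt(diam_um / (4 * math.pi * f_hz * ra_ohm_cm * cm_uf_cm2))

def nseg_d_lambda(length_um, diam_um, ra_ohm_cm=100.0, cm_uf_cm2=1.0, d_lambda=0.1):
    """Smallest odd nseg such that each segment is at most
    d_lambda * lambda_f(100 Hz) long (rounded as in NEURON's d_lambda rule)."""
    lam = lambda_f(100.0, diam_um, ra_ohm_cm, cm_uf_cm2)
    return int((length_um / (d_lambda * lam) + 0.9) / 2) * 2 + 1
```

For a 1 µm dendrite with default Ra and cm, lambda_f(100 Hz) is about 282 µm, so a 500 µm section gets 19 segments; nseg is kept odd so that a node always sits at the section midpoint.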

102. | Spectral method and high-order finite differences for nonlinear cable (Omurtag and Lytton 2010) | |

We use high-order approximation schemes for the space derivatives in the nonlinear cable equation and investigate the behavior of numerical solution errors by using exact solutions, where available, and grid convergence. The space derivatives are numerically approximated by means of differentiation matrices. A flexible form for the injected current is used that can be adjusted smoothly from a very broad to a narrow peak, which leads, for the passive cable, to a simple, exact solution. We provide comparisons with exact solutions in an unbranched passive cable, the convergence of solutions with progressive refinement of the grid in an active cable, and the simulation of spike initiation in a biophysically realistic single-neuron model. | ||
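A differentiation matrix turns the spatial derivative in the cable equation into a single matrix–vector product, D2 @ v. The paper uses high-order and spectral variants; the sketch below builds only the familiar second-order centered-difference version (with implicit zero Dirichlet boundaries) to show the matrix form:

```python
import numpy as np

def d2_matrix(n, dx):
    """Dense differentiation matrix approximating d^2/dx^2 at n interior
    grid points with spacing dx, centered differences, zero boundary values."""
    main = np.diag(-2.0 * np.ones(n))
    off = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    return (main + off) / dx**2
```

Applied to sin(x) on (0, π), where the boundary values vanish, the product D2 @ sin(x) approximates -sin(x) with O(dx²) error; the higher-order matrices in the paper shrink this error much faster under grid refinement.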

103. | Spike exchange methods for a Blue Gene/P supercomputer (Hines et al., 2011) | |

Tests several spike exchange methods on a Blue Gene/P supercomputer with up to 64K cores. | ||

104. | Structure-dynamics relationships in bursting neuronal networks revealed (Mäki-Marttunen et al. 2013) | |

This entry includes tools for generating and analyzing network structure, and for running neuronal network simulations on the generated networks. | ||

105. | The cannula artifact (Chandler & Hodgkin 1965) | |

Chandler and Hodgkin (1965) describe how use of a high-impedance electrode can lead to squid axon recordings that appear to overshoot the sodium reversal potential, thereby resolving recordings that were controversial at the time. | ||

106. | Theta phase precession in a model CA3 place cell (Baker and Olds 2007) | |

"... The present study concerns a neurobiologically based computational model of the emergence of theta phase precession in which the responses of a single model CA3 pyramidal cell are examined in the context of stimulation by realistic afferent spike trains including those of place cells in entorhinal cortex, dentate gyrus, and other CA3 pyramidal cells. Spike-timing dependent plasticity in the model CA3 pyramidal cell leads to a spatially correlated associational synaptic drive that subsequently creates a spatially asymmetric expansion of the model cell’s place field. ... Through selective manipulations of the model it is possible to decompose theta phase precession in CA3 into the separate contributing factors of inheritance from upstream afferents in the dentate gyrus and entorhinal cortex, the interaction of synaptically controlled increasing afferent drive with phasic inhibition, and the theta phase difference between dentate gyrus granule cell and CA3 pyramidal cell activity." | ||

107. | Translating network models to parallel hardware in NEURON (Hines and Carnevale 2008) | |

Shows how to move a working network model written in NEURON from a serial processor to a parallel machine in such a way that it produces numerically identical results on serial and parallel hardware. | ||

108. | Vectorized algorithms for spiking neural network simulation (Brette and Goodman 2011) | |

"... We describe a set of algorithms to simulate large spiking neural networks efficiently with high-level languages using vector-based operations. These algorithms constitute the core of Brian, a spiking neural network simulator written in the Python language. Vectorized simulation makes it possible to combine the flexibility of high-level languages with the computational efficiency usually associated with compiled languages." | ||
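The core idea, state update, threshold test, and reset applied to whole arrays at once, can be shown in a few lines of NumPy. This fragment is a generic leaky integrate-and-fire step written for illustration, not Brian's actual implementation; the weight convention W[i, j] = synapse from neuron j to neuron i is an assumption of this sketch:

```python
import numpy as np

def step_lif(v, spiked_in, W, dt=0.1, tau=10.0,
             v_rest=0.0, v_thresh=1.0, v_reset=0.0, i_ext=0.0):
    """One vectorized update of N leaky integrate-and-fire neurons.
    v: membrane potentials (N,); spiked_in: boolean spikes from the last
    step (N,); W: weights (N, N) with W[i, j] the synapse from j to i."""
    i_syn = W @ spiked_in.astype(float)   # deliver last step's spikes in one matrix product
    v = v + dt / tau * (v_rest - v) + i_syn + i_ext
    spiked = v >= v_thresh                # threshold test for all neurons at once
    v = np.where(spiked, v_reset, v)      # vectorized reset of the neurons that fired
    return v, spiked
```

Every operation is an array expression, so the per-time-step interpreter overhead is constant in the number of neurons, which is exactly the property the quoted abstract exploits.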

109. | Voltage and light-sensitive Channelrhodopsin-2 model (ChR2) (Williams et al. 2013) | |

" ... Focusing on one of the most widely used ChR2 mutants (H134R) with enhanced current, we collected a comprehensive experimental data set of the response of this ion channel to different irradiances and voltages, and used these data to develop a model of ChR2 with empirically-derived voltage- and irradiance- dependence, where parameters were fine-tuned via simulated annealing optimization. This ChR2 model offers: 1) accurate inward rectification in the current-voltage response across irradiances; 2) empirically-derived voltage- and light-dependent kinetics (activation, deactivation and recovery from inactivation); and 3) accurate amplitude and morphology of the response across voltage and irradiance settings. Temperature-scaling factors (Q10) were derived and model kinetics was adjusted to physiological temperatures. ... " |