Models that contain the Modeling Application: MATLAB

MATLAB integrates mathematical computing, visualization, and a powerful language to provide a flexible environment for technical computing. The open architecture makes it easy to use MATLAB and its companion products to explore data, create algorithms, and create custom tools that provide early insights and competitive advantages.
Models and descriptions
1.  A cardiac cell simulator (Puglisi and Bers 2001), applied to the QT interval (Busjahn et al 2004)
"LabHEART is an easy to use program that simulates the cardiac action potential, calcium transient and ionic currents. Key parameters such as ionic concentration, stimulus waveform and channel conductance can easily be changed by a click on an icon or dragging a slider. It is a powerfull tool for teaching and researching cardiac electrophysiology."
2.  A comparison of mathematical models of mood in bipolar disorder (Cochran et al. 2017)
" ... we evaluated existing models of mood in BP (Bipolar Disorder) (...) and two new models we proposed here (...). Each model makes different assumptions about mood dynamics. Our objective was to differentiate between models using only time courses of mood. ..."
3.  A CORF computational model of a simple cell that relies on LGN input (Azzopardi & Petkov 2012)
"... We propose a computational model that uses as afferent inputs the responses of model LGN cells with center-surround receptive fields (RFs) and we refer to it as a Combination of Receptive Fields (CORF) model. We use shifted gratings as test stimuli and simulated reverse correlation to explore the nature of the proposed model. We study its behavior regarding the effect of contrast on its response and orientation bandwidth as well as the effect of an orthogonal mask on the response to an optimally oriented stimulus. We also evaluate and compare the performances of the CORF and GF (Gabor Filter) models regarding contour detection, using two public data sets of images of natural scenes with associated contour ground truths. ... The proposed CORF model is more realistic than the GF model and is more effective in contour detection, which is assumed to be the primary biological role of simple cells."
4.  A detailed and fast model of extracellular recordings (Camunas-Mesa & Quiroga 2013)
"We present a novel method to generate realistic simulations of extracellular recordings. The simulations were obtained by superimposing the activity of neurons placed randomly in a cube of brain tissue. Detailed models of individual neurons were used to reproduce the extracellular action potentials of close-by neurons. ..."
5.  A dynamic model of the canine ventricular myocyte (Hund, Rudy 2004)
The Hund-Rudy dynamic (HRd) model is based on data from the canine epicardial ventricular myocyte. Rate-dependent phenomena associated with ion channel kinetics, action potential properties and Ca2+ handling are simulated by the model. See the paper for further details.
6.  A neural mass model of cross frequency coupling (Chehelcheraghi et al 2017)
"Electrophysiological signals of cortical activity show a range of possible frequency and amplitude modulations, both within and across regions, collectively known as cross-frequency coupling. To investigate whether these modulations could be considered as manifestations of the same underlying mechanism, we developed a neural mass model. The model provides five out of the theoretically proposed six different coupling types. ..."
7.  A neurocomputational model of classical conditioning phenomena (Moustafa et al. 2009)
"... Here, we show that the same information-processing function proposed for the hippocampal region in the Gluck and Myers (1993) model can also be implemented in a network without using the backpropagation algorithm. Instead, our newer instantiation of the theory uses only (a) Hebbian learning methods which match more closely with synaptic and associative learning mechanisms ascribed to the hippocampal region and (b) a more plausible representation of input stimuli. We demonstrate here that this new more biologically plausible model is able to simulate various behavioral effects, including latent inhibition, acquired equivalence, sensory preconditioning, negative patterning, and context shift effects. ..."
8.  A reinforcement learning example (Sutton and Barto 1998)
This MATLAB script demonstrates reinforcement learning functions guiding the movements of an agent (a black square) in a gridworld environment. See the comments at the top of the MATLAB script and the book for more details.
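For orientation, here is a minimal sketch of tabular Q-learning on a gridworld in MATLAB; the 5x5 grid, goal placement, and learning parameters are assumptions of this sketch, not the settings of the distributed script.

% A minimal tabular Q-learning sketch on a 5x5 gridworld (grid size, goal
% placement and parameters are illustrative assumptions).
rng(1);
nRows = 5; nCols = 5; nStates = nRows*nCols; nActions = 4;   % up,down,right,left
goal = nStates;                        % reward in the bottom-right corner
Q = zeros(nStates, nActions);
alpha = 0.1; gamma = 0.9; epsilon = 0.1;

for episode = 1:500
    s = 1;                             % start in the top-left corner
    while s ~= goal
        if rand < epsilon
            a = randi(nActions);       % explore
        else
            q = Q(s,:); best = find(q == max(q));
            a = best(randi(numel(best)));   % exploit, breaking ties at random
        end
        [r, c] = ind2sub([nRows nCols], s);
        switch a                       % move, staying inside the grid
            case 1, r = max(r-1, 1);
            case 2, r = min(r+1, nRows);
            case 3, c = min(c+1, nCols);
            case 4, c = max(c-1, 1);
        end
        sNext = sub2ind([nRows nCols], r, c);
        reward = double(sNext == goal);
        Q(s,a) = Q(s,a) + alpha*(reward + gamma*max(Q(sNext,:)) - Q(s,a));
        s = sNext;
    end
end
disp(reshape(max(Q,[],2), nRows, nCols))   % learned state values
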
9.  A simple integrative electrophysiological model of bursting GnRH neurons (Csercsik et al. 2011)
In this paper a modular model of the GnRH neuron is presented. For the sake of simplicity, the currents corresponding to fast time scales and action potential generation are described by an impulsive system, while the slower currents and calcium dynamics are described by ordinary differential equations (ODEs). The model is able to reproduce the depolarizing afterpotentials, afterhyperpolarization, periodic bursting behavior and the corresponding calcium transients observed in GnRH neurons.
10.  Activity constraints on stable neuronal or network parameters (Olypher and Calabrese 2007)
"In this study, we developed a general description of parameter combinations for which specified characteristics of neuronal or network activity are constant. Our approach is based on the implicit function theorem and is applicable to activity characteristics that smoothly depend on parameters. Such smoothness is often intrinsic to neuronal systems when they are in stable functional states. The conclusions about how parameters compensate each other, developed in this study, can thus be used even without regard to the specific mathematical model describing a particular neuron or neuronal network. ..."
11.  Analyzing neural time series data theory and practice (Cohen 2014)
"This book offers a comprehensive guide to the theory and practice of analyzing electrical brain signals. It explains the conceptual, mathematical, and implementational (via Matlab programming) aspects of time-, time-frequency- and synchronization-based analyses of magnetoencephalography (MEG), electroencephalography (EEG), and local field potential (LFP) recordings from humans and nonhuman animals."
12.  Auditory nerve spontaneous rate histograms (Jackson and Carney 2005)
Histograms of spontaneous rate estimates for auditory nerve fibers are well reproduced by models with two or three spontaneous rates and long-range dependence.
13.  Brain Dynamics Toolbox (Heitmann & Breakspear 2016, 2017, 2018)
"The Brain Dynamics Toolbox is open-source software for simulating dynamical systems in neuroscience. It is for researchers and students who wish to explore mathematical models of brain function using Matlab. It includes a graphical tool for simulating dynamical systems in real-time as well as command-line tools for scripting large-scale simulations."
14.  Cat auditory nerve model (Zilany and Bruce 2006, 2007)
"This paper presents a computational model to simulate normal and impaired auditory-nerve (AN) fiber responses in cats. The model responses match physiological data over a wider dynamic range than previous auditory models. This is achieved by providing two modes of basilar membrane excitation to the inner hair cell (IHC) rather than one. ... The model responses are consistent with a wide range of physiological data from both normal and impaired ears for stimuli presented at levels spanning the dynamic range of hearing."
15.  Cochlear implant models (Bruce et al. 1999a, b, c, 2000)
"In a recent set of modeling studies we have developed a stochastic threshold model of auditory nerve response to single biphasic electrical pulses (Bruce et al., 1999c) and moderate rate (less than 800 pulses per second) pulse trains (Bruce et al., 1999a). In this article we derive an analytical approximation for the single-pulse model, which is then extended to describe the pulse-train model in the case of evenly timed, uniform pulses. This renewal-process description provides an accurate and computationally efficient model of electrical stimulation of single auditory nerve fibers by a cochlear implant that may be extended to other forms of electrical neural stimulation."
16.  Continuous lateral oscillations as a mechanism for taxis in Drosophila larvae (Wystrach et al 2016)
" ...Our analysis of larvae motion reveals a rhythmic, continuous lateral oscillation of the anterior body, encompassing all head-sweeps, small or large, without breaking the oscillatory rhythm. Further, we show that an agent-model that embeds this hypothesis reproduces a surprising number of taxis signatures observed in larvae. Also, by coupling the sensory input to a neural oscillator in continuous time, we show that the mechanism is robust and biologically plausible. ..."
17.  Dynamics of sleep oscillations coupled to brain temperature on multiple scales (Csernai et al 2019)
"Every form of neural activity depends on temperature, yet its relationship to brain rhythms is poorly understood. In this work we examined how sleep spindles are influenced by changing brain temperatures and how brain temperature is influenced by sleep oscillations. We employed a novel thermoelectrode designed for measuring temperature while recording neural activity. We found that spindle frequency is positively correlated and duration negatively correlated with brain temperature. Local heating of the thalamus replicated the temperature dependence of spindle parameters in the heated area only, suggesting biophysical rather than global modulatory mechanisms, a finding also supported by a thalamic network model. Finally, we show that switches between oscillatory states also influence brain temperature on a shorter and smaller scale. Epochs of paradoxical sleep as well as the infra-slow oscillation were associated with brain temperature fluctuations below 0.2°C. Our results highlight that brain temperature is massively intertwined with sleep oscillations on various time scales."
18.  DynaSim: a MATLAB toolbox for neural modeling and simulation (Sherfey et al 2018)
"DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems...."
19.  Evaluation of stochastic diff. eq. approximation of ion channel gating models (Bruce 2009)
Fox and Lu derived an algorithm based on stochastic differential equations for approximating the kinetics of ion channel gating that is simpler and faster than "exact" algorithms for simulating Markov process models of channel gating. However, the approximation may not be sufficiently accurate to predict statistics of action potential generation in some cases. The objective of this study was to develop a framework for analyzing the inaccuracies and determining their origin. Simulations of a patch of membrane with voltage-gated sodium and potassium channels were performed using an exact algorithm for the kinetics of channel gating and the approximate algorithm of Fox & Lu. ... The results indicate that: (i) the source of the inaccuracy is that the Fox & Lu algorithm does not adequately describe the combined behavior of the multiple activation particles in each sodium and potassium channel, and (ii) the accuracy does not improve with increasing numbers of channels.
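For orientation, the sketch below applies an Euler-Maruyama step to a Fox and Lu style Langevin equation for a single potassium gating variable at a clamped voltage; the channel count, clamp voltage, and standard squid-axon rate functions are assumptions of the sketch, which stands in for the membrane-patch simulations compared in the study.

% Euler-Maruyama sketch of a Fox & Lu style stochastic approximation of a
% potassium gating variable at a clamped voltage (parameters assumed).
rng(2);
V  = -40;                                   % clamped membrane potential (mV)
NK = 1000;                                  % number of potassium channels
dt = 0.01; T = 50; t = 0:dt:T;              % time base (ms)
alpha_n = 0.01*(V+55)/(1 - exp(-(V+55)/10));% standard squid-axon rate constants
beta_n  = 0.125*exp(-(V+65)/80);

n = zeros(size(t)); n(1) = alpha_n/(alpha_n + beta_n)/2;  % start below steady state
for k = 1:numel(t)-1
    drift    = alpha_n*(1 - n(k)) - beta_n*n(k);
    noiseVar = (alpha_n*(1 - n(k)) + beta_n*n(k)) / NK;   % channel-noise variance
    n(k+1)   = n(k) + drift*dt + sqrt(max(noiseVar,0)*dt)*randn;
    n(k+1)   = min(max(n(k+1), 0), 1);      % keep the gate in [0,1]
end
plot(t, n.^4); xlabel('time (ms)'); ylabel('open fraction of K^+ channels (n^4)');
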
20.  Fixed point attractor (Hasselmo et al 1995)
"... In the model, cholinergic suppression of synaptic transmission at excitatory feedback synapses is shown to determine the extent to which activity depends upon new features of the afferent input versus components of previously stored representations. ..." See paper for more and details. The MATLAB script demonstrates the model of fixed point attractors mediated by excitatory feedback with subtractive inhibition in a continuous firing rate model.
21.  Gamma and theta rhythms in biophysical models of hippocampus circuits (Kopell et al. 2011)
" ... the main rhythms displayed by the hippocampus, the gamma (30–90 Hz) and theta (4–12 Hz) rhythms. We concentrate on modeling in vitro experiments, but with an eye toward possible in vivo implications. ... We use simpler biophysical models; all cells have a single compartment only, and the interneurons are restricted to two types: fast-spiking (FS) basket cells and oriens lacunosum-moleculare (O-LM) cells. ... , we aim not so much at reproducing dynamics in great detail, but at clarifying the essential mechanisms underlying the production of the rhythms and their interactions (Kopell, 2005). ..."
22.  Generating coherent patterns of activity from chaotic neural networks (Sussillo and Abbott 2009)
"Neural circuits display complex activity patterns both spontaneously and when responding to a stimulus or generating a motor output. How are these two forms of activity related? We develop a procedure called FORCE learning for modifying synaptic strengths either external to or within a model neural network to change chaotic spontaneous activity into a wide variety of desired activity patterns. ... Our results reproduce data on premovement activity in motor and premotor cortex, and suggest that synaptic plasticity may be a more rapid and powerful modulator of network activity than generally appreciated."
23.  High-Res. Recordings Using a Real-Time Computational Model of the Electrode (Brette et al. 2008)
"Intracellular recordings of neuronal membrane potential are a central tool in neurophysiology. ... We introduce a computer-aided technique, Active Electrode Compensation (AEC), based on a digital model of the electrode interfaced in real time with the electrophysiological setup. ... AEC should be particularly useful to characterize fast neuronal phenomena intracellularly in vivo."
24.  Hippocampal context-dependent retrieval (Hasselmo and Eichenbaum 2005)
"... The model simulates the context-sensitive firing properties of hippocampal neurons including trial-specific firing during spatial alternation and trial by trial changes in theta phase precession on a linear track. ..." See paper for more and details.
25.  Hodgkin–Huxley model with fractional gating (Teka et al. 2016)
We use fractional order derivatives to model the kinetic dynamics of the gate variables for the potassium and sodium conductances of the Hodgkin-Huxley model. Our results show that power-law dynamics of the different gate variables result in a wide range of action potential shapes and spiking patterns, even in the case where the model was stimulated with constant current. As a consequence, power-law behaving conductances result in an increase in the number of spiking patterns a neuron can generate and, we propose, expand the computational capacity of the neuron.
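As a worked illustration of the approach, the sketch below integrates a single fractional-order potassium gate at a clamped voltage with an explicit Grunwald-Letnikov scheme; the order, voltage, and rate functions are assumptions of the sketch, and the paper applies fractional dynamics to the gates of the full Hodgkin-Huxley model.

% Fractional-order gate variable integrated with an explicit
% Grunwald-Letnikov scheme at a clamped voltage (illustrative settings).
V = -40;                                      % clamped potential (mV)
alpha_n = 0.01*(V+55)/(1 - exp(-(V+55)/10));  % squid-axon potassium rate constants
beta_n  = 0.125*exp(-(V+65)/80);
f = @(n) alpha_n*(1 - n) - beta_n*n;          % classical first-order kinetics

eta = 0.6;                                    % fractional order (eta = 1 is classical)
dt = 0.05; T = 40; K = round(T/dt);
w = zeros(1, K+1); w(1) = 1;                  % Grunwald-Letnikov weights
for j = 1:K
    w(j+1) = w(j)*(1 - (eta+1)/j);
end

n = zeros(1, K+1); n(1) = 0.05;               % initial gate value
for k = 1:K
    history = w(2:k+1) * n(k:-1:1)';          % memory term over the whole past
    n(k+1) = dt^eta * f(n(k)) - history;
end
plot((0:K)*dt, n); xlabel('time (ms)'); ylabel('gate variable n');
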
26.  Human seizures couple across spatial scales through travelling wave dynamics (Martinet et al 2017)
" ... We show that during seizure large-scale neural populations spanning centimetres of cortex coordinate with small neural groups spanning cortical columns, and provide evidence that rapidly propagating waves of activity underlie this increased inter-scale coupling. We develop a corresponding computational model to propose specific mechanisms—namely, the effects of an increased extracellular potassium concentration diffusing in space—that support the observed spatiotemporal dynamics. Understanding the multi-scale, spatiotemporal dynamics of human seizures—and connecting these dynamics to specific biological mechanisms—promises new insights to treat this devastating disease.
27.  Implementation issues in approximate methods for stochastic Hodgkin-Huxley models (Bruce 2007)
Four different algorithms for implementing Hodgkin–Huxley models with stochastic sodium channels are compared: Strassberg and DeFelice (1993), Rubinstein (1995), Chow and White (1996), and Fox (1997).
28.  Inhibitory cells enable sparse coding in V1 model (King et al. 2013)
" ... Here we show that adding a separate population of inhibitory neurons to a spiking model of V1 provides conformance to Dale’s Law, proposes a computational role for at least one class of interneurons, and accounts for certain observed physiological properties in V1. ... "
29.  Integrate and fire model code for spike-based coincidence-detection (Heinz et al. 2001, others)
Model code relevant to three papers; two on level discrimination and one on masked detection at low frequencies.
30.  Logarithmic distributions prove that intrinsic learning is Hebbian (Scheler 2017)
"In this paper, we present data for the lognormal distributions of spike rates, synaptic weights and intrinsic excitability (gain) for neurons in various brain areas, such as auditory or visual cortex, hippocampus, cerebellum, striatum, midbrain nuclei. We find a remarkable consistency of heavy-tailed, specifically lognormal, distributions for rates, weights and gains in all brain areas examined. The difference between strongly recurrent and feed-forward connectivity (cortex vs. striatum and cerebellum), neurotransmitter (GABA (striatum) or glutamate (cortex)) or the level of activation (low in cortex, high in Purkinje cells and midbrain nuclei) turns out to be irrelevant for this feature. Logarithmic scale distribution of weights and gains appears to be a general, functional property in all cases analyzed. We then created a generic neural model to investigate adaptive learning rules that create and maintain lognormal distributions. We conclusively demonstrate that not only weights, but also intrinsic gains, need to have strong Hebbian learning in order to produce and maintain the experimentally attested distributions. This provides a solution to the long-standing question about the type of plasticity exhibited by intrinsic excitability."
31.  Long-term adaptation with power-law dynamics (Zilany et al. 2009)
"... A model of rate adaptation at the synapse between inner hair cells and auditory-nerve (AN) fibers that includes both exponential and power-law dynamics is presented here. Exponentially adapting components with rapid and short-term time constants, which are mainly responsible for shaping onset responses, are followed by two parallel paths with power-law adaptation that provide slowly and rapidly adapting responses. ... The proposed model is capable of accurately predicting several sets of AN data, including amplitude-modulation transfer functions, long-term adaptation, forward masking, and adaptation to increments and decrements in the amplitude of an ongoing stimulus."
32.  Loss of phase-locking in non-weakly coupled inhib. networks of type-I neurons (Oh and Matveev 2009)
"... Here we examine the loss of synchrony caused by an increase in inhibitory coupling in networks of type-I Morris–Lecar model oscillators, which is characterized by a period-doubling cascade and leads to mode-locked states with alternation in the firing order of the two cells, as reported recently by Maran and Canavier (J Comput Neurosci, 2008) for a network of Wang-Buzsáki model neurons. Although alternating-order firing has been previously reported as a near-synchronous state, we show that the stable phase difference between the spikes of the two Morris–Lecar cells can constitute as much as 70% of the unperturbed oscillation period. Further, we examine the generality of this phenomenon for a class of type-I oscillators that are close to their excitation thresholds, and provide an intuitive geometric description of such “leap-frog” dynamics. ..."
33.  Mathematics for Neuroscientists (Gabbiani and Cox 2010)
This textbook provides a good source for learning the mathematics relevant to computational neuroscience as well as the neuroscience itself. The 232 computer code examples from the book are available through the code link http://www.elsevierdirect.com/companions/9780123748829/pictures/code/index.html and on the page below, copied from the book's companion web site.
34.  MATLAB for brain and cognitive scientists (Cohen 2017)
" ... MATLAB for Brain and Cognitive Scientists takes readers from beginning to intermediate and advanced levels of MATLAB programming, helping them gain real expertise in applications that they will use in their work. The book offers a mix of instructive text and rigorous explanations of MATLAB code along with programming tips and tricks. The goal is to teach the reader how to program data analyses in neuroscience and psychology. Readers will learn not only how to but also how not to program, with examples of bad code that they are invited to correct or improve. Chapters end with exercises that test and develop the skills taught in each chapter. Interviews with neuroscientists and cognitive scientists who have made significant contributions to their field using MATLAB appear throughout the book. ..."
35.  Mature and young adult-born dentate granule cell models (T2N interface) (Beining et al. 2017)
"... Here, we present T2N, a powerful interface to control NEURON with Matlab and TREES toolbox, which supports generating models stable over a broad range of reconstructed and synthetic morphologies. We illustrate this for a novel, highly-detailed active model of dentate granule cells (GCs) replicating a wide palette of experiments from various labs. By implementing known differences in ion channel composition and morphology, our model reproduces data from mouse or rat, mature or adult-born GCs as well as pharmacological interventions and epileptic conditions. ... T2N is suitable for creating robust models useful for large-scale networks that could lead to novel predictions. ..." See ModelDB accession number 231818 for the NEURON-only code.
36.  Method for counting motor units in mice (Major et al 2007)
"... Our goal was to develop an efficient method to determine the number of motor neurons making functional connections to muscle in a transgenic mouse model of amyotrophic lateral sclerosis (ALS). We developed a novel protocol for motor unit number estimation (MUNE) using incremental stimulation. The method involves analysis of twitch waveforms using a new software program, ITS-MUNE, designed for interactive calculation of motor unit number. The method was validated by testing simulated twitch data from a mathematical model of the neuromuscular system. Computer simulations followed the same stimulus-response protocol and produced waveform data that were indistinguishable from experiments. ... The ITS-MUNE analysis method has the potential to quantitatively measure the progression of motor neuron diseases and therefore the efficacy of treatments designed to alleviate pathologic processes of muscle denervation." The software is available for download under the "ITS-MUNE software" link at (see below for links)."
37.  Method of probabilistic principal surfaces (PPS) (Chang and Ghosh 2001)
Principal curves and surfaces are nonlinear generalizations of principal components and subspaces, respectively. They can provide an insightful summary of high-dimensional data not typically attainable by classical linear methods. See the paper for further details. The MATLAB code supplied at the authors' website calculates probabilistic principal surfaces on benchmark data sets.
38.  Microglial cytokine network (Anderson et al., 2015)
This is an ODE model of autocrine/paracrine microglial cytokine interactions. Simulations include analyses of neuroinflammation mechanisms in the context of adaptation and tolerance to LPS.
39.  Model of neural responses to amplitude-modulated tones (Nelson and Carney 2004)
"A phenomenological model with time-varying excitation and inhibition was developed to study possible neural mechanisms underlying changes in the representation of temporal envelopes along the auditory pathway. A modified version of an existing auditory-nerve model (Zhang et al., J. Acoust. Soc. Am. 109, 648–670 (2001) was used to provide inputs to higher hypothetical processing centers. Model responses were compared directly to published physiological data at three levels: the auditory nerve, ventral cochlear nucleus, and inferior colliculus. ..."
40.  Modeling conductivity profiles in the deep neocortical pyramidal neuron (Wang K et al. 2013)
"With the rapid increase in the number of technologies aimed at observing electric activity inside the brain, scientists have felt the urge to create proper links between intracellular- and extracellular-based experimental approaches. Biophysical models at both physical scales have been formalized under assumptions that impede the creation of such links. In this work, we address this issue by proposing amulticompartment model that allows the introduction of complex extracellular and intracellular resistivity profiles. This model accounts for the geometrical and electrotonic properties of any type of neuron through the combination of four devices: the integrator, the propagator, the 3D connector, and the collector. ..."
41.  Models analysis for auditory-nerve synapse (Zhang and Carney 2005)
"A general mathematical approach was proposed to study phenomenological models of the inner-hair-cell and auditory-nerve (AN) synapse complex. Two models (Meddis, 1986; Westerman and Smith, 1988) were studied using this unified approach. The responses of both models to a constant-intensity stimulus were described mathematically, and the relationship between model parameters and response characteristics was investigated. ...". The paper then modifies these to make a more physiologically realistic model.
42.  Models for diotic and dichotic detection (Davidson et al. 2009)
"Several psychophysical models for masked detection were evaluated using reproducible noises. The data were hit and false-alarm rates from three psychophysical studies of detection of 500-Hz tones in reproducible noise under diotic (N0S0) and dichotic (N0Spi) conditions with four stimulus bandwidths (50, 100, 115, and 2900 Hz). Diotic data were best predicted by an energy-based multiple-detector model that linearly combined stimulus energies at the outputs of several critical-band filters. The tone-plus-noise trials in the dichotic data were best predicted by models that linearly combined either the average values or the standard deviations of interaural time and level differences; however, these models offered no predictions for noise-alone responses. ..." The Breebaart et al. 2001 and Dau et al. 1996 models are supplied at the Carney lab web site.
43.  Multi-timescale adaptive threshold model (Kobayashi et al 2009)
" ... In this study, we devised a simple, fast computational model that can be tailored to any cortical neuron not only for reproducing but also for predicting a variety of spike responses to greatly fluctuating currents. The key features of this model are a multi-timescale adaptive threshold predictor and a nonresetting leaky integrator. This model is capable of reproducing a rich variety of neuronal spike responses, including regular spiking, intrinsic bursting, fast spiking, and chattering, by adjusting only three adaptive threshold parameters. ..."
44.  Multistability of clustered states in a globally inhibitory network (Chandrasekaran et al. 2009)
"We study a network of m identical excitatory cells projecting excitatory synaptic connections onto a single inhibitory interneuron, which is reciprocally coupled to all excitatory cells through inhibitory synapses possessing short-term synaptic depression. We find that such a network with global inhibition possesses multiple stable activity patterns with distinct periods, characterized by the clustering of the excitatory cells into synchronized sub-populations. We prove the existence and stability of n-cluster solutions in a m-cell network. ... Implications for temporal coding and memory storage are discussed."
45.  Network topologies for producing limited sustained activation (Kaiser and Hilgetag 2010)
Uses networks of cellular automata to test hypotheses about network topologies that can produce limited, sustained activity. Inspired by empirically-based ideas about neocortical architecture, but conceived and implemented at a level of abstraction that is not closely linked to empirical observations.
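A toy sketch of the question being asked, whether activity on a given topology dies out, persists at a limited level, or spreads through the whole network, is given below using a probabilistic threshold automaton on a random directed graph; the network size, density, and activation rule are illustrative assumptions, whereas the paper compares topologies inspired by neocortical architecture.

% Spreading activation of a simple probabilistic automaton on a random
% directed graph (size, density and rule are illustrative assumptions).
rng(6);
N = 200; p = 0.05;                          % nodes and connection probability
A = double(rand(N) < p); A(1:N+1:end) = 0;  % random directed graph, no self-loops
active = false(N,1); active(randperm(N,30)) = true;   % seed 30 active nodes

nActive = zeros(1,100);
for step = 1:100
    inputs = A*double(active);              % number of active inputs to each node
    fireProb = 1 - 0.88.^inputs;            % activation chance per active input
    active = rand(N,1) < fireProb;          % synchronous update of all nodes
    nActive(step) = sum(active);
end
plot(nActive); xlabel('time step'); ylabel('number of active nodes');
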
46.  Neural mass model of spindle generation in the isolated thalamus (Schellenberger Costa et al. 2016)
The model generates different oscillatory patterns in the thalamus, including delta and spindle band oscillations.
47.  Neural mass model of the neocortex under sleep regulation (Costa et al 2016)
This model generates typical human EEG patterns of sleep stages N2/N3 as well as wakefulness and REM. It further contains a sleep regulatory component that lets the model transition between those stages independently.
48.  Neural mass model of the sleeping thalamocortical system (Schellenberger Costa et al 2016)
This model generates typical human EEG patterns of sleep stages N2/N3 as well as wakefulness and REM sleep.
49.  Neural model of two-interval discrimination (Machens et al 2005)
Two-interval discrimination involves comparison of two stimuli that are presented at different times. It has three phases: loading, in which the first stimulus is perceived and stored in working memory; maintenance of working memory; decision making, in which the second stimulus is perceived and compared with the first. In behaving monkeys, each phase is associated with characteristic firing activity of neurons in the prefrontal cortex. This model implements both working memory and decision making with a mutual inhibition network that reproduces all three phases of two-interval discrimination. Machens, C.K., Romo, R., and Brody, C.D. Flexible control of mutual inhibition: a neural model of two-interval discrimination. Science 307:1121-1124, 2005.
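A minimal rate-model sketch in the spirit of this architecture is shown below: two populations inhibit each other while an external drive switches the network through loading, maintenance, and decision phases. All parameter values and the piecewise drive are illustrative assumptions, not the calibrated model of the paper.

% Two mutually inhibiting rate populations driven through loading,
% maintenance and decision phases (illustrative parameter values).
dt = 0.5; T = 3000; t = 0:dt:T;            % time base (ms)
tau = 20; w = 1.2;                         % time constant and inhibition strength
f1 = 0.6; f2 = 0.3;                        % first and second stimulus amplitudes
g = @(x) 1./(1 + exp(-10*(x - 0.5)));      % sigmoidal rate function

r = [0.5; 0.5]; rates = zeros(2, numel(t));
for k = 1:numel(t)
    if t(k) < 500                          % loading: stimulus 1 biases population 1
        drive = [0.5 + f1; 0.5 + (1 - f1)];
    elseif t(k) < 2000                     % maintenance: symmetric drive, memory persists
        drive = [1; 1];
    else                                   % decision: stimulus 2 biases population 2
        drive = [0.5 + (1 - f2); 0.5 + f2];
    end
    r = r + dt/tau*(-r + g(drive - w*flipud(r)));   % mutual inhibition
    rates(:,k) = r;
end
plot(t, rates); xlabel('time (ms)'); legend('population 1','population 2');
% Here population 1 remains the winner (reporting f1 > f2); a sufficiently
% large f2 tips the competition to population 2 during the decision phase.
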
50.  NeuroManager: a workflow analysis based simulation management engine (Stockton & Santamaria 2015)
"We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. ..."
51.  NN for proto-object based contour integration and figure-ground segregation (Hu & Niebur 2017)
"Visual processing of objects makes use of both feedforward and feedback streams of information. However, the nature of feedback signals is largely unknown, as is the identity of the neuronal populations in lower visual areas that receive them. Here, we develop a recurrent neural model to address these questions in the context of contour integration and figure-ground segregation. A key feature of our model is the use of grouping neurons whose activity represents tentative objects (“proto-objects”) based on the integration of local feature information. Grouping neurons receive input from an organized set of local feature neurons, and project modulatory feedback to those same neurons. ..."
52.  Nodose sensory neuron (Schild et al. 1994, Schild and Kunze 1997)
This is a Simulink implementation of the model described in the Schild et al. 1994 and Schild and Kunze 1997 papers on nodose sensory neurons. These papers describe the sensitivity of the models to their parameters and the match of the models to experimental data.
53.  Nonlinear neuronal computation based on physiologically plausible inputs (McFarland et al. 2013)
"... Here we present an approach for modeling sensory processing, termed the Nonlinear Input Model (NIM), which is based on the hypothesis that the dominant nonlinearities imposed by physiological mechanisms arise from rectification of a neuron’s inputs. Incorporating such ‘upstream nonlinearities’ within the standard linear-nonlinear (LN) cascade modeling structure implicitly allows for the identification of multiple stimulus features driving a neuron’s response, which become directly interpretable as either excitatory or inhibitory. Because its form is analogous to an integrate-and-fire neuron receiving excitatory and inhibitory inputs, model fitting can be guided by prior knowledge about the inputs to a given neuron, and elements of the resulting model can often result in specific physiological predictions. Furthermore, by providing an explicit probabilistic model with a relatively simple nonlinear structure, its parameters can be efficiently optimized and appropriately regularized. ... ”
54.  Phase-locking analysis with transcranial magneto-acoustical stimulation (Yuan et al 2017)
"Transcranial magneto-acoustical stimulation (TMAS) uses ultrasonic waves and a static magnetic field to generate electric current in nerve tissues for the purpose of modulating neuronal activities. It has the advantage of high spatial resolution and penetration depth. Neuronal firing rhythms carry and transmit nerve information in neural systems. In this study, we investigated the phase-locking characteristics of neuronal firing rhythms with TMAS based on the Hodgkin-Huxley neuron model. The simulation results indicate that the modulation frequency of ultrasound can affect the phase-locking behaviors. The results of this study may help us to explain the potential firing mechanism of TMAS."
55.  Polychronization: Computation With Spikes (Izhikevich 2005)
"We present a minimal spiking network that can polychronize, that is, exhibit reproducible time-locked but not synchronous firing patterns with millisecond precision, as in synfire braids. The network consists of cortical spiking neurons with axonal conduction delays and spiketiming- dependent plasticity (STDP); a ready-to-use MATLAB code is included. It exhibits sleeplike oscillations, gamma (40 Hz) rhythms, conversion of firing rates to spike timings, and other interesting regimes. ... To our surprise, the number of coexisting polychronous groups far exceeds the number of neurons in the network, resulting in an unprecedented memory capacity of the system. ..."
56.  Prefrontal cortical mechanisms for goal-directed behavior (Hasselmo 2005)
".. a model of prefrontal cortex function emphasizing the influence of goal-related activity on the choice of the next motor output. ... Different neocortical minicolumns represent distinct sensory input states and distinct motor output actions. The dynamics of each minicolumn include separate phases of encoding and retrieval. During encoding, strengthening of excitatory connections forms forward and reverse associations between each state, the following action, and a subsequent state, which may include reward. During retrieval, activity spreads from reward states throughout the network. The interaction of this spreading activity with a specific input state directs selection of the next appropriate action. Simulations demonstrate how these mechanisms can guide performance in a range of goal directed tasks, and provide a functional framework for some of the neuronal responses previously observed in the medial prefrontal cortex during performance of spatial memory tasks in rats."
57.  Prefrontal–striatal Parkinsons comp. model of multicue category learning (Moustafa and Gluck 2011)
"... In this model, PFC dopamine is key for attentional learning, whereas basal ganglia dopamine, consistent with other models, is key for reinforcement and motor learning. The model assumes that competitive dynamics among PFC neurons is the neural mechanism underlying stimulus selection with limited attentional resources, whereas competitive dynamics among striatal neurons is the neural mechanism underlying action selection. According to our model, PD is associated with decreased phasic and tonic dopamine levels in both PFC and basal ganglia. ..."
58.  Quantitative assessment of computational models for retinotopic map formation (Hjorth et al. 2015)
"Molecular and activity-based cues acting together are thought to guide retinal axons to their terminal sites in vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. ..."
59.  Reduction of nonlinear ODE systems possessing multiple scales (Clewley et al. 2005)
" ... We introduce a combined numerical and analytical technique that aids the identification of structure in a class of systems of nonlinear ordinary differential equations (ODEs) that are commonly applied in dynamical models of physical processes. ... These methods have been incorporated into a new software tool named Dssrt, which we demonstrate on a limit cycle of a synaptically driven Hodgkin–Huxley neuron model."
60.  Response properties of an integrate and fire model (Zhang and Carney 2005)
"A computational technique is described for calculation of the interspike interval and poststimulus time histograms for the responses of an integrate-and-fire model to arbitrary inputs. ... For stationary inputs, the regularity of the output was studied in detail for various model parameters. For nonstationary inputs, the effects of the model parameters on the output synchronization index were explored. ... these response properties have been reported for some cells in the ventral cochlear nucleus in the auditory brainstem. "
61.  Role of KCNQ1 and IKs in cardiac repolarization (Silva, Rudy 2005)
Detailed Markov models of IKs (the slow delayed rectifier K+ current) and its alpha-subunit KCNQ1 were developed. The model is compared to experiment in the paper. The role of IKs in disease and drug treatments (namely, the prevention of excessive action potential prolongation and the development of arrhythmogenic early afterdepolarizations) is elucidated. See the paper for further details.
62.  Squid axon (Hodgkin, Huxley 1952) (LabAXON)
The classic Hodgkin-Huxley (1952) model of the squid axon membrane, implemented in LabAXON.
63.  Stimulated and physiologically induced APs: frequency and fiber diameter (Sadashivaiah et al 2018)
"... In this study, we aim to quantify the effects of stimulation frequency and fiber diameter on AP (Action Potential) interactions involving collisions and loss of excitability. We constructed a mechanistic model of a myelinated nerve fiber receiving two inputs: the underlying physiological activity at the terminal end of the fiber, and an external stimulus applied to the middle of the fiber. We define conduction reliability as the percentage of physiological APs that make it to the somatic end of the nerve fiber. At low input frequencies, conduction reliability is greater than 95% and decreases with increasing frequency due to an increase in AP interactions. Conduction reliability is less sensitive to fiber diameter and only decreases slightly with increasing fiber diameter. Finally, both the number and type of AP interactions significantly vary with both input frequencies and fiber diameter. ..."
64.  Sympathetic neuron (Wheeler et al 2004)
This study shows how synaptic convergence and plasticity can interact to generate synaptic gain in autonomic ganglia and thereby enhance homeostatic control. Using a conductance-based computational model of an idealized sympathetic neuron, we simulated the postganglionic response to noisy patterns of presynaptic activity and found that a threefold amplification in postsynaptic spike output can arise in ganglia, depending on the number and strength of nicotinic synapses, the presynaptic firing rate, the extent of presynaptic facilitation, and the expression of muscarinic and peptidergic excitation. See references for details.
65.  Synaptic damage underlies EEG abnormalities in postanoxic encephalopathy (Ruijter et al 2017)
"... In postanoxic coma, EEG patterns indicate the severity of encephalopathy and typically evolve in time. We aim to improve the understanding of pathophysiological mechanisms underlying these EEG abnormalities. ... We used a mean field model comprising excitatory and inhibitory neurons, local synaptic connections, and input from thalamic afferents. Anoxic damage is modeled as aggravated short-term synaptic depression, with gradual recovery over many hours. Additionally, excitatory neurotransmission is potentiated, scaling with the severity of anoxic encephalopathy. Simulations were compared with continuous EEG recordings of 155 comatose patients after cardiac arrest. ..."
66.  The dynamics underlying pseudo-plateau bursting in a pituitary cell model (Teka et al. 2011)
" ... pseudo-plateau bursts, are unlike bursts studied mathematically in neurons (plateau bursting) and the standard fast-slow analysis used for plateau bursting is of limited use. Using an alternative fast-slow analysis, with one fast and two slow variables, we show that pseudo-plateau bursting is a canard-induced mixed mode oscillation. ..." See paper for other results.
67.  Two-cell inhibitory network bursting dynamics captured in a one-dimensional map (Matveev et al 2007)
" ... Here we describe a simple method that allows us to investigate the existence and stability of anti-phase bursting solutions in a network of two spiking neurons, each possessing a T-type calcium current and coupled by reciprocal inhibition. We derive a one-dimensional map which fully characterizes the genesis and regulation of anti-phase bursting arising from the interaction of the T-current properties with the properties of synaptic inhibition. ..."
68.  Voltage and light-sensitive Channelrhodopsin-2 model (ChR2) (Williams et al. 2013)
" ... Focusing on one of the most widely used ChR2 mutants (H134R) with enhanced current, we collected a comprehensive experimental data set of the response of this ion channel to different irradiances and voltages, and used these data to develop a model of ChR2 with empirically-derived voltage- and irradiance- dependence, where parameters were fine-tuned via simulated annealing optimization. This ChR2 model offers: 1) accurate inward rectification in the current-voltage response across irradiances; 2) empirically-derived voltage- and light-dependent kinetics (activation, deactivation and recovery from inactivation); and 3) accurate amplitude and morphology of the response across voltage and irradiance settings. Temperature-scaling factors (Q10) were derived and model kinetics was adjusted to physiological temperatures. ... "
