Models

1. A cardiac cell simulator (Puglisi and Bers 2001), applied to the QT interval (Busjahn et al 2004)

"LabHEART is an easy-to-use program that simulates the cardiac action potential, calcium transient and ionic currents. Key parameters such as ionic concentration, stimulus waveform and channel conductance can easily be changed by a click on an icon or by dragging a slider. It is a powerful tool for teaching and researching cardiac electrophysiology."

2. A comparison of mathematical models of mood in bipolar disorder (Cochran et al. 2017)

"... we evaluated existing models of mood in BP (Bipolar Disorder) (...) and two new models we proposed here (...). Each model makes different assumptions about mood dynamics. Our objective was to differentiate between models using only time courses of mood. ..."

3. A CORF computational model of a simple cell that relies on LGN input (Azzopardi & Petkov 2012)

"... We propose a computational model that uses as afferent inputs the responses of model LGN cells with center-surround receptive fields (RFs) and we refer to it as a Combination of Receptive Fields (CORF) model. We use shifted gratings as test stimuli and simulated reverse correlation to explore the nature of the proposed model. We study its behavior regarding the effect of contrast on its response and orientation bandwidth as well as the effect of an orthogonal mask on the response to an optimally oriented stimulus. We also evaluate and compare the performances of the CORF and GF (Gabor Filter) models regarding contour detection, using two public data sets of images of natural scenes with associated contour ground truths. ... The proposed CORF model is more realistic than the GF model and is more effective in contour detection, which is assumed to be the primary biological role of simple cells."

4. A detailed and fast model of extracellular recordings (Camunas-Mesa & Quiroga 2013)

"We present a novel method to generate realistic simulations of extracellular recordings. The simulations were obtained by superimposing the activity of neurons placed randomly in a cube of brain tissue. Detailed models of individual neurons were used to reproduce the extracellular action potentials of close-by neurons. ..."

5. A dynamic model of the canine ventricular myocyte (Hund, Rudy 2004)

The Hund-Rudy dynamic (HRd) model is based on data from the canine epicardial ventricular myocyte. Rate-dependent phenomena associated with ion channel kinetics, action potential properties and Ca2+ handling are simulated by the model. See the paper for further details.

6. A gap junction network of Amacrine Cells controls Nitric Oxide release (Jacoby et al 2018)

"... The effects of the neuromodulator nitric oxide (NO) have been studied in many circuits, including in the vertebrate retina, where it regulates synaptic release, gap junction coupling, and blood vessel dilation, but little is known about the cells that release NO. We show that a single type of amacrine cell (AC) controls NO release in the inner retina, and we report its light responses, electrical properties, and calcium dynamics. We discover that this AC forms a dense gap junction network and that the strength of electrical coupling in the network is regulated by light through NO. A model of the network offers insights into the biophysical specializations leading to auto-regulation of NO release within the network."

7. A model for focal seizure onset, propagation, evolution, and progression (Liou et al 2020)

We developed a neural network model that can account for major elements common to human focal seizures. These include the tonic-clonic transition, slow advance of clinical semiology and corresponding seizure territory expansion, widespread EEG synchronization, and slowing of the ictal rhythm as the seizure approaches termination. These were reproduced by incorporating usage-dependent exhaustion of inhibition in an adaptive neural network that receives global feedback inhibition in addition to local recurrent projections. Our model proposes mechanisms that may underlie common EEG seizure onset patterns and status epilepticus, and postulates a role for synaptic plasticity in the emergence of epileptic foci. Complex patterns of seizure activity and bistable seizure end-points arise when stochastic noise is included. With the rapid advancement of clinical and experimental tools, we believe that this model can provide a roadmap and potentially an in silico testbed for future explorations of seizure mechanisms and clinical therapies.

8. A neural mass model for critical assessment of brain connectivity (Ursino et al 2020)

We use a neural mass model of interconnected regions of interest (ROIs) to simulate reliable neuroelectrical signals in the cortex. In particular, signals simulating mean field potentials were generated assuming two, three or four ROIs, connected via excitatory or bisynaptic inhibitory links. We then investigated whether bivariate Transfer Entropy (TE) can be used to detect a statistically significant connection from data (as in binary 0/1 networks), and whether connection strength can be quantified (i.e., whether a linear relationship holds between TE and connection strength). Results suggest that TE can reliably estimate the strength of connectivity if neural populations work in their linear regions. However, nonlinear phenomena dramatically affect the assessment of connectivity, since they may significantly reduce TE estimates. Software included here allows the simulation of neural mass models with a variable number of ROIs and connections, the estimation of TE using the free package TRENTOOL, and the creation of figures comparing true connectivity with estimated values.
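The directionality test described above can be illustrated with a toy plug-in estimator of bivariate transfer entropy (history length 1, coarse binning). This is a schematic Python stand-in, not the included software or the TRENTOOL estimator; the signals, bin count, and coupling strength are all invented for illustration.

```python
import numpy as np

def transfer_entropy(x, y, bins=2):
    """Plug-in estimate of bivariate TE(x -> y) in bits, history length 1:
    TE = I(y_{t+1} ; x_t | y_t), with signals discretized at quantiles."""
    xd = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
    yd = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
    yf, yp, xp = yd[1:], yd[:-1], xd[:-1]      # future y, past y, past x
    te = 0.0
    for a in range(bins):
        for b in range(bins):
            for c in range(bins):
                p_abc = np.mean((yf == a) & (yp == b) & (xp == c))
                if p_abc == 0:
                    continue
                p_bc = np.mean((yp == b) & (xp == c))
                p_ab = np.mean((yf == a) & (yp == b))
                p_b = np.mean(yp == b)
                te += p_abc * np.log2(p_abc * p_b / (p_bc * p_ab))
    return te

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = np.zeros_like(x)
for t in range(len(x) - 1):            # y is driven by x with a one-step lag
    y[t + 1] = 0.8 * x[t] + 0.2 * rng.normal()

# Coupling is detected in the x -> y direction only
print(transfer_entropy(x, y) > transfer_entropy(y, x))
```

With the strong linear coupling used here, the estimator recovers the true direction; as the abstract notes, nonlinear coupling regimes are where such estimates degrade.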

9. A neural mass model of cross frequency coupling (Chehelcheraghi et al 2017)

"Electrophysiological signals of cortical activity show a range of possible frequency and amplitude modulations, both within and across regions, collectively known as cross-frequency coupling. To investigate whether these modulations could be considered as manifestations of the same underlying mechanism, we developed a neural mass model. The model provides five out of the theoretically proposed six different coupling types. ..."

10. A neurocomputational model of classical conditioning phenomena (Moustafa et al. 2009)

"... Here, we show that the same information-processing function proposed for the hippocampal region in the Gluck and Myers (1993) model can also be implemented in a network without using the backpropagation algorithm. Instead, our newer instantiation of the theory uses only (a) Hebbian learning methods which match more closely with synaptic and associative learning mechanisms ascribed to the hippocampal region and (b) a more plausible representation of input stimuli. We demonstrate here that this new more biologically plausible model is able to simulate various behavioral effects, including latent inhibition, acquired equivalence, sensory preconditioning, negative patterning, and context shift effects. ..."

11. A reinforcement learning example (Sutton and Barto 1998)

This MATLAB script demonstrates reinforcement learning functions guiding the movements of an agent (a black square) in a gridworld environment. See the header of the MATLAB script and the book for more details.
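A gridworld demo of this kind can be sketched compactly. The following Python toy is not the MATLAB script itself; the grid size, rewards, and learning parameters are invented for illustration. It shows tabular Q-learning guiding an agent from a start corner to a goal corner.

```python
import numpy as np

# Tabular Q-learning on a 4x4 gridworld with the goal in one corner.
N, GOAL = 4, (3, 3)
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]       # up, down, left, right
alpha, gamma, eps = 0.5, 0.9, 0.1
rng = np.random.default_rng(1)
Q = np.zeros((N, N, len(ACTIONS)))

def step(s, a):
    r, c = s[0] + ACTIONS[a][0], s[1] + ACTIONS[a][1]
    s2 = (min(max(r, 0), N - 1), min(max(c, 0), N - 1))  # walls clip moves
    return s2, (1.0 if s2 == GOAL else -0.04)            # small step cost

for _ in range(500):                                     # training episodes
    s = (0, 0)
    while s != GOAL:
        a = rng.integers(4) if rng.random() < eps else int(np.argmax(Q[s]))
        s2, r = step(s, a)
        Q[s][a] += alpha * (r + gamma * np.max(Q[s2]) - Q[s][a])
        s = s2

# Greedy rollout after learning: the agent walks from (0,0) to the goal.
s, path = (0, 0), [(0, 0)]
while s != GOAL and len(path) < 20:
    s, _ = step(s, int(np.argmax(Q[s])))
    path.append(s)
print(path[-1])   # (3, 3)
```

The small negative step cost makes the greedy policy prefer short paths, so the rollout reaches the goal in the minimum six moves.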

12. A simple integrative electrophysiological model of bursting GnRH neurons (Csercsik et al. 2011)

In this paper a modular model of the GnRH neuron is presented. For the sake of simplicity, the currents corresponding to fast time scales and action potential generation are described by an impulsive system, while the slower currents and calcium dynamics are described by ordinary differential equations (ODEs). The model is able to reproduce the depolarizing afterpotentials, afterhyperpolarization, periodic bursting behavior and the corresponding calcium transients observed in the case of GnRH neurons.

13. Activity constraints on stable neuronal or network parameters (Olypher and Calabrese 2007)

"In this study, we developed a general description of parameter combinations for which specified characteristics of neuronal or network activity are constant. Our approach is based on the implicit function theorem and is applicable to activity characteristics that smoothly depend on parameters. Such smoothness is often intrinsic to neuronal systems when they are in stable functional states. The conclusions about how parameters compensate each other, developed in this study, can thus be used even without regard to the specific mathematical model describing a particular neuron or neuronal network. ..."

14. Analyzing neural time series data: theory and practice (Cohen 2014)

"This book offers a comprehensive guide to the theory and practice of analyzing electrical brain signals. It explains the conceptual, mathematical, and implementational (via Matlab programming) aspects of time-, time-frequency- and synchronization-based analyses of magnetoencephalography (MEG), electroencephalography (EEG), and local field potential (LFP) recordings from humans and nonhuman animals."

15. Auditory nerve spontaneous rate histograms (Jackson and Carney 2005)

Histograms of spontaneous rate estimates of auditory-nerve fibers are well reproduced by models with two or three spontaneous rates and long-range dependence.

16. Binocular energy model set for binocular neurons in optic lobe of praying mantis (Rosner et al 2019)

This is a version of the binocular energy model with parameters chosen to reproduce individual cells in the praying mantis optic lobe. The receptive fields are very coarsely sampled (only 6 different horizontal locations) to match the coarse sampling of the data, given very limited recording time.

17. Brain Dynamics Toolbox (Heitmann & Breakspear 2016, 2017, 2018)

"The Brain Dynamics Toolbox is open-source software for simulating dynamical systems in neuroscience. It is for researchers and students who wish to explore mathematical models of brain function using Matlab. It includes a graphical tool for simulating dynamical systems in real-time as well as command-line tools for scripting large-scale simulations."

18. Cat auditory nerve model (Zilany and Bruce 2006, 2007)

"This paper presents a computational model to simulate normal and impaired auditory-nerve (AN) fiber responses in cats. The model responses match physiological data over a wider dynamic range than previous auditory models. This is achieved by providing two modes of basilar membrane excitation to the inner hair cell (IHC) rather than one. ... The model responses are consistent with a wide range of physiological data from both normal and impaired ears for stimuli presented at levels spanning the dynamic range of hearing."

19. Cerebellar stellate cells: changes in threshold, latency and frequency of firing (Mitry et al 2020)

"Cerebellar stellate cells are inhibitory molecular interneurons that regulate the firing properties of Purkinje cells, the sole output of cerebellar cortex. Recent evidence suggests that these cells exhibit temporal increase in excitability during whole-cell patch-clamp configuration in a phenomenon termed runup. They also exhibit a non-monotonic first-spike latency profile as a function of the holding potential in response to a fixed step-current. In this study, we use modeling approaches to unravel the dynamics of runup and categorize the firing behavior of cerebellar stellate cells as either type I or type II oscillators. We then extend this analysis to investigate how the non-monotonic latency profile manifests itself during runup. We employ a previously developed, but revised, Hodgkin–Huxley type model to show that stellate cells are indeed type I oscillators possessing a saddle node on an invariant cycle (SNIC) bifurcation. The SNIC in the model acts as a "threshold" for tonic firing and produces a slow region in the phase space called the ghost of the SNIC. The model reveals that (i) the SNIC gets left-shifted during runup with respect to I_app = I_test in the current-step protocol, and (ii) both the distance from the stable limit cycle along with the slow region produce the non-monotonic latency profile as a function of holding potential. Using the model, we elucidate how latency can be made arbitrarily large for a specific range of holding potentials close to the SNIC during pre-runup (post-runup). We also demonstrate that the model can produce transient single spikes in response to step-currents entirely below I_SNIC, and that a pair of dynamic inhibitory and excitatory post-synaptic inputs can robustly evoke action potentials, provided that the magnitude of the inhibition is either low or high but not intermediate. Our results show that the topology of the SNIC is the key to explaining such behaviors."

20. Cochlear implant models (Bruce et al. 1999a, b, c, 2000)

"In a recent set of modeling studies we have developed a stochastic threshold model of auditory nerve response to single biphasic electrical pulses (Bruce et al., 1999c) and moderate rate (less than 800 pulses per second) pulse trains (Bruce et al., 1999a). In this article we derive an analytical approximation for the single-pulse model, which is then extended to describe the pulse-train model in the case of evenly timed, uniform pulses. This renewal-process description provides an accurate and computationally efficient model of electrical stimulation of single auditory nerve fibers by a cochlear implant that may be extended to other forms of electrical neural stimulation."

21. Continuous lateral oscillations as a mechanism for taxis in Drosophila larvae (Wystrach et al 2016)

"... Our analysis of larvae motion reveals a rhythmic, continuous lateral oscillation of the anterior body, encompassing all head-sweeps, small or large, without breaking the oscillatory rhythm. Further, we show that an agent-model that embeds this hypothesis reproduces a surprising number of taxis signatures observed in larvae. Also, by coupling the sensory input to a neural oscillator in continuous time, we show that the mechanism is robust and biologically plausible. ..."

22. Disrupted information processing in Fmr1-KO mouse layer 4 barrel cortex (Domanski et al 2019)

"Sensory hypersensitivity is a common and debilitating feature of neurodevelopmental disorders such as Fragile X Syndrome (FXS). How developmental changes in neuronal function culminate in network dysfunction that underlies sensory hypersensitivities is unknown. By systematically studying cellular and synaptic properties of layer 4 neurons combined with cellular and network simulations, we explored how the array of phenotypes in Fmr1-knockout (KO) mice produce circuit pathology during development. We show that many of the cellular and synaptic pathologies in Fmr1-KO mice are antagonistic, mitigating circuit dysfunction, and hence may be compensatory to the primary pathology. Overall, the layer 4 network in the Fmr1-KO exhibits significant alterations in spike output in response to thalamocortical input and distorted sensory encoding. This developmental loss of layer 4 sensory encoding precision would contribute to subsequent developmental alterations in layer 4-to-layer 2/3 connectivity and plasticity observed in Fmr1-KO mice, and circuit dysfunction underlying sensory hypersensitivity."

23. Dynamics of sleep oscillations coupled to brain temperature on multiple scales (Csernai et al 2019)

"Every form of neural activity depends on temperature, yet its relationship to brain rhythms is poorly understood. In this work we examined how sleep spindles are influenced by changing brain temperatures and how brain temperature is influenced by sleep oscillations. We employed a novel thermoelectrode designed for measuring temperature while recording neural activity. We found that spindle frequency is positively correlated and duration negatively correlated with brain temperature. Local heating of the thalamus replicated the temperature dependence of spindle parameters in the heated area only, suggesting biophysical rather than global modulatory mechanisms, a finding also supported by a thalamic network model. Finally, we show that switches between oscillatory states also influence brain temperature on a shorter and smaller scale. Epochs of paradoxical sleep as well as the infra-slow oscillation were associated with brain temperature fluctuations below 0.2°C. Our results highlight that brain temperature is massively intertwined with sleep oscillations on various time scales."

24. DynaSim: a MATLAB toolbox for neural modeling and simulation (Sherfey et al 2018)

"DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems...."

25. Evaluation of stochastic diff. eq. approximation of ion channel gating models (Bruce 2009)

Fox and Lu derived an algorithm based on stochastic differential equations for approximating the kinetics of ion channel gating that is simpler and faster than "exact" algorithms for simulating Markov process models of channel gating. However, the approximation may not be sufficiently accurate to predict statistics of action potential generation in some cases. The objective of this study was to develop a framework for analyzing the inaccuracies and determining their origin. Simulations of a patch of membrane with voltage-gated sodium and potassium channels were performed using an exact algorithm for the kinetics of channel gating and the approximate algorithm of Fox & Lu. ... The results indicate that: (i) the source of the inaccuracy is that the Fox & Lu algorithm does not adequately describe the combined behavior of the multiple activation particles in each sodium and potassium channel, and (ii) the accuracy does not improve with increasing numbers of channels.
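The algorithm under evaluation can be written in a few lines. The Python sketch below integrates the Fox & Lu Langevin approximation for the Hodgkin-Huxley K+ activation gate n at a clamped voltage; the voltage, channel count, and integration settings are illustrative assumptions, not the study's simulation code.

```python
import numpy as np

# Fox & Lu (1997)-style Langevin approximation for the K+ activation gate n:
# dn = [alpha*(1-n) - beta*n] dt + sqrt((alpha*(1-n) + beta*n)/N) dW
V = 20.0                                                  # clamped voltage (mV)
alpha = 0.01 * (10 - V) / (np.exp((10 - V) / 10) - 1)     # HH alpha_n (1/ms)
beta = 0.125 * np.exp(-V / 80)                            # HH beta_n (1/ms)
N_chan, dt, T = 1000, 0.01, 200.0                         # channels, ms
rng = np.random.default_rng(2)

n = alpha / (alpha + beta)                                # start at steady state
trace = []
for _ in range(int(T / dt)):
    drift = alpha * (1 - n) - beta * n
    diff = np.sqrt((alpha * (1 - n) + beta * n) / N_chan)  # finite-size noise
    n += drift * dt + diff * np.sqrt(dt) * rng.normal()
    n = min(max(n, 0.0), 1.0)                             # keep gate in [0, 1]
    trace.append(n)

n_inf = alpha / (alpha + beta)
print(abs(np.mean(trace) - n_inf) < 0.05)   # True: fluctuates around n_inf
```

The study's point is that matching this single-gate mean behavior is not enough: the approximation misses the combined statistics of the multiple activation particles per channel, which is what distorts action potential statistics.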

26. Fixed point attractor (Hasselmo et al 1995)

"... In the model, cholinergic suppression of synaptic transmission at excitatory feedback synapses is shown to determine the extent to which activity depends upon new features of the afferent input versus components of previously stored representations. ..." See the paper for further details. The MATLAB script demonstrates the model of fixed-point attractors mediated by excitatory feedback with subtractive inhibition in a continuous firing-rate model.
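The attractor mechanism can be sketched minimally: Hebbian excitatory feedback plus subtractive global inhibition in a continuous firing-rate network. This Python toy is not the accompanying MATLAB script; the network size, stored pattern, and gains are invented for illustration.

```python
import numpy as np

# Continuous firing-rate network: excitatory feedback W, subtractive inhibition.
n = 20
pattern = np.zeros(n); pattern[:5] = 1.0      # one stored binary pattern
W = np.outer(pattern, pattern)                # Hebbian feedback weights
np.fill_diagonal(W, 0)

cue = np.zeros(n); cue[:3] = 0.5              # partial cue: 3 of the 5 units
rate = np.zeros(n)
dt, g_inh = 0.1, 0.8
for _ in range(500):
    drive = W @ rate + cue - g_inh * rate.sum()   # subtractive inhibition
    rate += dt * (-rate + np.maximum(drive, 0.0)) # rectified rate dynamics

# Pattern completion at the fixed point: all 5 stored units active, rest silent.
print(np.array_equal(rate > 0.05, pattern > 0.5))   # True
```

Feedback through W recruits the two uncued pattern units, while subtractive inhibition keeps non-pattern units below threshold, so activity settles into the stored representation rather than merely echoing the afferent cue.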

27. Gamma and theta rhythms in biophysical models of hippocampus circuits (Kopell et al. 2011)

"... the main rhythms displayed by the hippocampus, the gamma (30–90 Hz) and theta (4–12 Hz) rhythms. We concentrate on modeling in vitro experiments, but with an eye toward possible in vivo implications. ... We use simpler biophysical models; all cells have a single compartment only, and the interneurons are restricted to two types: fast-spiking (FS) basket cells and oriens lacunosum-moleculare (O-LM) cells. ..., we aim not so much at reproducing dynamics in great detail, but at clarifying the essential mechanisms underlying the production of the rhythms and their interactions (Kopell, 2005). ..."

28. Generating coherent patterns of activity from chaotic neural networks (Sussillo and Abbott 2009)

"Neural circuits display complex activity patterns both spontaneously and when responding to a stimulus or generating a motor output. How are these two forms of activity related? We develop a procedure called FORCE learning for modifying synaptic strengths either external to or within a model neural network to change chaotic spontaneous activity into a wide variety of desired activity patterns. ... Our results reproduce data on premovement activity in motor and premotor cortex, and suggest that synaptic plasticity may be a more rapid and powerful modulator of network activity than generally appreciated."
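The core of FORCE learning, recursive least squares applied online to the readout of a chaotic rate network whose output is fed back, can be sketched as follows. This Python toy uses an invented network size, gain, and target; it illustrates the procedure, not the paper's implementation.

```python
import numpy as np

# FORCE learning sketch: chaotic rate network, feedback of readout z,
# recursive least squares (RLS) on the readout weights w at every step.
rng = np.random.default_rng(4)
N, g, dt, tau = 300, 1.5, 0.1, 1.0
J = g * rng.normal(size=(N, N)) / np.sqrt(N)   # chaotic recurrent weights
w_f = rng.uniform(-1, 1, N)                    # fixed feedback weights
w = np.zeros(N)                                # readout weights (trained)
P = np.eye(N)                                  # RLS inverse correlation matrix
x = 0.5 * rng.normal(size=N)

T = 3000
target = np.sin(2 * np.pi * np.arange(T) * dt / 5.0)  # target: sine wave
errs = []
for t in range(T):
    r = np.tanh(x)
    z = w @ r                                  # network output
    x += dt / tau * (-x + J @ r + w_f * z)     # dynamics with output feedback
    Pr = P @ r                                 # RLS update of the readout:
    k = Pr / (1.0 + r @ Pr)
    P -= np.outer(k, Pr)
    e = z - target[t]
    w -= e * k                                 # push readout error toward zero
    errs.append(abs(e))

print(np.mean(errs[-500:]) < 0.2)   # True: late training error stays small
```

The signature of FORCE is visible here: because weights are corrected at every step, the output error is kept small throughout training rather than only at convergence, taming the chaotic spontaneous activity into the desired pattern.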

29. High-Res. Recordings Using a Real-Time Computational Model of the Electrode (Brette et al. 2008)

"Intracellular recordings of neuronal membrane potential are a central tool in neurophysiology. ... We introduce a computer-aided technique, Active Electrode Compensation (AEC), based on a digital model of the electrode interfaced in real time with the electrophysiological setup. ... AEC should be particularly useful to characterize fast neuronal phenomena intracellularly in vivo."

30. Hippocampal context-dependent retrieval (Hasselmo and Eichenbaum 2005)

"... The model simulates the context-sensitive firing properties of hippocampal neurons including trial-specific firing during spatial alternation and trial by trial changes in theta phase precession on a linear track. ..." See the paper for further details.

31. Hodgkin–Huxley model with fractional gating (Teka et al. 2016)

We use fractional order derivatives to model the kinetic dynamics of the gate variables for the potassium and sodium conductances of the Hodgkin-Huxley model. Our results show that power-law dynamics of the different gate variables result in a wide range of action potential shapes and spiking patterns, even in the case where the model was stimulated with constant current. As a consequence, power-law behaving conductances result in an increase in the number of spiking patterns a neuron can generate and, we propose, expand the computational capacity of the neuron.
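A fractional-order gate can be integrated with the Grünwald–Letnikov scheme, in which the update at each step carries a weighted memory of the whole past trajectory. The Python sketch below (illustrative rates and step sizes, not the paper's code) integrates a single gate obeying D^eta n = alpha(1-n) - beta*n at clamped voltage; with eta = 1 the scheme reduces exactly to forward Euler.

```python
import numpy as np

def simulate_gate(eta, alpha=0.5, beta=0.1, dt=0.01, steps=2000, n0=0.0):
    """Grunwald-Letnikov integration of D^eta n = alpha*(1-n) - beta*n.
    Memory weights: w_0 = 1, w_j = w_{j-1} * (1 - (eta + 1)/j)."""
    w = np.empty(steps + 1)
    w[0] = 1.0
    for j in range(1, steps + 1):
        w[j] = w[j - 1] * (1.0 - (eta + 1.0) / j)
    n = np.empty(steps + 1)
    n[0] = n0
    for k in range(1, steps + 1):
        f = alpha * (1.0 - n[k - 1]) - beta * n[k - 1]
        memory = np.dot(w[1:k + 1], n[k - 1::-1])   # weighted history term
        n[k] = dt ** eta * f - memory
    return n

n_classic = simulate_gate(eta=1.0)   # ordinary first-order kinetics (Euler)
n_frac = simulate_gate(eta=0.8)      # power-law (sub-exponential) kinetics
print(round(n_classic[-1], 2))       # 0.83, i.e. n_inf = alpha/(alpha+beta)
```

For eta = 1 the memory weights collapse to (1, -1, 0, 0, ...), recovering the classical exponential relaxation to n_inf; for eta < 1 the long memory tail produces the power-law approach that underlies the richer spiking repertoire described above.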

32. Human seizures couple across spatial scales through travelling wave dynamics (Martinet et al 2017)

"... We show that during seizure large-scale neural populations spanning centimetres of cortex coordinate with small neural groups spanning cortical columns, and provide evidence that rapidly propagating waves of activity underlie this increased inter-scale coupling. We develop a corresponding computational model to propose specific mechanisms—namely, the effects of an increased extracellular potassium concentration diffusing in space—that support the observed spatiotemporal dynamics. Understanding the multi-scale, spatiotemporal dynamics of human seizures—and connecting these dynamics to specific biological mechanisms—promises new insights to treat this devastating disease."

33. Implementation issues in approximate methods for stochastic Hodgkin-Huxley models (Bruce 2007)

Four algorithms for implementing Hodgkin–Huxley models with stochastic sodium channels are compared: Strassberg and DeFelice (1993), Rubinstein (1995), Chow and White (1996), and Fox (1997).

34. Inhibitory cells enable sparse coding in V1 model (King et al. 2013)

"... Here we show that adding a separate population of inhibitory neurons to a spiking model of V1 provides conformance to Dale's Law, proposes a computational role for at least one class of interneurons, and accounts for certain observed physiological properties in V1. ..."

35. Integrate and fire model code for spike-based coincidence-detection (Heinz et al. 2001, others)

Model code relevant to three papers: two on level discrimination and one on masked detection at low frequencies.

36. Levodopa-Induced Toxicity in Parkinson's Disease (Muddapu et al, 2022)

"... We present a systems-level computational model of SNc-striatum, which will help us understand the mechanism behind neurodegeneration postulated above and provide insights into developing disease-modifying therapeutics. It was observed that SNc terminals are more vulnerable to energy deficiency than SNc somas. During L-DOPA therapy, it was observed that higher L-DOPA dosage results in increased loss of terminals in SNc. It was also observed that co-administration of L-DOPA and glutathione (antioxidant) evades L-DOPA-induced toxicity in SNc neurons. Our proposed model of the SNc-striatum system is the first of its kind, where SNc neurons were modeled at a biophysical level, and striatal neurons were modeled at a spiking level. We show that our proposed model was able to capture L-DOPA-induced toxicity in SNc, caused by energy deficiency."

37. Logarithmic distributions prove that intrinsic learning is Hebbian (Scheler 2017)

"In this paper, we present data for the lognormal distributions of spike rates, synaptic weights and intrinsic excitability (gain) for neurons in various brain areas, such as auditory or visual cortex, hippocampus, cerebellum, striatum, midbrain nuclei. We find a remarkable consistency of heavy-tailed, specifically lognormal, distributions for rates, weights and gains in all brain areas examined. The difference between strongly recurrent and feed-forward connectivity (cortex vs. striatum and cerebellum), neurotransmitter (GABA (striatum) or glutamate (cortex)) or the level of activation (low in cortex, high in Purkinje cells and midbrain nuclei) turns out to be irrelevant for this feature. Logarithmic scale distribution of weights and gains appears to be a general, functional property in all cases analyzed. We then created a generic neural model to investigate adaptive learning rules that create and maintain lognormal distributions. We conclusively demonstrate that not only weights, but also intrinsic gains, need to have strong Hebbian learning in order to produce and maintain the experimentally attested distributions. This provides a solution to the long-standing question about the type of plasticity exhibited by intrinsic excitability."
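The statistical core of the argument, that multiplicative (Hebbian-like) updates produce lognormal distributions while additive updates do not, can be shown in a purely schematic simulation. This Python toy is not the paper's generic neural model; the update magnitudes and counts are invented for illustration.

```python
import numpy as np

# Multiplicative vs. additive plasticity acting on the same random
# "pre*post correlation" signal: only the multiplicative rule yields a
# heavy-tailed (lognormal) weight distribution.
rng = np.random.default_rng(5)
n_syn, n_steps = 5000, 400
w_mult = np.full(n_syn, 1.0)
w_add = np.full(n_syn, 1.0)
for _ in range(n_steps):
    hebb = 0.05 * rng.normal(size=n_syn)   # fluctuating Hebbian term
    w_mult *= np.exp(hebb)                 # update proportional to w
    w_add += hebb                          # update independent of w

def skew(v):
    v = v - v.mean()
    return np.mean(v ** 3) / np.mean(v ** 2) ** 1.5

# log of multiplicatively updated weights is Gaussian (skew ~ 0), so the
# weights themselves are strongly right-skewed, i.e. lognormal.
print(skew(w_mult) > 1.0, abs(skew(np.log(w_mult))) < 0.2)   # True True
```

Because log w_mult is a sum of independent increments, it converges to a Gaussian, making w_mult lognormal; the additively updated weights stay near-Gaussian. This is the sense in which observed lognormal rates, weights, and gains point to multiplicative (Hebbian) plasticity.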

38. Long-term adaptation with power-law dynamics (Zilany et al. 2009)

"... A model of rate adaptation at the synapse between inner hair cells and auditory-nerve (AN) fibers that includes both exponential and power-law dynamics is presented here. Exponentially adapting components with rapid and short-term time constants, which are mainly responsible for shaping onset responses, are followed by two parallel paths with power-law adaptation that provide slowly and rapidly adapting responses. ... The proposed model is capable of accurately predicting several sets of AN data, including amplitude-modulation transfer functions, long-term adaptation, forward masking, and adaptation to increments and decrements in the amplitude of an ongoing stimulus."

39. Loss of phase-locking in non-weakly coupled inhib. networks of type-I neurons (Oh and Matveev 2009)

"... Here we examine the loss of synchrony caused by an increase in inhibitory coupling in networks of type-I Morris–Lecar model oscillators, which is characterized by a period-doubling cascade and leads to mode-locked states with alternation in the firing order of the two cells, as reported recently by Maran and Canavier (J Comput Neurosci, 2008) for a network of Wang-Buzsáki model neurons. Although alternating-order firing has been previously reported as a near-synchronous state, we show that the stable phase difference between the spikes of the two Morris–Lecar cells can constitute as much as 70% of the unperturbed oscillation period. Further, we examine the generality of this phenomenon for a class of type-I oscillators that are close to their excitation thresholds, and provide an intuitive geometric description of such "leap-frog" dynamics. ..."

40. Mathematics for Neuroscientists (Gabbiani and Cox 2010)

This textbook provides a good source for learning the mathematics relevant to computational neuroscience, as well as the neuroscience itself. The 232 computer code examples from the book are available through the code link (http://www.elsevierdirect.com/companions/9780123748829/pictures/code/index.html) and on the page below, copied from the book's companion web site.

41. MATLAB for brain and cognitive scientists (Cohen 2017)

"... MATLAB for Brain and Cognitive Scientists takes readers from beginning to intermediate and advanced levels of MATLAB programming, helping them gain real expertise in applications that they will use in their work. The book offers a mix of instructive text and rigorous explanations of MATLAB code along with programming tips and tricks. The goal is to teach the reader how to program data analyses in neuroscience and psychology. Readers will learn not only how to but also how not to program, with examples of bad code that they are invited to correct or improve. Chapters end with exercises that test and develop the skills taught in each chapter. Interviews with neuroscientists and cognitive scientists who have made significant contributions to their field using MATLAB appear throughout the book. ..."

42. Mature and young adult-born dentate granule cell models (T2N interface) (Beining et al. 2017)

"... Here, we present T2N, a powerful interface to control NEURON with Matlab and TREES toolbox, which supports generating models stable over a broad range of reconstructed and synthetic morphologies. We illustrate this for a novel, highly-detailed active model of dentate granule cells (GCs) replicating a wide palette of experiments from various labs. By implementing known differences in ion channel composition and morphology, our model reproduces data from mouse or rat, mature or adult-born GCs as well as pharmacological interventions and epileptic conditions. ... T2N is suitable for creating robust models useful for large-scale networks that could lead to novel predictions. ..." See ModelDB accession number 231818 for NEURON-only code.

43. Method for counting motor units in mice (Major et al 2007)

"... Our goal was to develop an efficient method to determine the number of motor neurons making functional connections to muscle in a transgenic mouse model of amyotrophic lateral sclerosis (ALS). We developed a novel protocol for motor unit number estimation (MUNE) using incremental stimulation. The method involves analysis of twitch waveforms using a new software program, ITS-MUNE, designed for interactive calculation of motor unit number. The method was validated by testing simulated twitch data from a mathematical model of the neuromuscular system. Computer simulations followed the same stimulus-response protocol and produced waveform data that were indistinguishable from experiments. ... The ITS-MUNE analysis method has the potential to quantitatively measure the progression of motor neuron diseases and therefore the efficacy of treatments designed to alleviate pathologic processes of muscle denervation." The software is available for download under the "ITS-MUNE software" link (see below for links).

44. Method of probabilistic principal surfaces (PPS) (Chang and Ghosh 2001)

Principal curves and surfaces are nonlinear generalizations of principal components and subspaces, respectively. They can provide an insightful summary of high-dimensional data not typically attainable by classical linear methods. See the paper for further details. The MATLAB code supplied on the authors' website calculates probabilistic principal surfaces for benchmark data sets.

45. | Microglial cytokine network (Anderson et al., 2015) | |

This is an ODE model of autocrine/paracrine microglial cytokine interactions. Simulations include analyses of neuroinflammation mechanisms in the context of adaptation and tolerance to LPS. | ||

46. | Model of generalized periodic discharges in acute hepatic encephalopathy (Song et al 2019) | |

"Acute hepatic encephalopathy (AHE) due to acute liver failure is a common form of delirium, a state of confusion, impaired attention, and decreased arousal. The electroencephalogram (EEG) in AHE often exhibits a striking abnormal pattern of brain activity, which epileptiform discharges repeat in a regular repeating pattern. This pattern is known as generalized periodic discharges, or triphasic-waves (TPWs). While much is known about the neurophysiological mechanisms underlying AHE, how these mechanisms relate to TPWs is poorly understood. In order to develop hypotheses how TPWs arise, our work builds a computational model of AHE (AHE-CM), based on three modifications of the well-studied Liley model which emulate mechanisms believed central to brain dysfunction in AHE: increased neuronal excitability, impaired synaptic transmission, and enhanced postsynaptic inhibition..." | ||

47. | Model of neural responses to amplitude-modulated tones (Nelson and Carney 2004) | |

"A phenomenological model with time-varying excitation and inhibition was developed to study possible neural mechanisms underlying changes in the representation of temporal envelopes along the auditory pathway. A modified version of an existing auditory-nerve model (Zhang et al., J. Acoust. Soc. Am. 109, 648–670 (2001) was used to provide inputs to higher hypothetical processing centers. Model responses were compared directly to published physiological data at three levels: the auditory nerve, ventral cochlear nucleus, and inferior colliculus. ..." | ||

48. | Modeling conductivity profiles in the deep neocortical pyramidal neuron (Wang K et al. 2013) | |

"With the rapid increase in the number of technologies aimed at observing electric activity inside the brain, scientists have felt the urge to create proper links between intracellular- and extracellular-based experimental approaches. Biophysical models at both physical scales have been formalized under assumptions that impede the creation of such links. In this work, we address this issue by proposing amulticompartment model that allows the introduction of complex extracellular and intracellular resistivity profiles. This model accounts for the geometrical and electrotonic properties of any type of neuron through the combination of four devices: the integrator, the propagator, the 3D connector, and the collector. ..." | ||

49. | Models analysis for auditory-nerve synapse (Zhang and Carney 2005) | |

"A general mathematical approach was proposed to study phenomenological models of the inner-hair-cell and auditory-nerve (AN) synapse complex. Two models (Meddis, 1986; Westerman and Smith, 1988) were studied using this unified approach. The responses of both models to a constant-intensity stimulus were described mathematically, and the relationship between model parameters and response characteristics was investigated. ...". The paper then modifies these to make a more physiologically realistic model. | ||

50. | Models for diotic and dichotic detection (Davidson et al. 2009) | |

"Several psychophysical models for masked detection were evaluated using reproducible noises. The data were hit and false-alarm rates from three psychophysical studies of detection of 500-Hz tones in reproducible noise under diotic (N0S0) and dichotic (N0Spi) conditions with four stimulus bandwidths (50, 100, 115, and 2900 Hz). Diotic data were best predicted by an energy-based multiple-detector model that linearly combined stimulus energies at the outputs of several critical-band filters. The tone-plus-noise trials in the dichotic data were best predicted by models that linearly combined either the average values or the standard deviations of interaural time and level differences; however, these models offered no predictions for noise-alone responses. ...". The Breebaart et al. 2001 and the Dau et al. 1996 models are supplied at the Carney lab web site. | ||

51. | Multi-timescale adaptive threshold model (Kobayashi et al 2009) | |

" ... In this study, we devised a simple, fast computational model that can be tailored to any cortical neuron not only for reproducing but also for predicting a variety of spike responses to greatly fluctuating currents. The key features of this model are a multi-timescale adaptive threshold predictor and a nonresetting leaky integrator. This model is capable of reproducing a rich variety of neuronal spike responses, including regular spiking, intrinsic bursting, fast spiking, and chattering, by adjusting only three adaptive threshold parameters. ..." | ||

52. | Multiscale model of excitotoxicity in PD (Muddapu and Chakravarthy 2020) | |

Parkinson's disease (PD) is a neurodegenerative disorder caused by loss of dopaminergic neurons in Substantia Nigra pars compacta (SNc). Although the exact cause of cell death is not clear, the hypothesis that metabolic deficiency is a key factor has been gaining attention in recent years. In the present study, we investigate this hypothesis using a multi-scale computational model of the subsystem of the basal ganglia comprising Subthalamic Nucleus (STN), Globus Pallidus externa (GPe) and SNc. The proposed model is a multiscale model in that interactions among the three nuclei are simulated using more abstract Izhikevich neuron models, while the molecular pathways involved in cell death of SNc neurons are simulated in terms of detailed chemical kinetics. Simulation results obtained from the proposed model showed that energy deficiencies occurring at cellular and network levels could precipitate the excitotoxic loss of SNc neurons in PD. At the subcellular level, the model shows how calcium elevation leads to apoptosis of SNc neurons. The therapeutic effects of several neuroprotective interventions are also simulated in the model. From neuroprotective studies, it was clear that glutamate inhibition and apoptotic signal blocker therapies were able to halt the progression of SNc cell loss when compared to other therapeutic interventions, which only slow down the progression of SNc cell loss. | ||

53. | Multistability of clustered states in a globally inhibitory network (Chandrasekaran et al. 2009) | |

"We study a network of m identical excitatory cells projecting excitatory synaptic connections onto a single inhibitory interneuron, which is reciprocally coupled to all excitatory cells through inhibitory synapses possessing short-term synaptic depression. We find that such a network with global inhibition possesses multiple stable activity patterns with distinct periods, characterized by the clustering of the excitatory cells into synchronized sub-populations. We prove the existence and stability of n-cluster solutions in a m-cell network. ... Implications for temporal coding and memory storage are discussed." | ||

54. | Network topologies for producing limited sustained activation (Kaiser and Hilgetag 2010) | |

Uses networks of cellular automata to test hypotheses about network topologies that can produce limited, sustained activity. Inspired by empirically-based ideas about neocortical architecture, but conceived and implemented at a level of abstraction that is not closely linked to empirical observations. | ||

55. | Neural field model to reconcile structure with function in V1 (Rankin & Chavane 2017) | |

"Voltage-sensitive dye imaging experiments in primary visual cortex (V1) have shown that local, oriented visual stimuli elicit stable orientation-selective activation within the stimulus retinotopic footprint. The cortical activation dynamically extends far beyond the retinotopic footprint, but the peripheral spread stays non-selective—a surprising finding given a number of anatomo-functional studies showing the orientation specificity of long-range connections. Here we use a computational model to investigate this apparent discrepancy by studying the expected population response using known published anatomical constraints. The dynamics of input-driven localized states were simulated in a planar neural field model with multiple sub-populations encoding orientation. The realistic connectivity profile has parameters controlling the clustering of long-range connections and their orientation bias. We found substantial overlap between the anatomically relevant parameter range and a steep decay in orientation selective activation that is consistent with the imaging experiments. In this way our study reconciles the reported orientation bias of long-range connections with the functional expression of orientation selective neural activity. Our results demonstrate this sharp decay is contingent on three factors, that long-range connections are sufficiently diffuse, that the orientation bias of these connections is in an intermediate range (consistent with anatomy) and that excitation is sufficiently balanced by inhibition. Conversely, our modelling results predict that, for reduced inhibition strength, spurious orientation selective activation could be generated through long-range lateral connections. Furthermore, if the orientation bias of lateral connections is very strong, or if inhibition is particularly weak, the network operates close to an instability leading to unbounded cortical activation. ..." | ||

56. | Neural Mass Model for relationship between Brain Rhythms + Functional Connectivity (Ricci et al '21) | |

The Neural Mass Model (NMM) generates biologically reliable mean field potentials of four interconnected regions of interest (ROIs) of the cortex, each simulating a different brain rhythm (in theta, alpha, beta and gamma ranges). These neuroelectrical signals originate from the assumption that ROIs influence each other via excitatory or bisynaptic inhibitory connections. Besides receiving long-range synapses from other ROIs, each one receives an external input and superimposed Gaussian white noise. We used the NMM to simulate different connectivity networks of four ROIs, by varying both the synaptic strengths and the inputs. The purpose of this study is to investigate how the transmission of brain rhythms behaves under linear and nonlinear conditions. To this aim, we investigated the performance of eight Functional Connectivity (FC) estimators (Correlation, Delayed Correlation, Coherence, Lagged Coherence, Temporal Granger Causality, Spectral Granger Causality, Phase Synchronization and Transfer Entropy) in detecting the connectivity network changes. Results suggest that when a ROI works in the linear region, its capacity to transmit its rhythm increases, while when it saturates, the oscillatory activity becomes strongly affected by other ROIs. Software included here allows the simulation of mean field potentials of four interconnected ROIs, their visualization, both in time and frequency domains, and the estimation of the related FC with eight different methods (for Transfer Entropy the Trentool package is needed). | ||
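As a toy illustration of the simplest estimators in that comparison, the sketch below generates two coupled surrogate signals (a delayed, noisy copy stands in for a transmitted rhythm; these are not the paper's neural mass equations) and recovers the coupling with delayed correlation:

```python
import numpy as np

def coupled_signals(n=5000, k=0.8, delay=5, seed=0):
    """Surrogate 'ROI' signals: y receives a delayed, scaled copy of x
    plus noise. Illustrative stand-in, not the paper's NMM."""
    rng = np.random.default_rng(seed)
    x = np.convolve(rng.standard_normal(n), np.ones(20) / 20, mode="same")
    y = k * np.roll(x, delay) + 0.05 * rng.standard_normal(n)
    return x, y

def delayed_correlation(x, y, max_lag=20):
    """Delayed correlation: the lag maximizing |corr(x(t), y(t+lag))|,
    one of the simplest FC estimators in the abstract's list."""
    best_lag, best_r = 0, 0.0
    for lag in range(-max_lag, max_lag + 1):
        xs = x[max_lag:-max_lag]
        ys = np.roll(y, -lag)[max_lag:-max_lag]  # y shifted forward by lag
        r = np.corrcoef(xs, ys)[0, 1]
        if abs(r) > abs(best_r):
            best_lag, best_r = lag, r
    return best_lag, best_r
```

The same surrogate driver could equally feed Granger causality or transfer entropy estimators; the point is only that directed, delayed coupling shows up as a strong correlation at a nonzero best lag.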

57. | Neural mass model of spindle generation in the isolated thalamus (Schellenberger Costa et al. 2016) | |

The model generates different oscillatory patterns in the thalamus, including delta and spindle band oscillations. | ||

58. | Neural mass model of the neocortex under sleep regulation (Costa et al 2016) | |

This model generates typical human EEG patterns of sleep stages N2/N3 as well as wakefulness and REM. It further contains a sleep regulatory component that lets the model transition between those stages independently. | ||

59. | Neural mass model of the sleeping thalamocortical system (Schellenberger Costa et al 2016) | |

This model generates typical human EEG data of sleep stages N2/N3 as well as wakefulness and REM sleep. | ||

60. | Neural model of two-interval discrimination (Machens et al 2005) | |

Two-interval discrimination involves comparison of two stimuli that are presented at different times. It has three phases: loading, in which the first stimulus is perceived and stored in working memory; maintenance of working memory; decision making, in which the second stimulus is perceived and compared with the first. In behaving monkeys, each phase is associated with characteristic firing activity of neurons in the prefrontal cortex. This model implements both working memory and decision making with a mutual inhibition network that reproduces all three phases of two-interval discrimination. Machens, C.K., Romo, R., and Brody, C.D. Flexible control of mutual inhibition: a neural model of two-interval discrimination. Science 307:1121-1124, 2005. | ||

61. | Neural recruitment during synchronous multichannel microstimulation (Hokanson et al 2018) | |

" ...The effects of field interactions on neuronal recruitment depend on several factors, which have been studied extensively at the macro-scale but have been overlooked in the case of high density arrays. Here, we report that field interactions can significantly affect neural recruitment, even with low amplitude stimulation. We created a computational model of peripheral nerve axons to estimate stimulation parameters sufficient to generate neural recruitment during synchronous and asynchronous stimulation on two microelectrodes located within the peripheral nerve. Across a range of stimulus amplitudes, the model predicted that synchronous stimulation on adjacent electrodes (400 µm separation), would recruit 2-3 times more neurons than during asynchronous stimulation. ..." | ||

62. | NeuroManager: a workflow analysis based simulation management engine (Stockton & Santamaria 2015) | |

"We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. ..." | ||

63. | NN for proto-object based contour integration and figure-ground segregation (Hu & Niebur 2017) | |

"Visual processing of objects makes use of both feedforward and feedback streams of information. However, the nature of feedback signals is largely unknown, as is the identity of the neuronal populations in lower visual areas that receive them. Here, we develop a recurrent neural model to address these questions in the context of contour integration and figure-ground segregation. A key feature of our model is the use of grouping neurons whose activity represents tentative objects (“proto-objects”) based on the integration of local feature information. Grouping neurons receive input from an organized set of local feature neurons, and project modulatory feedback to those same neurons. ..." | ||

64. | Nodose sensory neuron (Schild et al. 1994, Schild and Kunze 1997) | |

This is a Simulink implementation of the model described in the Schild et al. 1994 and Schild and Kunze 1997 papers on nodose sensory neurons. These papers describe the sensitivity of these models to their parameters and the match of the models to experimental data. | ||

65. | Nonlinear neuronal computation based on physiologically plausible inputs (McFarland et al. 2013) | |

"... Here we present an approach for modeling sensory processing, termed the Nonlinear Input Model (NIM), which is based on the hypothesis that the dominant nonlinearities imposed by physiological mechanisms arise from rectification of a neuron’s inputs. Incorporating such ‘upstream nonlinearities’ within the standard linear-nonlinear (LN) cascade modeling structure implicitly allows for the identification of multiple stimulus features driving a neuron’s response, which become directly interpretable as either excitatory or inhibitory. Because its form is analogous to an integrate-and-fire neuron receiving excitatory and inhibitory inputs, model fitting can be guided by prior knowledge about the inputs to a given neuron, and elements of the resulting model can often result in specific physiological predictions. Furthermore, by providing an explicit probabilistic model with a relatively simple nonlinear structure, its parameters can be efficiently optimized and appropriately regularized. ... ” | ||

66. | Phase-locking analysis with transcranial magneto-acoustical stimulation (Yuan et al 2017) | |

"Transcranial magneto-acoustical stimulation (TMAS) uses ultrasonic waves and a static magnetic field to generate electric current in nerve tissues for the purpose of modulating neuronal activities. It has the advantage of high spatial resolution and penetration depth. Neuronal firing rhythms carry and transmit nerve information in neural systems. In this study, we investigated the phase-locking characteristics of neuronal firing rhythms with TMAS based on the Hodgkin-Huxley neuron model. The simulation results indicate that the modulation frequency of ultrasound can affect the phase-locking behaviors. The results of this study may help us to explain the potential firing mechanism of TMAS." | ||

67. | Polychronization: Computation With Spikes (Izhikevich 2005) | |

"We present a minimal spiking network that can polychronize, that is, exhibit reproducible time-locked but not synchronous firing patterns with millisecond precision, as in synfire braids. The network consists of cortical spiking neurons with axonal conduction delays and spiketiming- dependent plasticity (STDP); a ready-to-use MATLAB code is included. It exhibits sleeplike oscillations, gamma (40 Hz) rhythms, conversion of firing rates to spike timings, and other interesting regimes. ... To our surprise, the number of coexisting polychronous groups far exceeds the number of neurons in the network, resulting in an unprecedented memory capacity of the system. ..." | ||

68. | Prefrontal cortical mechanisms for goal-directed behavior (Hasselmo 2005) | |

".. a model of prefrontal cortex function emphasizing the influence of goal-related activity on the choice of the next motor output. ... Different neocortical minicolumns represent distinct sensory input states and distinct motor output actions. The dynamics of each minicolumn include separate phases of encoding and retrieval. During encoding, strengthening of excitatory connections forms forward and reverse associations between each state, the following action, and a subsequent state, which may include reward. During retrieval, activity spreads from reward states throughout the network. The interaction of this spreading activity with a specific input state directs selection of the next appropriate action. Simulations demonstrate how these mechanisms can guide performance in a range of goal directed tasks, and provide a functional framework for some of the neuronal responses previously observed in the medial prefrontal cortex during performance of spatial memory tasks in rats." | ||

69. | Prefrontal–striatal Parkinson's comp. model of multicue category learning (Moustafa and Gluck 2011) | |

"... In this model, PFC dopamine is key for attentional learning, whereas basal ganglia dopamine, consistent with other models, is key for reinforcement and motor learning. The model assumes that competitive dynamics among PFC neurons is the neural mechanism underlying stimulus selection with limited attentional resources, whereas competitive dynamics among striatal neurons is the neural mechanism underlying action selection. According to our model, PD is associated with decreased phasic and tonic dopamine levels in both PFC and basal ganglia. ..." | ||

70. | Quantitative assessment of computational models for retinotopic map formation (Hjorth et al. 2015) | |

"Molecular and activity-based cues acting together are thought to guide retinal axons to their terminal sites in vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. ..." | ||

71. | Reduction of nonlinear ODE systems possessing multiple scales (Clewley et al. 2005) | |

" ... We introduce a combined numerical and analytical technique that aids the identification of structure in a class of systems of nonlinear ordinary differential equations (ODEs) that are commonly applied in dynamical models of physical processes. ... These methods have been incorporated into a new software tool named Dssrt, which we demonstrate on a limit cycle of a synaptically driven Hodgkin–Huxley neuron model." | ||

72. | Response properties of an integrate and fire model (Zhang and Carney 2005) | |

"A computational technique is described for calculation of the interspike interval and poststimulus time histograms for the responses of an integrate-and-fire model to arbitrary inputs. ... For stationary inputs, the regularity of the output was studied in detail for various model parameters. For nonstationary inputs, the effects of the model parameters on the output synchronization index were explored. ... these response properties have been reported for some cells in the ventral cochlear nucleus in the auditory brainstem. " | ||

73. | Role of KCNQ1 and IKs in cardiac repolarization (Silva, Rudy 2005) | |

Detailed Markov models of IKs (the slow delayed rectifier K+ current) and its alpha-subunit KCNQ1 were developed. The model is compared to experiment in the paper. The role of IKs in disease and drug treatments is elucidated (the prevention of excessive action potential prolongation and development of arrhythmogenic early afterdepolarizations). See the paper for details. | ||

74. | Single-cell comprehensive biophysical model of SN pars compacta (Muddapu & Chakravarthy 2021) | |

Parkinson’s disease (PD) is caused by the loss of dopaminergic cells in substantia nigra pars compacta (SNc); however, the decisive cause of this inexorable cell loss is not clearly elucidated. We hypothesize that “Energy deficiency at a sub-cellular/cellular/systems-level can be a common underlying cause for SNc cell loss in PD.” Here, we propose a comprehensive computational model of the SNc cell which helps us to understand the pathophysiology of neurodegeneration at the subcellular level in PD. We were able to show how deficits in the supply of energy substrates (glucose and oxygen) lead to a deficit in ATP, and furthermore, deficits in ATP are the common factor underlying the pathological molecular-level changes including alpha-synuclein aggregation, ROS formation, calcium elevation, and dopamine dysfunction. The model also suggests that hypoglycemia plays a more crucial role in leading to ATP deficits than hypoxia. We believe that the proposed model provides an integrated modelling framework to understand the neurodegenerative processes underlying PD. | ||

75. | Squid axon (Hodgkin, Huxley 1952) (LabAXON) | |

The classic HH model of squid axon membrane implemented in LabAXON. Hodgkin, A.L., Huxley, A.F. (1952) A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol. 117:500-544. | ||
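For reference, the 1952 membrane equations themselves fit in a short script. The version below uses the modern −65 mV resting convention and plain forward Euler with a small step, so it is a sketch of the equations rather than LabAXON's implementation:

```python
import numpy as np

def hh_spike_count(I_amp, t_end=100.0, dt=0.01):
    """Forward-Euler integration of the Hodgkin-Huxley squid-axon
    membrane for a constant current I_amp (uA/cm^2); counts upward
    crossings of 0 mV as spikes."""
    gna, gk, gl = 120.0, 36.0, 0.3       # max conductances (mS/cm^2)
    ena, ek, el = 50.0, -77.0, -54.387   # reversal potentials (mV)
    v, m, h, n = -65.0, 0.053, 0.596, 0.317  # resting steady state
    count, above = 0, False
    for _ in range(int(t_end / dt)):
        am = 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
        bm = 4.0 * np.exp(-(v + 65.0) / 18.0)
        ah = 0.07 * np.exp(-(v + 65.0) / 20.0)
        bh = 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
        an = 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
        bn = 0.125 * np.exp(-(v + 65.0) / 80.0)
        m += dt * (am * (1.0 - m) - bm * m)
        h += dt * (ah * (1.0 - h) - bh * h)
        n += dt * (an * (1.0 - n) - bn * n)
        i_ion = (gna * m**3 * h * (v - ena) + gk * n**4 * (v - ek)
                 + gl * (v - el))
        v += dt * (I_amp - i_ion)        # Cm = 1 uF/cm^2
        if v > 0.0 and not above:
            count += 1
        above = v > 0.0
    return count
```

A constant current of 10 uA/cm^2 is well above the repetitive-firing threshold and yields sustained spiking, while zero current leaves the membrane at rest.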

76. | Stimulated and physiologically induced APs: frequency and fiber diameter (Sadashivaiah et al 2018) | |

"... In this study, we aim to quantify the effects of stimulation frequency and fiber diameter on AP (Action Potential) interactions involving collisions and loss of excitability. We constructed a mechanistic model of a myelinated nerve fiber receiving two inputs: the underlying physiological activity at the terminal end of the fiber, and an external stimulus applied to the middle of the fiber. We define conduction reliability as the percentage of physiological APs that make it to the somatic end of the nerve fiber. At low input frequencies, conduction reliability is greater than 95% and decreases with increasing frequency due to an increase in AP interactions. Conduction reliability is less sensitive to fiber diameter and only decreases slightly with increasing fiber diameter. Finally, both the number and type of AP interactions significantly vary with both input frequencies and fiber diameter. ..." | ||

77. | Sympathetic neuron (Wheeler et al 2004) | |

This study shows how synaptic convergence and plasticity can interact to generate synaptic gain in autonomic ganglia and thereby enhance homeostatic control. Using a conductance-based computational model of an idealized sympathetic neuron, we simulated the postganglionic response to noisy patterns of presynaptic activity and found that a threefold amplification in postsynaptic spike output can arise in ganglia, depending on the number and strength of nicotinic synapses, the presynaptic firing rate, the extent of presynaptic facilitation, and the expression of muscarinic and peptidergic excitation. See references for details. | ||

78. | Synaptic damage underlies EEG abnormalities in postanoxic encephalopathy (Ruijter et al 2017) | |

"... In postanoxic coma, EEG patterns indicate the severity of encephalopathy and typically evolve in time. We aim to improve the understanding of pathophysiological mechanisms underlying these EEG abnormalities. ... We used a mean field model comprising excitatory and inhibitory neurons, local synaptic connections, and input from thalamic afferents. Anoxic damage is modeled as aggravated short-term synaptic depression, with gradual recovery over many hours. Additionally, excitatory neurotransmission is potentiated, scaling with the severity of anoxic encephalopathy. Simulations were compared with continuous EEG recordings of 155 comatose patients after cardiac arrest. ..." | ||

79. | Synaptic strengths are critical in creating the proper output phasing in a CPG (Gunay et al 2019) | |

"Identified neurons and the networks they compose produce stereotypical, albeit individually unique, activity across members of a species. We propose, for a motor circuit driven by a central pattern generator (CPG), that the uniqueness derives mainly from differences in synaptic strength rather than from differences in intrinsic membrane conductances. We studied a dataset of recordings from six leech (Hirudo sp.) heartbeat control networks, containing complete spiking activity patterns from inhibitory premotor interneurons, motor output spike patterns, and synaptic strength patterns to investigate the source of uniqueness. We used a conductance-based multicompartmental motor neuron model to construct a bilateral motor circuit model, and controlled it by playing recorded input spike trains from premotor interneurons to generate output inhibitory synaptic patterns similar to experimental measurements. By generating different synaptic conductance parameter sets of this circuit model, we found that relative premotor synaptic strengths impinging onto motor neurons must be different across individuals to produce animal-specific output burst phasing. Obtaining unique outputs from each individual’s circuit model did not require different intrinsic ionic conductance parameters. Furthermore, changing intrinsic conductances failed to compensate for modified synaptic strength patterns. ..." | ||

80. | The dynamics underlying pseudo-plateau bursting in a pituitary cell model (Teka et al. 2011) | |

" ... pseudo-plateau bursts, are unlike bursts studied mathematically in neurons (plateau bursting) and the standard fast-slow analysis used for plateau bursting is of limited use. Using an alternative fast-slow analysis, with one fast and two slow variables, we show that pseudo-plateau bursting is a canard-induced mixed mode oscillation. ..." See paper for other results. | ||

81. | Two-cell inhibitory network bursting dynamics captured in a one-dimensional map (Matveev et al 2007) | |

" ... Here we describe a simple method that allows us to investigate the existence and stability of anti-phase bursting solutions in a network of two spiking neurons, each possessing a T-type calcium current and coupled by reciprocal inhibition. We derive a one-dimensional map which fully characterizes the genesis and regulation of anti-phase bursting arising from the interaction of the T-current properties with the properties of synaptic inhibition. ..." | ||

82. | Voltage and light-sensitive Channelrhodopsin-2 model (ChR2) (Williams et al. 2013) | |

" ... Focusing on one of the most widely used ChR2 mutants (H134R) with enhanced current, we collected a comprehensive experimental data set of the response of this ion channel to different irradiances and voltages, and used these data to develop a model of ChR2 with empirically-derived voltage- and irradiance- dependence, where parameters were fine-tuned via simulated annealing optimization. This ChR2 model offers: 1) accurate inward rectification in the current-voltage response across irradiances; 2) empirically-derived voltage- and light-dependent kinetics (activation, deactivation and recovery from inactivation); and 3) accurate amplitude and morphology of the response across voltage and irradiance settings. Temperature-scaling factors (Q10) were derived and model kinetics was adjusted to physiological temperatures. ... " |