Rhesus Monkey Layer 3 Pyramidal Neurons: Young vs aged PFC (Coskren et al. 2015)

Layer 3 (L3) pyramidal neurons in the lateral prefrontal cortex (LPFC) of rhesus monkeys exhibit dendritic regression, spine loss and increased action potential (AP) firing rates during normal aging. The relationship between these structural and functional alterations, if any, is unknown. Computational models built from the digital reconstructions, with Hodgkin-Huxley and AMPA channels, allowed us to assess relationships between demonstrated age-related changes and to predict physiological changes that have not yet been tested empirically. Tuning passive parameters for each model predicted significantly higher membrane resistance (Rm) in aged versus young neurons. This Rm increase alone did not account for the empirically observed f-I curves, but coupling these Rm values with subtle differences in morphology and membrane capacitance (Cm) did. The predicted differences in passive parameters (or other parameters with similar effects) are mathematically plausible, but must be tested empirically.
1. Coskren PJ, Luebke JI, Kabaso D, Wearne SL, Yadav A, Rumbell T, Hof PR, Weaver CM (2015) Functional consequences of age-related morphologic changes to pyramidal neurons of the rhesus monkey prefrontal cortex. J Comput Neurosci 38:263-83 [PubMed]
Model Information
Model Type: Neuron or other electrically excitable cell;
Brain Region(s)/Organism:
Cell Type(s): Neocortex L2/3 pyramidal GLU cell;
Channel(s): I Na,t; I A; I K; I M; I h; I K,Ca; I Calcium; I_AHP;
Gap Junctions:
Simulation Environment: NEURON;
Model Concept(s): Influence of Dendritic Geometry; Detailed Neuronal Models; Action Potentials; Aging/Alzheimer's;
Implementer(s): Weaver, Christina [christina.weaver at fandm.edu];
This directory was written by Patrick J Coskren, pcoskren@icloud.com.
Last modified August 2014.

This is the project directory for the second paper with the Hof
lab.  A particular goal is to make it easy to regenerate the paper's
computations in a turnkey fashion.

To run, execute the file Scripts/regeneratePaperComputations.sh from the
directory containing this README.TXT file, like so:

bash ./Scripts/regeneratePaperComputations.sh

While this is a little awkward to invoke, it avoids a lot of "../" paths inside
the script, which I considered the larger gain.

Scripts/ParameterSets.csv contains a list of named parameter sets; the active
set is selected in the first non-comment line of regeneratePaperComputations.sh.
Edit that line to run all of the simulations with a different set of
parameters.  All the .hoc scripts have been modified to read their parameters
from this file.
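The selection mechanism can be sketched as follows.  The CSV's actual column
layout is not documented in this README, so the file contents below (set names
and parameter columns) are purely hypothetical; only the "first non-comment
line" idea comes from the text above.

```shell
# Hypothetical stand-in for Scripts/ParameterSets.csv; the real column
# layout is not documented in this README.
cat > /tmp/ParameterSets.csv <<'EOF'
# name,Rm,Cm
Baseline,15000,1.0
HighRm,25000,1.0
EOF

# Pick the set named on the first non-comment line:
SET=$(grep -v '^#' /tmp/ParameterSets.csv | head -n 1 | cut -d, -f1)
echo "$SET"
# prints: Baseline
```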

regeneratePaperComputations.sh assumes your computer has a copy of GNU
parallel, an extremely handy utility that takes a set of commands, splits
them across multiple CPU cores, and collates the results for you.
<http://www.gnu.org/software/parallel/>  For computations like the ones here,
which are largely independent of each other and require little RAM or disk
space, this gives a nearly linear speedup with the number of cores.  On some
systems you may want to add an extra argument to the calls to parallel that
sets the number of jobs manually: some Intel chips provide "hyperthreading",
which presents two logical CPU cores to the operating system for every
physical core that's present.  This is handy when you're running lots of jobs
that aren't CPU-bound (the typical case on most computers), but when the CPU
is the limiting factor, hyperthreading just adds overhead, and you're better
off setting the job count to the number of physical cores you really have.
See 'man parallel' for details.  The extra overhead isn't that much, so if
you're not sure whether your CPU uses hyperthreading, there's no harm in just
not worrying about it.
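As a sketch of the manual job-count idea: assuming hyperthreading doubles the
logical core count (an assumption; check your own hardware), one might compute
the argument for parallel's documented -j/--jobs option like this:

```shell
# Count logical cores (works on Linux and macOS).
LOGICAL=$(getconf _NPROCESSORS_ONLN)

# Assumption: hyperthreading doubles the logical count, so halve it
# to approximate the number of physical cores.
JOBS=$((LOGICAL / 2))
if [ "$JOBS" -lt 1 ]; then JOBS=1; fi

# GNU parallel's -j (--jobs) option caps the number of concurrent jobs:
echo "parallel -j $JOBS"
```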

Another goal of this code reorganization was to move all computation (stats,
generally) from Excel to R; not because I have a problem with spreadsheets per
se, but because I found they made it difficult to audit the formulas to be
certain they were doing what I thought they were.  Moving all the data tables
to .CSV and all the computations to R provides an extra degree of transparency
and reproducibility.

The code in Scripts/NeuronMechanims will need to be rebuilt with nrnivmodl on
the host system.

The only purpose of NumericalResults-Baseline is to provide something to diff
against after script changes intended only to clean up code, to make sure
that I didn't break something in the process.
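That regression check can be illustrated with a self-contained sketch; the
directory and file names below are stand-ins, since only
NumericalResults-Baseline is named in this directory:

```shell
# Stand-in result trees; the real comparison is between freshly
# regenerated output and NumericalResults-Baseline.
mkdir -p /tmp/results-new /tmp/results-baseline
echo "spikes,12" > /tmp/results-new/cell1.csv
echo "spikes,12" > /tmp/results-baseline/cell1.csv

# diff -r recurses into both trees; a silent run (exit status 0)
# means nothing changed.
if diff -r /tmp/results-new /tmp/results-baseline >/dev/null; then
    echo "no differences"
fi
# prints: no differences
```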
