Circuits that contain the Cell: Abstract integrate-and-fire neuron

(Like the simple electrical model of a neuron first introduced by Lapicque (1907), only without the leak current.)
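The leak-free variant described above can be sketched in a few lines: the membrane potential simply integrates the input current and resets after crossing a threshold. This is a minimal illustration only; parameter values, units, and the function name are illustrative and not taken from any specific model on this page.

```python
# Minimal sketch of a leak-free ("perfect") integrate-and-fire neuron:
# the Lapicque model with the leak term dropped. All values are illustrative.

def simulate_if(i_input, threshold=1.0, v_reset=0.0, dt=1.0, c_m=1.0):
    """Integrate input current; spike and reset when the threshold is crossed.

    i_input: input current at each time step; dt: step size; returns spike times.
    """
    v = v_reset
    spike_times = []
    for step, i_t in enumerate(i_input):
        v += (i_t / c_m) * dt            # dV/dt = I/C: pure integration, no leak
        if v >= threshold:
            spike_times.append(step * dt)
            v = v_reset                  # instantaneous reset after the spike
    return spike_times

# A constant drive of 0.25 crosses the threshold of 1.0 every 4 steps.
spikes = simulate_if([0.25] * 10)        # spikes at t = 3.0 and t = 7.0
```

Because there is no leak, any nonzero constant input eventually produces a spike, however weak the drive; adding a decay term to `v` at each step would turn this into the more common leaky integrate-and-fire model.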
Models and descriptions:
1. A computational model of systems memory consolidation and reconsolidation (Helfer & Shultz 2019)
A neural-network framework for modeling systems memory consolidation and reconsolidation.
2. A NN with synaptic depression for testing the effects of connectivity on dynamics (Jacob et al 2019)
Here we used a 10,000-neuron model. The neurons are a mixture of excitatory and inhibitory integrate-and-fire neurons connected by synapses that exhibit synaptic depression. Three connectivity paradigms were tested to look for spontaneous transitions between interictal spiking and seizure: uniform, small-world, and scale-free. All three model types are included here.
3. First-Spike-Based Visual Categorization Using Reward-Modulated STDP (Mozafari et al. 2018)
"...Here, for the first time, we show that reinforcement learning (RL) can be used efficiently to train a spiking neural network (SNN) to perform object recognition in natural images without using an external classifier. We used a feedforward convolutional SNN and a temporal coding scheme where the most strongly activated neurons fire first, while less activated ones fire later, or not at all. In the highest layers, each neuron was assigned to an object category, and it was assumed that the stimulus category was the category of the first neuron to fire. ..."
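The first-spike readout in the quoted abstract can be sketched as follows: each output neuron carries an assigned category, and the predicted stimulus category is that of the neuron whose first spike comes earliest. The neuron names and category labels below are illustrative, not taken from the paper.

```python
# Sketch of a first-spike readout: the predicted label is the category of the
# output neuron that fires first; neurons that never fire are ignored.

def first_spike_category(first_spike_times, neuron_categories):
    """first_spike_times: dict neuron -> first spike time (None if never fired).
    neuron_categories: dict neuron -> assigned category."""
    fired = {n: t for n, t in first_spike_times.items() if t is not None}
    if not fired:
        return None                      # no neuron fired: no decision
    winner = min(fired, key=fired.get)   # earliest first spike wins
    return neuron_categories[winner]

times = {"n0": 12.5, "n1": 7.0, "n2": None}
cats = {"n0": "face", "n1": "motorbike", "n2": "face"}
label = first_spike_category(times, cats)   # "motorbike"
```

This is what makes the coding scheme temporal: the decision depends only on spike order, so more strongly activated neurons, which fire earlier, dominate the readout.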
4. Growth Rules for Repair of Asynch Irregular Networks after Peripheral Lesions (Sinha et al 2021)
A model of peripheral lesions and the resulting activity-dependent rewiring in a simplified balanced cortical network that exhibits biologically realistic Asynchronous Irregular (AI) activity. The model is used to derive activity-dependent growth rules for different synaptic elements: dendritic and axonal.
5. Hebbian STDP for modelling the emergence of disparity selectivity (Chauhan et al 2018)
This code shows how Hebbian learning mediated by STDP mechanisms could explain the emergence of disparity selectivity in the early visual system. This upload is a snapshot of the code at the time of acceptance of the paper; for a link to a forthcoming git repository, consult the author's website. The datasets used in the paper are not provided due to their size, but download links and the expected directory structures are. Users can (and are strongly encouraged to) experiment with their own datasets. Let me know if you find something interesting! Finally, I am very keen on a redesign/restructure/adaptation of the code for more applied problems in AI and robotics (or any other field where a spiking non-linear approach makes sense). If you have a serious proposal, don't hesitate to contact me [research AT tusharchauhan DOT com].
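The pair-based Hebbian STDP rule that models of this kind typically build on can be sketched as follows: a pre-before-post spike pair potentiates the synapse, a post-before-pre pair depresses it, with exponentially decaying magnitude. The time constants and learning rates below are generic textbook values, not the parameters of the model above.

```python
# Hedged sketch of a pair-based Hebbian STDP weight update. Constants are
# illustrative defaults, not taken from the Chauhan et al. model.
import math

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for one spike pair; delta_t = t_post - t_pre (ms)."""
    if delta_t > 0:                      # pre fired before post -> potentiation
        return a_plus * math.exp(-delta_t / tau_plus)
    elif delta_t < 0:                    # post fired before pre -> depression
        return -a_minus * math.exp(delta_t / tau_minus)
    return 0.0

dw = stdp_dw(10.0)    # positive: causal pre-then-post pairing strengthens the synapse
```

The asymmetry (`a_minus` slightly larger than `a_plus`) is a common stabilizing choice that biases the rule toward depression for uncorrelated spike trains.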
6. Neurogenesis in the olfactory bulb controlled by top-down input (Adams et al 2018)
This code implements a model of adult neurogenesis of granule cells in the olfactory system. The granule cells receive sensory input via the mitral cells and top-down input from a cortical area. That cortical area receives olfactory input from the mitral cells as well as contextual input. The plasticity resulting from neurogenesis leads to a network structure consisting of bidirectional connections between bulbar and cortical odor representations. The top-down input enhances stimulus discrimination based on contextual input.
7. Potjans-Diesmann cortical microcircuit model in NetPyNE (Romaro et al 2021)
The Potjans-Diesmann cortical microcircuit model is a widely used model originally implemented in NEST. Here, we re-implemented the model using NetPyNE, a high-level Python interface to the NEURON simulator, and reproduced the findings of the original publication. We also implemented a method for rescaling the network size which preserves first and second order statistics, building on existing work on network theory. The new implementation enables using more detailed neuron models with multicompartment morphologies and multiple biophysically realistic channels. This opens the model to new research, including the study of dendritic processing, the influence of individual channel parameters, and generally multiscale interactions in the network. The rescaling method provides flexibility to increase or decrease the network size if required when running these more realistic simulations. Finally, NetPyNE facilitates modifying or extending the model using its declarative language; optimizing model parameters; running efficient large-scale parallelized simulations; and analyzing the model through built-in methods, including local field potential calculation and information flow measures.
