Tom Tetzlaff

Inst. of Neuroscience and Medicine (INM-6)
Computational and Systems Neuroscience

Research Center Jülich


 


Theoretical Neuroscience: Correlation structure of neuronal networks

(summer term 2013)

Description:

Advanced course for:
  • MS Physics students (minor-field module "Biophysics")
  • PhD students in Computational Neuroscience
This lecture is part of the module "Biophysics" of the physics master program, but it is open to students at the bachelor level and to students from other domains as well. After an overview and introduction to the field, we will systematically develop the theory of fluctuations and correlations in neuronal networks. This active topic of current research addresses the foundations of many aspects of network function, such as plasticity and learning, and will serve to introduce the prominent neuronal network models and theoretical tools presently in use.

Starting from linear Ornstein-Uhlenbeck processes (Langevin equations), proceeding via the classical binary (spin) network model, and arriving at the spiking leaky integrate-and-fire model, the de facto standard of contemporary computational neuroscience, we will develop the methods required to gain an analytical understanding of network dynamics. This sequence of models, from low to high realism and correspondingly increasing analytical complexity, allows us to first clarify the concepts and subsequently treat the analytically more challenging topics.

Students will become acquainted with elementary methods from linear systems theory, Fourier methods, and point processes, as well as methods from statistical mechanics (Markov processes, the master equation, the Chapman-Kolmogorov equation, noise and diffusion processes, the Fokker-Planck equation, and non-equilibrium steady states) as they are applied in theoretical neuroscience.

The neuroscientific topics include the mean-field theory for binary and spiking networks, the balanced state in non-spiking and spiking networks, the mechanisms of decorrelation and of oscillations in recurrent cortical networks, the classical mean-field theory of pairwise correlations in binary (spin) networks, and the corresponding state-of-the-art theory for spiking networks, including the temporal structure of correlations. We will end with an outlook on probabilistic inference and non-equilibrium properties of neuronal networks.

The weekly lecture (45 minutes) is accompanied by an exercise session (90 minutes) in which the homework is discussed. Homework assignments consist of a mixture of theoretical exercises and (simple) programming and simulation exercises (preferably using python/numpy/scipy/matplotlib) that deepen the topics of the lecture.

Contents:

1. Course introduction
(Markus Diesmann, Moritz Helias, Tom Tetzlaff; lecture slides)
  • What is theoretical neuroscience?
  • Why cortex?
  • Why correlations?
  • Properties of cortical networks
Exercise
(Tom Tetzlaff; exercise sheet)
  • Impulse response and transfer function of the leaky integrate-and-fire (LIF) neuron
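
For orientation, a minimal Python/numpy sketch in the spirit of this exercise (illustrative parameters, not the values from the exercise sheet): in the subthreshold (linear) regime the LIF membrane acts as a first-order low-pass filter with impulse response h(t) = (1/C) exp(-t/tau) for t >= 0 and transfer function H(omega) = (1/C)/(1/tau + i*omega).

  import numpy as np
  import matplotlib.pyplot as plt

  # illustrative parameters (assumptions, not the exercise-sheet values)
  tau = 10e-3    # membrane time constant (s)
  C = 250e-12    # membrane capacitance (F)

  # impulse response of the subthreshold (linear) LIF membrane
  t = np.linspace(0.0, 0.1, 1000)
  h = (1.0 / C) * np.exp(-t / tau)

  # transfer function H(omega) = (1/C) / (1/tau + i*omega): a low-pass filter
  f = np.logspace(0, 4, 200)          # frequency (Hz)
  omega = 2 * np.pi * f
  H = (1.0 / C) / (1.0 / tau + 1j * omega)

  fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
  ax1.plot(t * 1e3, h)
  ax1.set(xlabel='time (ms)', ylabel='h(t)', title='impulse response')
  ax2.loglog(f, np.abs(H))
  ax2.set(xlabel='frequency (Hz)', ylabel='|H(f)|', title='transfer function')
  plt.tight_layout()
  plt.show()
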
2. Pairwise correlations in spike trains
(Tom Tetzlaff; lecture script)
  • pairwise correlations in time and frequency domain
  • spike-train correlations
  • shot-noise correlations
  • correlations in population signals
Exercise
(Tom Tetzlaff; exercise sheet)
  • Wiener-Khinchin theorem (see the sketch below)
  • coherence between two deterministic signals
  • auto-correlation of a stationary Poisson process
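
A minimal numerical check of the Wiener-Khinchin theorem for a stationary Poisson process (a sketch with illustrative parameters): the autocovariance of a Poisson spike train of rate r is r*delta(tau), so the power spectral density of the mean-subtracted spike train should be flat at the value r.

  import numpy as np
  import matplotlib.pyplot as plt

  rng = np.random.default_rng(1)

  # illustrative parameters (assumptions)
  r = 10.0        # firing rate (spikes/s)
  dt = 1e-3       # bin width (s)
  T = 10.0        # trial duration (s)
  ntrials = 200
  nbins = int(T / dt)

  psd = np.zeros(nbins)
  for _ in range(ntrials):
      counts = rng.poisson(r * dt, size=nbins)   # binned Poisson spike train
      x = counts / dt - r                        # spike density, mean removed
      X = np.fft.fft(x)
      psd += dt / nbins * np.abs(X) ** 2         # periodogram estimate
  psd /= ntrials

  f = np.fft.fftfreq(nbins, d=dt)

  # Wiener-Khinchin: autocovariance c(tau) = r*delta(tau) for a Poisson process,
  # hence the power spectrum should be flat at the rate r
  plt.semilogx(f[1:nbins // 2], psd[1:nbins // 2], label='periodogram')
  plt.axhline(r, color='k', ls='--', label='analytical: S(f) = r')
  plt.xlabel('frequency (Hz)')
  plt.ylabel('power spectral density (1/s)')
  plt.legend()
  plt.show()
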
3. Theory of correlations in linear rate models
(Moritz Helias)
  • rate modulated Poisson processes
  • equivalent fluctuating rate dynamics
  • definition of rate models
  • solution of the rate dynamics with output noise
  • population-averaged covariances
  • explicit calculation of the cross covariances
  • back-transform to time domain
Exercise
(Moritz Helias)
  • Ornstein-Uhlenbeck process (see the sketch below)
  • Dichotomous and Gaussian white noise
  • population average of uncorrelated noise
  • explicit form of the cross spectrum for an E-I network
  • explicit form in time domain
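
A minimal Euler-Maruyama sketch of the Ornstein-Uhlenbeck part of this exercise (illustrative parameters): integrating dx = -x/tau dt + sqrt(2*sigma^2/tau) dW and checking the stationary variance sigma^2 and the exponential decay of the autocorrelation.

  import numpy as np

  rng = np.random.default_rng(0)

  # illustrative parameters (assumptions)
  tau = 20e-3      # correlation time (s)
  sigma = 1.0      # target stationary standard deviation
  dt = 0.1e-3      # integration step (s)
  nsteps = 1_000_000

  # Euler-Maruyama integration of  dx = -x/tau dt + sqrt(2 sigma^2 / tau) dW
  x = 0.0
  xs = np.empty(nsteps)
  noise = rng.normal(0.0, 1.0, size=nsteps)
  for i in range(nsteps):
      x += -x / tau * dt + np.sqrt(2 * sigma**2 / tau * dt) * noise[i]
      xs[i] = x

  burn = int(10 * tau / dt)                  # discard initial transient
  print('stationary variance (simulation):', xs[burn:].var())
  print('stationary variance (theory)    :', sigma**2)

  # the autocorrelation at lag tau should have decayed to exp(-1) of the variance
  lag = int(tau / dt)
  c = np.mean(xs[burn:-lag] * xs[burn + lag:])
  print('c(tau)/c(0) (simulation):', c / xs[burn:].var())
  print('c(tau)/c(0) (theory)    :', np.exp(-1.0))
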
4. Mean-field theory of binary networks
(Moritz Helias)
  • master equation for binary neurons
  • time evolution for the first moment
  • mean-field solution
  • stability and response of the mean activity
Exercise: Attractor network
(Moritz Helias)
  • ground state without synaptic fluctuations
  • ground state with synaptic fluctuations
  • numerical check
  • embedding a cell-assembly
  • local stability of the ground state
  • appearance of the second attractor
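
A minimal sketch related to this attractor-network exercise, assuming a simplified, purely excitatory population of binary (0/1) neurons with hard threshold theta, K inputs of weight w per neuron, and a Gaussian approximation of the synaptic fluctuations (all parameter values illustrative): the mean activity m obeys the self-consistency m = (1/2)*erfc((theta - K*w*m - mu_ext)/(sqrt(2)*sigma(m))) with sigma^2(m) = K*w^2*m*(1-m); iterating this map from different initial conditions exposes the coexistence of a low-activity and a high-activity attractor.

  import numpy as np
  from scipy.special import erfc

  # illustrative parameters (assumptions, not the exercise values)
  K = 100          # number of inputs per neuron
  w = 0.1          # synaptic weight
  theta = 6.0      # firing threshold
  mu_ext = 1.0     # constant external drive

  def mean_activity(m):
      """Gain of a binary neuron with threshold theta under a Gaussian
      approximation of its recurrent input (mean K*w*m, variance K*w^2*m*(1-m))."""
      mu = K * w * m + mu_ext
      sigma = np.sqrt(K * w**2 * m * (1.0 - m)) + 1e-12   # avoid division by zero
      return 0.5 * erfc((theta - mu) / (np.sqrt(2.0) * sigma))

  # fixed-point iteration of the mean-field equation m = <F(mu(m), sigma(m))>
  def fixed_point(m0, n_iter=1000):
      m = m0
      for _ in range(n_iter):
          m = mean_activity(m)
      return m

  # different initial conditions can converge to different attractors
  for m0 in (0.05, 0.5, 0.95):
      print(f'initial m = {m0:.2f}  ->  fixed point m = {fixed_point(m0):.4f}')
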
5. Theory of pairwise correlations in binary networks
(Moritz Helias)
  • master equation for the joint probability distribution, Markov property
  • single-neuron susceptibility and linearized equation for correlations
  • example E-I network
  • suppression of fluctuations and correlations
  • temporal structure of correlations
  • equivalence of binary neurons and linear rate model
Exercise: Correlations in binary networks
(Moritz Helias)
  • correlation caused by a single synapse (see the sketch below)
  • fluctuations in an inhibitory network
  • network susceptibility
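
A minimal sketch of the first item of this exercise, the correlation caused by a single synapse (illustrative parameters), assuming stochastic binary neurons with sigmoidal gain F(h) = 1/(1 + exp(-beta*(h - theta))) and asynchronous updates at rate 1/tau: the linearized theory (Ginzburg & Sompolinsky 1994) predicts the equal-time covariance c21 = S2*w*a1/2, with susceptibility S2 = F'(mean input of neuron 2) and source variance a1 = m1*(1 - m1); the simulation should agree approximately for a weak synapse.

  import numpy as np

  rng = np.random.default_rng(2)

  # illustrative parameters (assumptions): two stochastic binary neurons,
  # neuron 1 -> neuron 2 via a single synapse of weight w
  tau = 10e-3      # update time constant (s)
  dt = 0.5e-3      # simulation step (s)
  beta, theta = 2.0, 0.0
  w = 0.3          # weight of the single synapse (weak, for the linear theory)
  h_ext = np.array([0.2, -0.1])   # constant external inputs

  def F(h):
      return 1.0 / (1.0 + np.exp(-beta * (h - theta)))

  nsteps = 1_000_000
  n = np.zeros(2)
  sum_n = np.zeros(2)
  sum_nn = 0.0
  sum_n1sq = 0.0
  for _ in range(nsteps):
      # asynchronous dynamics: each neuron is redrawn with probability dt/tau
      update = rng.random(2) < dt / tau
      h = h_ext + np.array([0.0, w * n[0]])
      n = np.where(update, (rng.random(2) < F(h)).astype(float), n)
      sum_n += n
      sum_nn += n[0] * n[1]
      sum_n1sq += n[0] * n[0]

  m = sum_n / nsteps
  c21 = sum_nn / nsteps - m[0] * m[1]          # equal-time covariance
  a1 = sum_n1sq / nsteps - m[0] ** 2           # variance of the source neuron

  # linearized theory: c21 = S2 * w * a1 / 2, with S2 = F'(mean input of neuron 2)
  h2 = h_ext[1] + w * m[0]
  S2 = beta * F(h2) * (1.0 - F(h2))
  print('covariance (simulation):', c21)
  print('covariance (theory)    :', S2 * w * a1 / 2.0)
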
6. Time-resolved covariance functions for binary neurons
(Moritz Helias)
  • equivalence of binary neurons and linear rate model
Exercise: Correlations in binary networks
(Moritz Helias)
  • projection solution of the covariance matrix (see the sketch below)
  • network susceptibility
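
The exercise obtains the covariance matrix by projecting onto the eigenmodes of the coupling matrix. As a complementary minimal sketch (assuming the equivalent linear rate model tau dx/dt = -x + W x + noise with white-noise intensity matrix D, illustrative random connectivity, and scipy.linalg.solve_continuous_lyapunov), the stationary covariance can also be obtained from the Lyapunov equation A C + C A^T + D = 0 with A = (W - 1)/tau; time-lagged covariances then follow as C(Delta) = expm(A*Delta) C for Delta >= 0.

  import numpy as np
  from scipy.linalg import solve_continuous_lyapunov, expm

  rng = np.random.default_rng(3)

  # illustrative linear rate network (assumption): tau dx/dt = -x + W x + xi(t),
  # with white noise of intensity matrix D
  N = 50
  tau = 10e-3
  W = rng.normal(0.0, 0.5 / np.sqrt(N), size=(N, N))   # weak random coupling (stable)
  A = (W - np.eye(N)) / tau                            # drift matrix
  D = np.eye(N) / tau                                  # noise intensity (illustrative)

  # stationary covariance solves the Lyapunov equation  A C + C A^T + D = 0
  C0 = solve_continuous_lyapunov(A, -D)

  # time-lagged covariance for positive lags: C(delta) = expm(A*delta) @ C0
  delta = 5e-3
  Cd = expm(A * delta) @ C0

  print('mean variance (lag 0)      :', np.diag(C0).mean())
  print('mean covariance (lag 0)    :', (C0.sum() - np.trace(C0)) / (N * (N - 1)))
  print('mean covariance (lag 5 ms) :', (Cd.sum() - np.trace(Cd)) / (N * (N - 1)))
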
7. Spiking neurons
(Moritz Helias)
  • subthreshold dynamics
  • derivation of the Fokker-Planck equation
  • application to the leaky integrate-and-fire model
  • stationary solution
Exercise: Neuron dynamics as diffusion equations
(Moritz Helias)
  • neuron driven by Gaussian white noise   
  • the PIF model
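
A minimal sketch of the PIF part of this exercise (illustrative parameters): for a perfect integrate-and-fire neuron driven by Gaussian white noise, dV = mu dt + sigma dW with threshold theta and reset 0, the mean first-passage time from reset to threshold is theta/mu, so the stationary firing rate mu/theta is independent of the noise amplitude.

  import numpy as np

  rng = np.random.default_rng(4)

  # illustrative parameters (assumptions): PIF neuron driven by white noise
  mu = 2.0       # drift (V/s)
  sigma = 1.0    # noise amplitude (V/sqrt(s))
  theta = 1.0    # threshold (V)
  dt = 0.1e-3    # integration step (s)
  T = 200.0      # simulated time (s)

  nsteps = int(T / dt)
  V = 0.0
  nspikes = 0
  noise = rng.normal(0.0, 1.0, size=nsteps)
  for i in range(nsteps):
      V += mu * dt + sigma * np.sqrt(dt) * noise[i]
      if V >= theta:       # threshold crossing: emit spike and reset
          nspikes += 1
          V = 0.0

  # mean first-passage time from reset to threshold is theta/mu,
  # hence the firing rate is mu/theta, independent of sigma
  print('firing rate (simulation):', nspikes / T)
  print('firing rate (theory)    :', mu / theta)
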
8. Linear-response theory for spiking neurons
(Tom Tetzlaff; lecture script)
  • linear-response theory
  • effective coupling strength for the LIF model
Exercise: Linear-response theory
(Tom Tetzlaff; exercise sheet)
  • PIF model
  • integral impulse response
  • integral linear response of the LIF model
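
A minimal sketch of the integral (zero-frequency) linear response of the LIF model, based on the standard expression for the stationary firing rate under Gaussian white-noise input (cf. Brunel 2000): the DC susceptibility is the derivative of the rate with respect to the mean input, estimated here by a central finite difference. All parameter values are illustrative.

  import numpy as np
  from scipy.integrate import quad
  from scipy.special import erfcx

  # illustrative LIF parameters (assumptions, not the exercise values)
  tau_m = 20e-3     # membrane time constant (s)
  tau_ref = 2e-3    # absolute refractory period (s)
  V_r = 10.0        # reset potential (mV)
  theta = 20.0      # threshold (mV)
  sigma = 5.0       # std of the free membrane potential (mV)

  def rate(mu):
      """Stationary firing rate of the LIF neuron for Gaussian white-noise input
      of mean mu and std sigma (Siegert formula, cf. Brunel 2000)."""
      lower = (V_r - mu) / sigma
      upper = (theta - mu) / sigma
      integral, _ = quad(lambda u: erfcx(-u), lower, upper)   # e^{u^2}(1+erf(u))
      return 1.0 / (tau_ref + tau_m * np.sqrt(np.pi) * integral)

  mu0 = 15.0   # working point (mV)
  eps = 0.01   # finite-difference step (mV)

  # integral (DC) linear response: d(rate)/d(mu) at the working point
  susceptibility = (rate(mu0 + eps) - rate(mu0 - eps)) / (2 * eps)
  print('stationary rate at mu0  :', rate(mu0), 'spikes/s')
  print('integral linear response:', susceptibility, 'spikes/s per mV')
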
9. Decorrelation of neural-network activity by inhibitory feedback
(Tom Tetzlaff; lecture script)
  • correlation suppression in leaky integrate-and-fire (LIF) networks
  • linearized network dynamics
  • population-averaged dynamics
  • inhibitory networks
  • excitatory-inhibitory networks (Schur decomposition)
  • population-averaged correlations
Exercise
(Tom Tetzlaff; exercise sheet)
  • variability of a linear decoder
  • population-averaged linear dynamics for an inhibitory random network
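
A minimal sketch of the second item of this exercise (illustrative parameters): in the linear rate approximation tau dx/dt = -x + W x + xi(t) of an inhibitory random network with in-degree K and weight w < 0, the uniform linear decoder a(t) = (1/N) sum_i x_i(t) experiences an effective negative feedback g = K*w; its variance is suppressed roughly by the factor (1 - K*w) relative to the uncoupled case (and the zero-frequency fluctuations by (1 - K*w)^2).

  import numpy as np

  rng = np.random.default_rng(5)

  # illustrative sketch (assumptions): linear rate approximation of an inhibitory
  # random network, tau dx/dt = -x + W x + xi(t), independent white noise per neuron
  N = 200
  K = 20            # in-degree
  w = -0.3          # inhibitory weight
  tau = 10e-3
  dt = 0.1e-3
  nsteps = 200_000

  # random inhibitory connectivity with fixed in-degree K
  W = np.zeros((N, N))
  for i in range(N):
      W[i, rng.choice(N, size=K, replace=False)] = w

  def simulate(W):
      x = np.zeros(N)
      a = np.empty(nsteps)
      for t in range(nsteps):
          xi = rng.normal(0.0, 1.0, size=N)
          x += dt / tau * (-x + W @ x) + np.sqrt(dt / tau) * xi
          a[t] = x.mean()
      return a

  a_coupled = simulate(W)
  a_uncoupled = simulate(np.zeros((N, N)))

  # the effective negative feedback g = K*w suppresses the variance of the
  # population average roughly by (1 - K*w) relative to the uncoupled case
  burn = nsteps // 10
  print('Var(a), uncoupled  :', a_uncoupled[burn:].var())
  print('Var(a), inhibitory :', a_coupled[burn:].var())
  print('suppression (simulation)       :',
        a_uncoupled[burn:].var() / a_coupled[burn:].var())
  print('suppression (theory, ~1 - K*w) :', 1 - K * w)
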
10. Mean-field theory and oscillations in LIF networks
(Moritz Helias)
  • mean-field description of spiking networks (see the sketch below)
  • the LIF model: the harmonic oscillator of neuroscience
  • perturbative treatment of the time-dependent Fokker-Planck equation
  • homogeneous solution
  • boundary condition for the function values
  • flux boundary conditions (derivative)
  • transfer function
  • notes on Hermiticity
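
A minimal sketch of the mean-field description of a sparse recurrent LIF network in the spirit of Brunel (2000), with illustrative parameters: the stationary network rate nu solves the self-consistency nu = Phi(mu(nu), sigma(nu)), where Phi is the stationary-rate (Siegert) formula already used in the sketch of item 8, mu = tau_m*J*(C_E - g*C_I)*nu + tau_m*J*C_E*nu_ext, and sigma^2 = tau_m*J^2*(C_E + g^2*C_I)*nu + tau_m*J^2*C_E*nu_ext.

  import numpy as np
  from scipy.integrate import quad
  from scipy.optimize import brentq
  from scipy.special import erfcx

  # illustrative parameters (assumptions) for a sparse E-I LIF network,
  # in the spirit of Brunel (2000)
  tau_m, tau_ref = 20e-3, 2e-3     # membrane time constant, refractory period (s)
  V_r, theta = 0.0, 20.0           # reset and threshold (mV)
  J, g = 0.1, 5.0                  # EPSP amplitude (mV), relative inhibition
  C_E, C_I = 800, 200              # numbers of excitatory / inhibitory inputs
  nu_ext = 15.0                    # rate of each of the C_E external Poisson inputs (spikes/s)

  def siegert(mu, sigma):
      """Stationary LIF rate for Gaussian white-noise input (mean mu, std sigma)."""
      integrand = lambda u: erfcx(-u)          # = exp(u^2) * (1 + erf(u))
      integral, _ = quad(integrand, (V_r - mu) / sigma, (theta - mu) / sigma)
      return 1.0 / (tau_ref + tau_m * np.sqrt(np.pi) * integral)

  def self_consistency(nu):
      """nu - Phi(mu(nu), sigma(nu)); its root is the mean-field network rate."""
      mu = tau_m * J * (C_E - g * C_I) * nu + tau_m * J * C_E * nu_ext
      sigma2 = tau_m * J**2 * (C_E + g**2 * C_I) * nu + tau_m * J**2 * C_E * nu_ext
      return nu - siegert(mu, np.sqrt(sigma2))

  # solve the self-consistency equation on a bracketing interval (spikes/s)
  nu_star = brentq(self_consistency, 1e-3, 200.0)
  print('self-consistent network rate:', nu_star, 'spikes/s')
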

Literature:

  • Risken, The Fokker-Planck equation (excerpts), Springer
  • Ginzburg & Sompolinsky (1994), Theory of correlations in stochastic neural networks, Phys Rev E 50(4):3171–3191
  • van Vreeswijk & Sompolinsky (1998), Chaotic balanced state in a model of cortical circuits, Neural Comput 10:1321–1371
  • Brunel (2000), Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons, J Comput Neurosci 8(3):183–208
  • Renart et al. (2010), The asynchronous state in cortical circuits, Science 327:587–590
  • Tetzlaff et al. (2012), Decorrelation of neural-network activity by inhibitory feedback, PLoS Comput Biol 8(8):e1002596
  • Helias et al. (2013), Echoes in correlated neural systems, New J Phys 15:023002
  • see also references in lecture material

Additional Information:

  • SWS: 3
  • ECTS credits: 5 (awarded after passing the written exam and participating in the exercises [protocols, oral presentations])
  • language: English
  • prerequisites: a background in mathematics equivalent to the bachelor level in physics is recommended
  • location: RWTH Aachen University, Department of Physics, room 26C 401
  • time: summer term 2013, Thursdays, 4:00–6:30 pm (starting April 11)
  • exam: written exam at the end of the course