Experiments with behaving animals provide ample evidence that neuronal response properties are modulated on several time scales, ranging from tens of milliseconds to minutes. Combining computational models of neuronal networks with concepts and techniques from dynamical systems theory, statistical physics, and machine learning, we study the mechanisms that underlie changes in neural excitability, adaptation, and synaptic plasticity, and their roles in computation, in maintaining sensory representations, and in stabilizing physiological network states.
Recently we have investigated the spiking activity and synchronization properties of adaptive model neurons by applying and extending mean-field methods, a phase-reduction technique, and the master stability function framework. We have characterized how different types of adaptation currents -- which in the cortex are under top-down control of the brain's neuromodulatory systems -- shape spike train statistics, and how they can stabilize spike synchrony, phase locking, and cluster states in networks. Furthermore, we have analyzed the impact of adaptation currents on spike rate oscillations in large networks, where, for example, noisy external inputs permit only "sparse synchrony".
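As a minimal illustration of the basic mechanism (not one of our published models), a leaky integrate-and-fire neuron can be extended with a spike-triggered adaptation current: each spike increments a slow negative feedback variable that subtracts from the input drive, so inter-spike intervals lengthen over time (spike-frequency adaptation). All parameters and units below are illustrative, dimensionless assumptions:

```python
import numpy as np

def simulate_aif(i_ext, adapt_increment, t_max=1.0, dt=1e-4,
                 tau_m=0.02, tau_w=0.2, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron with a spike-triggered adaptation
    current w (forward Euler; illustrative, dimensionless parameters)."""
    n_steps = int(t_max / dt)
    v, w = 0.0, 0.0
    spike_times = []
    for k in range(n_steps):
        # membrane potential: leak plus external drive minus adaptation current
        v += dt * (-v + i_ext - w) / tau_m
        # adaptation current decays slowly between spikes
        w += dt * (-w / tau_w)
        if v >= v_thresh:
            v = v_reset
            w += adapt_increment  # spike-triggered increment of the adaptation current
            spike_times.append(k * dt)
    return np.array(spike_times)

# Without adaptation the inter-spike intervals are constant; with a
# spike-triggered adaptation current they lengthen toward a slower steady rate.
isi_no_adapt = np.diff(simulate_aif(i_ext=2.0, adapt_increment=0.0))
isi_adapt = np.diff(simulate_aif(i_ext=2.0, adapt_increment=0.5))
```

Comparing the two inter-spike-interval sequences shows the adapting neuron starting near the non-adapted rate and slowing down as the adaptation current accumulates.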
Another important topic in this context is the role of noise in sensory processing, and whether noise should be considered beneficial or detrimental. To address this question we have developed new ways to quantify noise correlations. These methods are used to analyze dependency structures within populations of neurons, structures that have long been ignored.
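To make the basic quantity concrete, a standard starting point (a sketch on synthetic data, not our specific methods) is the pairwise noise correlation: the Pearson correlation of trial-to-trial spike-count fluctuations around each neuron's mean response to a single, repeated stimulus. Here a shared fluctuation term stands in for common input:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic spike counts for one stimulus condition: n_trials x n_neurons.
# A shared fluctuation ("common input") induces trial-to-trial correlations.
n_trials, n_neurons = 500, 4
common = rng.normal(0.0, 1.0, size=(n_trials, 1))           # shared across neurons
private = rng.normal(0.0, 1.0, size=(n_trials, n_neurons))  # independent per neuron
counts = 10.0 + 2.0 * common + private                      # baseline rate of 10

# Noise correlations: correlate the residual count fluctuations around the
# per-neuron mean (one stimulus condition, so the mean is the signal).
residuals = counts - counts.mean(axis=0)
noise_corr = np.corrcoef(residuals, rowvar=False)
```

With the variances chosen above, each pair shares a covariance of 4 out of a total variance of 5, so the off-diagonal entries of `noise_corr` cluster around 0.8; a full pairwise matrix like this only captures second-order dependencies, which is precisely why higher-order dependency structures deserve separate treatment.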
Acknowledgements: This research was funded by the BMBF, the DFG, and the Technische Universität Berlin.