ZiF Research Group
Cognitive Behavior of Humans, Animals, and Machines:
Situation Model Perspectives
October 2019 – July 2020
Convenors: Werner Schneider (Bielefeld, GER), Helge Ritter (Bielefeld, GER)
Christoph Kayser studied Mathematics and Theoretical Physics at ETH Zurich, Switzerland. After working on artificial neural networks he became fascinated by the brain and obtained a PhD in Neurobiology from ETH in 2004. After serving as a Junior Group Leader at the Max Planck Institute for Biological Cybernetics, Tuebingen, he became Chair for Integrative Neuroscience at Glasgow University, UK, in 2012. Since 2017 he has been Chair for Cognitive Neuroscience in Bielefeld. He serves as reviewing editor for the Journal of Neuroscience and was awarded a fellowship of the Royal Society of Biology (UK).
Current Main Research Interests
Christoph Kayser's work aims to understand how the brain combines sensory information into a coherent and unified percept: that is, the computational and physiological principles underlying sensory integration and conscious perception. Current work particularly focuses on the combination of hearing and seeing and uses time-resolved neuroimaging and behavioral studies to link specific aspects of the perception-action cascade with localized neural computations in the brain.
Five selected publications with particular relevance to the Research Group
- Cao Y., Summerfield C., Park H., Giordano B., & Kayser C. (2019). Causal inference in the multisensory brain. Neuron, 102, 500413.
- Park H. & Kayser C. (2019). Shared neural underpinnings of multisensory integration and trial-by-trial perceptual recalibration. bioRxiv, 566927. https://www.biorxiv.org/content/10.1101/566927v1
- Keitel A., Gross J., & Kayser C. (2018). Perceptually relevant speech tracking in auditory and motor cortex reflects distinct linguistic features. PLoS Biology, 16(3), e2004473.
- Giordano B., Ince R., Gross J., Schyns P., Panzeri S., & Kayser C. (2017). Contributions of local speech encoding and functional connectivity to audio-visual speech perception. eLife, 6, e24763. doi:10.7554/eLife.24763
- Park H., Kayser C., Thut G., & Gross J. (2016). Lip movements entrain the observers' low-frequency brain oscillations to facilitate speech intelligibility. eLife, 5, e14521.