Computational principles of multisensory perception and decision making

By treating perception and decision making as statistical inference processes, we can describe individual steps in the sensation-to-action cycle using computational models. This allows us to test specific hypotheses as to whether, when, and how sensory information is integrated to yield a more robust and reliable percept. It also allows us to understand how contextual factors such as prior expectations, prior sensory experience, and perceptual learning affect sensory integration.
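
As a concrete illustration, the sketch below simulates the standard maximum-likelihood model of cue integration, in which each cue is weighted by its reliability (the inverse of its noise variance). The scenario (a hypothetical audiovisual localization task) and all values (the noise levels sigma_a and sigma_v) are illustrative assumptions, not results from our experiments.

```python
# A minimal sketch of reliability-weighted (maximum-likelihood) cue
# integration, a standard model of multisensory perception. All numbers
# are illustrative assumptions, not fitted to data.
import numpy as np

rng = np.random.default_rng(0)

true_location = 10.0          # stimulus attribute (e.g., azimuth in deg)
sigma_a, sigma_v = 4.0, 2.0   # assumed noise of auditory and visual cues

n_trials = 10_000
audio = true_location + rng.normal(0.0, sigma_a, n_trials)
visual = true_location + rng.normal(0.0, sigma_v, n_trials)

# Optimal weights are proportional to each cue's reliability (1 / variance).
w_a = (1 / sigma_a**2) / (1 / sigma_a**2 + 1 / sigma_v**2)
w_v = 1.0 - w_a
fused = w_a * audio + w_v * visual

# The fused estimate is more reliable than either cue alone; the model
# predicts sigma_av = sqrt(1 / (1/sigma_a^2 + 1/sigma_v^2)) ~= 1.79 here.
print(f"audio SD:  {audio.std():.2f}")
print(f"visual SD: {visual.std():.2f}")
print(f"fused SD:  {fused.std():.2f}")
```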

Combination of acoustic and visual speech

Speech offers a prime example of multisensory perception. Previous work has suggested that rhythmic brain activity plays a pivotal role in speech perception, and our work focuses on how visual speech signals shape the encoding of acoustic speech in rhythmic brain activity.
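
One common way to quantify such encoding is to relate the slow amplitude envelope of speech to band-limited EEG activity. The sketch below illustrates this logic on fully synthetic signals; the sampling rate, the delta/theta filter band, and the simple correlation measure are illustrative assumptions, not our specific analysis pipeline.

```python
# A minimal sketch of speech "tracking" by rhythmic brain activity:
# correlate the speech amplitude envelope with band-limited EEG.
# Both signals here are synthetic stand-ins.
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

fs = 250                       # sampling rate (Hz), an assumption
t = np.arange(0, 60, 1 / fs)   # 60 s of data

rng = np.random.default_rng(1)
# Synthetic speech envelope: slow fluctuations, roughly as in real speech.
envelope = np.abs(np.convolve(rng.normal(size=t.size),
                              np.hanning(fs // 4), "same"))

# Synthetic EEG: an envelope-driven theta component buried in noise.
eeg = 0.3 * envelope * np.sin(2 * np.pi * 5 * t) + rng.normal(size=t.size)

# Band-pass the EEG to the delta/theta range, take its amplitude envelope.
b, a = butter(3, [2, 8], btype="bandpass", fs=fs)
eeg_theta_env = np.abs(hilbert(filtfilt(b, a, eeg)))

r = np.corrcoef(envelope, eeg_theta_env)[0, 1]
print(f"speech-envelope / EEG-theta correlation: r = {r:.2f}")
```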

Single trial analysis of electroencephalographic brain activity

EEG (electroencephalography) offers a window onto the electrical neural activity of the brain. While many studies analyze trial-averaged EEG activity, so-called evoked potentials, it has become clear that single-trial EEG signals offer a much more powerful and insightful window onto the neural processes underlying cognition. We continuously refine methods for the statistical analysis of single-trial EEG signals and for relating them to sensory perception and cognitive processes.
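
To make the contrast concrete, the sketch below simulates a simple two-condition experiment: averaging across trials yields the evoked response, while a cross-validated classifier exploits the trial-to-trial variability that averaging discards. The simulated data, channel count, and the choice of logistic regression are illustrative assumptions.

```python
# A minimal sketch contrasting the trial average (evoked potential) with
# single-trial analysis: decoding the stimulus condition from single-trial
# EEG amplitudes. Data are simulated; no real recordings are assumed.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

n_trials, n_channels = 200, 32
labels = rng.integers(0, 2, n_trials)        # two stimulus conditions

# Condition-dependent signal on a few channels, plus trial-by-trial noise.
signal = np.zeros(n_channels)
signal[:4] = 0.5
X = rng.normal(size=(n_trials, n_channels)) + np.outer(labels, signal)

# Evoked response: averaging collapses the trial dimension entirely.
evoked = X.mean(axis=0)

# Single-trial analysis: a classifier uses the per-trial information.
clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, X, labels, cv=5).mean()
print(f"single-trial decoding accuracy: {acc:.2f} (chance = 0.50)")
```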

Role of rhythmic brain activity for perception and decision making

An important mission of our work is to understand the specific role of rhythmic brain activity in perception and decision making. Rhythmic brain activity provides an index of the momentary excitability of neural populations, and our previous work has shown that the state of rhythmic activity prior to the onset of a sensory stimulus can shape whether and how that stimulus is perceived. Ongoing work seeks to understand which specific computations in the perception-to-action cycle relate to which specific patterns of rhythmic brain activity.
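
One analysis commonly used to probe such relationships is sketched below: estimate pre-stimulus alpha-band (8-12 Hz) power on each trial and compare it between detected and missed stimuli. The data, filter settings, and the direction of the simulated effect are assumptions for illustration only.

```python
# A minimal sketch relating pre-stimulus alpha power to perceptual
# outcome on single trials. EEG and behaviour are simulated.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 250
pre_samples = fs                                # 1 s pre-stimulus window
rng = np.random.default_rng(3)

n_trials = 300
eeg = rng.normal(size=(n_trials, pre_samples))  # stand-in pre-stimulus EEG

# Band-limit to the alpha range and take log power per trial.
b, a = butter(3, [8, 12], btype="bandpass", fs=fs)
alpha_env = np.abs(hilbert(filtfilt(b, a, eeg, axis=1), axis=1))
alpha_power = np.log(np.mean(alpha_env**2, axis=1))

# Simulated behaviour: higher pre-stimulus alpha -> lower detection rate,
# the direction typically reported for near-threshold stimuli.
p_detect = 1 / (1 + np.exp(2.0 * (alpha_power - alpha_power.mean())))
detected = rng.random(n_trials) < p_detect

print(f"mean log alpha power (detected): {alpha_power[detected].mean():.2f}")
print(f"mean log alpha power (missed):   {alpha_power[~detected].mean():.2f}")
```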

Multisensory integration of spatial cues

Our navigation system – the inner GPS that helps us find our way – integrates cues from multiple sources. We use objects in the environment as landmarks, as well as internal signals from the motor system and from proprioception, which provide information about body position and orientation in space.

We apply psychophysical methods and virtual reality techniques to answer the following questions: How do humans employ and combine different internal and external cues for homing? Is the reliability of individual cue sources taken into account and weighted during multisensory processing? What is the origin of navigational errors? How do we deal with incomplete or erroneous navigational cues?
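
The weighting question can be made concrete with a cue-conflict simulation: if a landmark cue is rotated relative to the path-integration (self-motion) cue, reliability-weighted integration predicts that homing responses shift toward the more reliable cue by a predictable amount. The sketch below illustrates this logic; the noise levels and the size of the conflict are assumptions.

```python
# A minimal sketch of the experimental logic behind cue-weighting studies
# of homing: put a landmark cue in conflict with path integration and read
# the cue weights off the shift of the homing response. Values are assumed.
import numpy as np

rng = np.random.default_rng(4)

sigma_lm, sigma_pi = 5.0, 15.0   # assumed noise: landmarks vs path integration
conflict = 10.0                  # landmark rotated by +10 deg

n_trials = 5_000
landmark = conflict + rng.normal(0.0, sigma_lm, n_trials)  # says +10 deg
path_int = 0.0 + rng.normal(0.0, sigma_pi, n_trials)       # says 0 deg

# If cues are weighted by reliability, responses land at w * conflict.
w = (1 / sigma_lm**2) / (1 / sigma_lm**2 + 1 / sigma_pi**2)
response = w * landmark + (1 - w) * path_int

print(f"predicted landmark weight: {w:.2f}")
print(f"mean homing response: {response.mean():.2f} deg "
      f"(reliability weighting predicts {w * conflict:.2f} deg)")
```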