Universität Bielefeld

Research projects

Assessment of parallel movement plans by probing spatial attention

Funding: German Research Foundation (DFG)
Principal investigators: Christian Seegelke, Tobias Heed
Project members: N.N.

Our environment constantly presents us with multiple opportunities and demands for action. At any given moment, we must select one of many possible actions and specify the corresponding movement parameters. Evidence suggests that the brain prepares multiple actions in parallel, and that action selection results from a continuous competition embedded in bottom-up sensorimotor processing, biased by top-down, decision-relevant information.

However, there is ongoing debate about which aspects of actions are actually represented in parallel. Reaching trajectories of human participants usually reflect an average of the direct trajectories to several currently relevant targets; this aspect of motor behavior has been interpreted as indicating that the executed movement results from averaging of the individual movement plans. This interpretation thus assumes that all aspects of the different actions are represented in parallel. However, averaging behavior sometimes appears to depend on strategic considerations. For instance, trajectory averaging is abandoned when targets are far apart, or when movements have to be executed very fast; this strategic change in behavior improves participants’ overall performance. The use of such strategies has been taken to indicate that movement trajectories are not averaged by default. Instead, only a single detailed movement plan would be computed, and this single plan could be derived as an average when this is strategically advantageous. Thus, this theoretical stance assumes parallel representation of final movement goals, but not of movement plans that specify aspects such as the movement’s trajectory.
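The plan-averaging account can be illustrated with a minimal numerical sketch. This toy model is our own illustration, not part of the project: it treats each candidate reach as a straight line to its target and predicts the executed movement as a weighted average of those candidate trajectories.

```python
import numpy as np

def straight_trajectory(start, target, n_steps=50):
    """Linear interpolation from start to target -- a toy stand-in
    for a direct reach trajectory."""
    start, target = np.asarray(start, float), np.asarray(target, float)
    t = np.linspace(0.0, 1.0, n_steps)[:, None]
    return start + t * (target - start)

def averaged_trajectory(start, targets, weights=None, n_steps=50):
    """Weighted average of the direct trajectories to each candidate
    target, as the plan-averaging account would predict before the
    final target is known."""
    trajs = [straight_trajectory(start, tg, n_steps) for tg in targets]
    return np.average(np.stack(trajs), axis=0, weights=weights)

# Two equally likely targets, left and right of straight ahead:
start = (0.0, 0.0)
targets = [(-10.0, 30.0), (10.0, 30.0)]
avg = averaged_trajectory(start, targets)
# With equal weights, the averaged reach initially heads straight
# between the two targets (its x coordinate stays at 0).
```

In this sketch, the goal-only account would instead predict a direct trajectory to a single (possibly averaged) end point; biasing the `weights` parameter mimics top-down information favoring one target.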

The central challenge in resolving this debate is to find convincing measures that indicate whether several movement plans (as opposed to mere goals) are currently active. We propose a series of experiments that address this challenge. To this end, we employ the well-established relationship between movement goals and attentional deployment: attention shifts towards one or several sequential motor goals already prior to movement initiation, expressed as enhanced perceptual discrimination performance at target locations compared to irrelevant ones. We extend this experimental approach to probe the parallel representation of multiple motor goals for hand reaches. Critically, we probe attentional deployment to trajectory-defining locations, such as regions near obstacles, along potentially relevant movement trajectories. As a second approach, we induce multiple potentially relevant trajectories through a motor adaptation paradigm. Finally, we address top-down aspects of motor plan selection to elucidate how they affect the averaging of bottom-up sensory information. Together, the proposed experiments will provide substantive and cogent evidence about which levels of movement planning involve parallel representation.


Dynamic coding of tactile-to-motor transformation in human and macaque posterior parietal cortex    

Funding: German Research Foundation (DFG) / French National Research Agency (ANR)
Principal investigators: Tobias Heed, Suliann Ben Hamed
Project members: N.N.

Posterior parietal cortex (PPC) is a central structure for sensorimotor transformation. Yet, its contribution to planning movements towards one’s own body is still unclear. This project investigates the implementation of goal-directed tactile-motor processing in the PPC of humans and non-human primates (macaques). It aims at (i) identifying the involved parietal regions, (ii) elucidating the spatial codes these regions use, and (iii) characterizing the dynamics within and between the regions that transform tactile information from skin to space, depending on the effector executing the motor response. The three key approaches are (a) to devise homologous, directly linkable experiments across the two species and across different methods (fMRI, behavior, neurophysiology); (b) to investigate tactile behavior across two effector systems (saccades, hand reaches) to identify common and specialized processing mechanisms; and (c) to complement these shared experiments with human-specific research where directly comparable paradigms are not feasible.

The project’s overarching hypotheses are that (1) common principles underlie human and macaque tactually guided motor planning across all effector systems; (2) posterior regions currently associated with eye-centered motor planning, such as macaque LIP/MIP, more generally code all sensory information in an eye-centered code; (3) anterior regions associated with self-motion and body representation, such as macaque VIP and SPL, more generally code all sensory information in a skin- or body-centered code; and (4) all regions dynamically recode spatial information from a sensory code to a motor goal-related code.

The project aims to extend current concepts of visuomotor control into the tactile domain, as a first step towards incorporating information about the body and the self into sensorimotor control, and to offer new perspectives on the organizing principles underlying the functional and regional organization of PPC.


Sensorimotor processing and reference frame transformations in the human brain

Funding: Emmy Noether Programme of the German Research Foundation
Principal investigator: Tobias Heed
Project members: Janina Brandes
Previous project members: Phyllis Mania

The main aim of this project is to connect two areas of research that have usually been investigated separately: perceiving touch and making reaching movements. When we perceive a touch, we initially know where it occurred on the skin. But because our body parts move around, we must take our body posture into account if we want to act on the touch.

Imagine you are sitting in a park at a picnic. You feel something crawling on your left hand. To look at what is crawling there, and to swipe it off, it makes a big difference whether your left hand is behind your back because you are leaning on it, or whether you are holding a plate with it in front of you. Your movements towards the touch must be very different in these two situations. At the same time, when we perform actions, like swiping away the insect on our hand, we feel our movement and monitor whether we are achieving our goal.

Thus, body posture is a central aspect of both touch perception and reaching. In this project, we investigate how body posture affects touch and reaching, each on its own. Ultimately, we are interested in how the two are coordinated, that is, how the brain plans and executes movements towards one’s own body after it has registered a touch.


Intelligent Coaching Space (ICSPace)

Funding: CITEC (Cluster of Excellence "Cognitive Interaction Technology"), project ICSPace ("Intelligent Coaching Space")
Principal investigators: Mario Botsch, Thomas Schack, Stefan Kopp, Tobias Heed
Current project members: Marie Martel

The aim of this project is to bring together expertise from computer science, sports science, linguistics, and psychology to develop intelligent coaching approaches for sports, motor skill learning, and rehabilitation. The project is based on a virtual training environment. Our group is currently exploring possibilities to extend this research for use with children and youth.