  • Neurocognition and Action - Biomechanics

    © Universität Bielefeld

Research Projects


EU Projects

  • A person offers a humanoid robot a red stacking cup to grasp
    © Universität Bielefeld

    AMARSi

    Adaptive and Modular Architecture of Rich Motor Skills

  • photo of a neuromorphic iCub robot
    © Universität Bielefeld

    PRIMI

    Performance in Robot Interaction via Mental Imagery

    PRIMI’s ambition is to induce a paradigm shift in AI and robotics to create truly autonomous socially interactive robots, which will offer new technological perspectives for transforming personal robotic services.

CITEC Projects

  • An avatar gives instructions to a person to perform knee bends
    © CITEC - Universität Bielefeld

    ICSpace

    Intelligent Coaching Space

    In this project, we explore how to most effectively support humans during the performance and the learning of motor actions.

  • A hand moves a weight from one contact plate to another according to instructions on a screen
    © NCA-Group - Universität Bielefeld

    FAMULA

    Deep Familiarization and Learning Grounded in Cooperative Manual Action and Language

    The large-scale project FAMULA focuses on a robot's autonomous familiarization with novel objects and their affordances.

  • A person wearing an EEG cap in front of a screen and a keyboard
    © NCA-Group - Universität Bielefeld

    Brain-machine interfaces

    Brain-machine interfaces to improve human-machine interactions through resource efficiency and adaptivity

    How can multimodal brain-machine interfaces be applied to qualitatively improve human-machine interactions?

  • A ball labyrinth and its virtualization
    © CITEC - Universität Bielefeld

    Single and Dyadic Visuo-Haptic Learning

    This project introduces a novel physical, visuo-haptic, bimanual maze task to investigate how humans acquire a new manual skill.

  • A person wearing an EEG cap sitting in front of a table with a screen and touch buttons
    © NCA-Group - Universität Bielefeld

    Planning and executing manual actions

    Cognitive control and neurophysiological bases of planning and executing manual actions

    This project's main purpose is to provide a better understanding of the (neuro-)functional relationship between cognitive functions and action.

DFG Projects

  • A person with data glasses in front of a coffee machine
    © NCA-Group - Universität Bielefeld

    DEMAPP

    Design and Evaluation of a Modular Adaptive Cognitive Mobile Assistive System for Flexible Use in Professional and Private Environments

    DFG-Grant to Support the Initiation of International Collaboration.

  • Two people, one with data glasses, play chess
    © NCA-Group - Universität Bielefeld

    CEEGE

    Chess Expertise from Eye Gaze and Emotions

    The aim of this project is to experimentally evaluate and compare current theories of mental modeling for problem solving and attention.

  • An avatar in front of a virtual shelf with drawers below a screen
    © Christoph Schütz - Universität Bielefeld

    CogMech

    Cognitive mechanisms of motor planning - how cognitive and mechanical costs of a motor task affect the fractions of motor plan reuse and novel planning

BMBF/ESF Projects

  • A person mounts a workpiece according to instructions via data glasses
    © NCA-Group - Universität Bielefeld

    AVIKOM

    A cognitive and mobile assistance system to provide individualized audiovisual action support in the modern working world

    Intelligent glasses initially developed in the ADAMAAS project will be enhanced by an acoustic system and adapted for industrial application to reduce the burden on employees and increase productivity.

  • A person with data glasses mixes a dough according to instructions on the AR display
    © NCA-Group - Universität Bielefeld

    ADAMAAS

    Adaptive and Mobile Action Assistance in Daily Living Activities - adaptive technical systems - for an intuitive interaction between humans and complex technologies

    This project focuses on the development and testing of intelligent glasses. It combines techniques from memory research ...

  • A seated person imitates the movements of an avatar on a screen
    © Universität Bielefeld

    KogniHome

    Networked Living - The Smart Apartment (KogniHome)

    The research goal of KogniHome is to develop and implement technology that improves the quality of everyday life for inhabitants of a smart home.

  • A standing person with dumbbells imitates the movements of an avatar on a screen
    © Universität Bielefeld

    Personal Coach

    KogniHome Subproject III

    The supportive personal coach project strives to develop an individualized, adaptive health training utilizing modern sensor technology and psychological methods.

EFRE Projects

  • Two virtualized hands above a virtual dining table with arranged plates
    © NCA-Group - Universität Bielefeld

    Vecury

    Digital Motor Rehabilitation

    The Vecury project investigates how virtual reality in combination with 3D motion tracking can support patients and therapists during physical rehabilitation.

Start-up Funds Medicine

  • Adaptive Virtual Rehabilitation

    Adaptive virtual rehabilitation for injuries of the upper extremity

    This project examines the experiences of patients and therapists during the introduction of an individualized and adaptive virtual-reality rehabilitation protocol for the upper extremities.

von Bodelschwingh Foundation (Bethel) Projects

BISP Projects

  • Photo of the female youth volleyball national team
    © NCA-Group - Universität Bielefeld

    MentPower

    Psychological support and consultancy for Germany's female youth volleyball national team on its way to the European and World Championships.

    The main objective of this project was the psychological support and supervision of the German female volleyball national team (under 18) on their way to the European and World Championships.

Volkswagen Foundation

  • Two dancers
    © NCA-Group - Universität Bielefeld

    Motion Together

    This Art and Science project focuses on entrainment in dance, investigating complex processes of interaction, coordination and synchronization from a multidisciplinary perspective.

DAAD Projects

Forschungsfonds Medizin

  • A table tennis player inside a motion capture environment
    © NCA-Group - Universität Bielefeld

    Forschungsfonds Medizin

    This project aims to understand the influence of physical activity on the progress and outcome of standard depression therapy and to understand the underlying changes in (brain) neuroplasticity.

Graduate School "Cognitive Interaction Technology" Projects

Bielefelder Nachwuchsfonds

  • The effect of sensorimotor rhythm neurofeedback on sport performance
  • In my mind's eye: Exploring the similarities in motor planning between executing, imagining and observing an action
  • Wie man Aepfel mit Birnen vergleicht (How to compare apples and oranges)

Further Projects

Previous Projects

This project investigates the impact of movement expertise on visual perception of the environment when we observe objects, interactions or events. Our focus is on the specific interplay between eye movement parameters and the quality of mental representations during the selective perception of action-relevant information. By measuring visual attention patterns of experts and novices, we analyze how different levels of movement expertise affect the relative influence of top-down and bottom-up mechanisms during the perception of complex actions. We hope that the resulting insights can contribute to better-informed designs of human-machine interfaces, humanoid robots and intelligent systems.

Instructions about sequences of actions are better memorized when they are offered with appropriate gestures. In this project, the virtual human MAX will accompany instructions with self-generated gestures, which leads to memory representations in the human listener. The quality of these representations will then be assessed by MAX (Split-Method), which provides a measure of the listener's comprehension and which can in turn be used by MAX as a basis for adjusting its future use of particular instructions and gestures in a closed-loop scenario.
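
As a toy illustration of this closed loop, the sketch below assumes comprehension is summarized as a score between 0 and 1 that shifts a preference for reusing a gesture type. The gesture categories, names and update rule are hypothetical, invented for illustration; they are not the actual mechanism of the MAX system.

    # Toy sketch of a closed loop adjusting gesture use from comprehension scores.
    # Names and the update rule are illustrative, not the MAX system's mechanism.
    import random

    preferences = {"iconic": 0.5, "deictic": 0.5}  # initial gesture preferences


    def choose_gesture() -> str:
        """Sample a gesture type in proportion to current preferences."""
        total = sum(preferences.values())
        r, acc = random.uniform(0, total), 0.0
        for gesture, weight in preferences.items():
            acc += weight
            if r <= acc:
                return gesture
        return gesture


    def update(gesture: str, comprehension: float, rate: float = 0.1) -> None:
        """Shift preference toward gestures that yielded good comprehension."""
        preferences[gesture] += rate * (comprehension - 0.5)
        preferences[gesture] = max(0.05, preferences[gesture])  # keep exploring


    for trial in range(20):
        g = choose_gesture()              # MAX accompanies an instruction with g
        score = random.uniform(0.3, 1.0)  # stand-in for the measured comprehension
        update(g, score)                  # close the loop

    print(preferences)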

The goal of this project is to create an incrementally growing database of manual interactions to help put manual intelligence research on a firmer empirical basis. Populating such a database necessitates the study of manual interactions in humans. The database should be populated with multimodal information: geometry information, tactile sensor information, vision information and sound information. Using these multimodal information sources will allow models to be built that represent manual interaction. The database can then be used to help robots carry out complex tasks of the type that humans perform with ease. Deciding on the structure of the database involves answering several important scientific questions: How should manual interactions be represented for storage, comparison and retrieval? What are suitable similarity measures for manual interactions? What are the elementary building blocks of a manual interaction? How do manual interactions motivated on the perceptual, control and task levels differ? Solving these questions will involve skills in both psychology and computer science.
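
To make the storage question concrete, here is a minimal sketch of what a single multimodal record in such a database might look like. The schema, field names and helper method are assumptions for illustration only, not the project's actual database design.

    # Illustrative sketch of one multimodal manual-interaction record, assuming
    # one record per recorded episode. All field names are hypothetical.
    from __future__ import annotations

    from dataclasses import dataclass, field

    import numpy as np


    @dataclass
    class ManualInteractionRecord:
        episode_id: str
        task_label: str                    # e.g. "unscrew bottle cap"
        hand_geometry: np.ndarray          # joint angles over time, shape (T, n_joints)
        tactile: np.ndarray                # tactile sensor frames, shape (T, n_taxels)
        video_frames: list[str] = field(default_factory=list)  # paths to image files
        audio_path: str | None = None      # recorded interaction sound

        def duration_s(self, sample_rate_hz: float) -> float:
            """Episode length in seconds, given the capture sample rate."""
            return self.hand_geometry.shape[0] / sample_rate_hz


    record = ManualInteractionRecord(
        episode_id="ep-001",
        task_label="stack cups",
        hand_geometry=np.zeros((500, 20)),
        tactile=np.zeros((500, 64)),
    )
    print(record.duration_s(sample_rate_hz=100.0))  # -> 5.0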


This project develops and establishes AcouMotion as a method for research in cognitive science and as a novel tool for interaction technology. AcouMotion is a hardware/software system that combines human body motion, tangible interfaces and sonification into a closed-loop human-computer interface, allowing non-visual motor control by using sonification (non-speech auditory displays) as the major feedback channel. AcouMotion's main components are sensors to measure motion parameters, a computer simulation to represent the dynamical evolution of a model world, and sonification to render a real-time auditory representation of objects and any interactions in the model world. The applications of AcouMotion range from new vision-free sport games through physiotherapy to cognitive research.
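
As a rough illustration of the closed loop described above (sensor input, model-world simulation, auditory rendering), the following sketch assumes a one-dimensional model world whose state is mapped to pitch. All function names and mappings are invented for illustration and do not correspond to the actual AcouMotion implementation.

    # Minimal sketch of a sensor -> model world -> sonification loop, assuming
    # a 1-D model world (a ball whose velocity the user's motion drives).
    # All names are illustrative; the real AcouMotion system is more elaborate.
    import math


    def read_motion_sensor(t: float) -> float:
        """Stand-in for a real sensor: returns a motion parameter at time t."""
        return math.sin(t)  # placeholder signal


    def sonify(position: float) -> float:
        """Map a model-world state to a sound parameter (here: pitch in Hz)."""
        return 220.0 * 2.0 ** position  # one octave per unit of position


    def run_loop(steps: int = 100, dt: float = 0.01) -> None:
        position = 0.0
        for i in range(steps):
            velocity = read_motion_sensor(i * dt)  # measure body motion
            position += velocity * dt              # advance the model world
            pitch = sonify(position)               # auditory feedback channel
            print(f"t={i * dt:.2f}s  position={position:+.3f}  pitch={pitch:.1f} Hz")


    if __name__ == "__main__":
        run_loop()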

The development of appropriate representations for manual action episodes and their coordination into more complex action sequences still poses a major challenge for robotics. It is a natural approach to gain insight into these topics from a cognitive and experimental perspective, revealing the development of sensorimotor representations in humans accomplishing such complex tasks. In particular, it is interesting to learn about the relationship between the structure of representations and performance in the execution of sensorimotor tasks, such as sequential object manipulations. Hence, one of the most interesting questions is what kind of representation helps people to control actions. This project focuses on the question of how structures of sensorimotor representations are established and gradually changed, while considering the physical properties of objects, task constraints (affordances) and perceptual discordances in sensorimotor adaptation tasks.

A remarkable ability of humans is to organize and discretize perceived sensory input and knowledge about the world into categories and concepts. This way of representing knowledge is an essential part of efficient communication as well as of choosing appropriate actions in novel situations. The goal of this project is to investigate how the formation of concepts and categories could be artificially recreated and possibly implemented on real robots. Robots endowed with such a mechanism should be able to incrementally increase their knowledge about the world, thus enlarging their opportunities to act in it.

What insights can we gain from psychological measurements of biomechanical parameters and subjective judgments of manual actions (like object grasping) about the structures of the underlying cognitive representations? In this project, we will bring together statistical methods (like structure dimensional analysis and principal component analysis) with connectionist approaches employing artificial neural networks to test different hypotheses about the cognitive structure of manual actions. A major goal will be to emulate and control grasping behavior for a broad range of objects in kinematic simulations and - as a longer-term objective - in real physics on a robot platform.
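
To give a flavor of the statistical side, the sketch below clusters pairwise dissimilarity judgments between grasp types into a hierarchy, loosely in the spirit of structure dimensional analysis. The grasp labels and the distance matrix are made-up example data, not project results.

    # Illustrative sketch: hierarchical clustering of pairwise distance judgments
    # between grasp types. The distance matrix is invented example data.
    import numpy as np
    from scipy.cluster.hierarchy import dendrogram, linkage
    from scipy.spatial.distance import squareform

    grasps = ["power grasp", "precision grasp", "lateral pinch", "hook grasp"]

    # Symmetric matrix of judged dissimilarities (0 = identical, 1 = unrelated).
    judged = np.array([
        [0.0, 0.7, 0.8, 0.3],
        [0.7, 0.0, 0.2, 0.9],
        [0.8, 0.2, 0.0, 0.9],
        [0.3, 0.9, 0.9, 0.0],
    ])

    # Average-linkage clustering over the condensed distance vector.
    tree = linkage(squareform(judged), method="average")
    dendrogram(tree, labels=grasps, no_plot=True)  # structure only; plot if desired
    print(tree)  # each row: merged clusters, merge distance, new cluster size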

Within this project we investigate the development of motor skills by explorative learning based on the control basis framework of Grupen et al., which allows a flexible combination of simple control units to achieve complex control tasks. One major aspect of the project will be the basic implementation of the control basis framework within the 2-Arm Manipulation-Setup and for the Barthoc torsos. This includes choosing the right set of elementary controller units and investigating the resulting behavior when they are combined in various ways. Another aspect will be to develop the coordination mechanisms required to execute and monitor several primitives in sequence or in parallel. Here the question arises of how to prioritize several conflicting behaviors that operate in parallel. Which mechanisms are needed to dynamically weight primitives depending on situational context, perceptual confidence of different sensor modalities, and user-specified task priority? If elementary controllers are combined in a hierarchical manner, how can their resource requirements be automatically propagated through this hierarchy to provide a foundation for scheduling? Finally, the most challenging aspect is the development of learning methods that allow the robot to acquire basic motor behavior from scratch by exploration, possibly primed by visually or otherwise observed trajectories. Especially interesting in this context are manipulative movements, e.g. pushing an object with a single finger or re-grasping an object in order to improve the grasp quality or to realize a desired object orientation w.r.t. the hand.
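
As a simplified illustration of the prioritization question raised above, the sketch below blends two elementary control primitives by situation-dependent weights. Note that the actual control basis framework composes controllers via null-space ("subject-to") projection rather than a plain weighted sum; this toy version, with invented primitives and weights, only illustrates why arbitration between conflicting primitives matters.

    # Toy sketch of combining elementary control primitives by weighted blending,
    # assuming each primitive outputs a joint-velocity command for the same arm.
    # The primitives and weights are illustrative, not the project's controllers.
    import numpy as np


    def reach_primitive(q: np.ndarray, q_goal: np.ndarray) -> np.ndarray:
        """Drive the joints toward a goal configuration."""
        return q_goal - q


    def limit_avoid_primitive(q: np.ndarray, q_mid: np.ndarray) -> np.ndarray:
        """Push the joints away from their limits, toward mid-range."""
        return 0.5 * (q_mid - q)


    def combine(commands: list, weights: list) -> np.ndarray:
        """Blend conflicting primitives by (context-dependent) priority weights."""
        blended = sum(w * c for w, c in zip(weights, commands))
        total = sum(weights)
        return blended / total if total > 0 else np.zeros_like(commands[0])


    q = np.zeros(3)                      # current joint angles
    q_goal = np.array([0.8, -0.4, 0.6])  # task target
    q_mid = np.zeros(3)                  # joint mid-range

    for step in range(5):
        cmds = [reach_primitive(q, q_goal), limit_avoid_primitive(q, q_mid)]
        q = q + 0.2 * combine(cmds, weights=[1.0, 0.3])  # task primitive dominates
        print(step, np.round(q, 3))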

Representations of spatial relations play an important role for joint attention, joint action, cooperation and competition between humans as well as between humans and robotic systems. Spatial representations in human-human or human-robot interaction are in many cases coordinated via language. In this project we want to apply psycholinguistic and action-related behavioural paradigms to corroborate the position that the type of viewing point used might be highly relevant for the characterization of a reference frame, because different viewing points may be based on different (partly modality-dependent) types of perceptual information.

One central issue for the cognitive control of movement is the compensation of errors and the learning processes that enhance error compensation mechanisms. This is especially true for very precise movements such as many manual actions. The present project combines conventional methods and experimental settings (first-order reality) with approaches from Virtual Reality and Augmented Reality to embed subjects in interaction loops in which the occurrence and perception of errors can be manipulated and studied in novel ways. In this way we hope to gain new clues about error correction mechanisms, error compensation learning and their replication in technical systems such as robots.

Speed Stacking is a bimanual scenario that consists of up-stacking and down-stacking pyramids of plastic cups in pre-determined sequences as quickly as possible. Since learners usually improve significantly within short periods of time, speed stacking is particularly well suited to investigating motor skill learning and automatization. This rapid improvement, together with constant feedback about errors and stacking duration, renders the task very motivating. We will analyze how attentional guidance and motion paths change during the learning process and how two individuals align their action timing. Research will be based on the process-based measures motion path and gaze pattern, accompanied by product-based measures like task duration and error rate.

The aim of this project is to investigate complex movements of the human body in sports, dance or everyday life activities, and to understand how these movements are controlled, learned and reproduced under changing conditions. By means of biomechanical and psychological methods, we will investigate physical, physiological and cognitive aspects of movement performance in order to understand how the different systems work together to control full-body movements in an adaptive way. Based on our experimental results, we want to develop a simulation framework that integrates the different control systems and their interaction, including a high-level control network that generates and adapts sequences of motor primitives, representing control on the level of mental representation.

