The Neuroinformatics Group, led by Prof. Dr. Helge Ritter, strives for a deeper understanding of the interplay of adaptive control, embodiment, knowledge, and learning that is required to enable cognitive interaction in robots and intelligent interfaces.
In contrast to today's engineered systems, which rely on an almost complete specification of their numerous details, neural systems are shaped to a large extent by self-organization, adaptation, and learning, and can thus adapt flexibly and robustly to new situations.
To gain insight into the working principles of these systems and to replicate similar functions in technology, the Neuroinformatics Group brings together methods from neural networks, machine learning, computer vision, dynamical systems and control, embracing topics such as data mining, brain-machine interfaces, evolutionary computation, and complex systems integration, and drawing cross-disciplinary inspiration from brain science, psychology, and linguistics.

Because of the intimate connection between neural networks and the control of behavior, robotics serves as the major test bed for our algorithms and computational models. The focus here lies particularly on manual intelligence and its replication in articulated robot hands. Using eye-, motion-, and force-tracking systems, and in close cooperation with the Biomechanics Group (Thomas Schack), we study human manipulation actions and turn our insights into appropriate robot control strategies that rely on tight tactile- and vision-driven feedback loops to achieve robust behavior.

In cooperation with the groups of Applied Informatics (Britta Wrede), Semantic Computing (Philipp Cimiano), Applied Computational Linguistics (David Schlangen), and Social Cognitive Systems (Stefan Kopp), we are working towards robot systems that can be taught interactively and that acquire new knowledge about the world in an incremental fashion.