  • Neurocognition and Action - Biomechanics


Mobile Action Assistance Lab



Alexander Neumann

Dr. rer. nat. Benjamin Strenge
Telephone: +49 521 106-4628
Secretary's office: +49 521 106-6991
A person wearing AR glasses in front of a coffee machine
© NCA-Group - Universität Bielefeld

In the Mobile Action Assistance Lab we combine various mobile techniques, such as eye tracking, measurement of mental representation structures, wearable sensor technology, Augmented Reality (AR), and electroencephalography (EEG), with modern diagnostic and corrective intervention techniques. The equipment ranges from high-cost devices, such as binocular eye trackers, to low-cost solutions, such as the Kinect. We capture multi-modal data from people acting in everyday situations, such as assembling a device, performing sports exercises, or learning new skills. Using adaptive algorithms, including machine learning and deep learning, we design mobile cognitive devices that can identify problems in ongoing action processes, react when mistakes are made, and provide situation- and context-dependent assistance in auditory, textual, visual, or avatar-based form, matched to people's mental and physical capabilities. The overall aim is to develop mobile cognitive assistive systems that adapt to the particular user and action context and provide individual action assistance in an unobtrusive way.
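To illustrate the general idea, the following minimal Python sketch shows how a problem in an ongoing action step might be flagged from simple sensor features (step duration, gaze behavior) and how an assistance modality could be selected according to a user's capabilities. All names, thresholds, and profile fields here are hypothetical placeholders for illustration, not the lab's actual models or APIs.

```python
from dataclasses import dataclass


@dataclass
class UserProfile:
    """Hypothetical capability profile; fields are illustrative only."""
    hearing_impaired: bool = False
    prefers_text: bool = False


def detect_problem(step_duration_s: float, expected_s: float,
                   gaze_off_target_ratio: float) -> bool:
    """Flag a likely problem when the step takes much longer than expected
    or the user's gaze rarely rests on the relevant part (made-up thresholds)."""
    return step_duration_s > 2.0 * expected_s or gaze_off_target_ratio > 0.6


def choose_modality(profile: UserProfile) -> str:
    """Pick an assistance format matching the user's capabilities."""
    if profile.hearing_impaired:
        return "text" if profile.prefers_text else "visual"
    return "auditory"


# Example: a slow assembly step with wandering gaze triggers assistance.
if detect_problem(step_duration_s=45.0, expected_s=15.0, gaze_off_target_ratio=0.7):
    print(choose_modality(UserProfile(hearing_impaired=True)))  # prints "visual"
```

In a real system, the hand-written rule in `detect_problem` would be replaced by a trained classifier over the multi-modal sensor streams, but the overall pipeline shape, detect a problem, then select a user-appropriate modality, stays the same.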

A person wearing AR glasses assembles a birdhouse with a screwdriver
© NCA-Group - Universität Bielefeld

Action support for an assembly task. Left: the user wears AR eye-tracking glasses while assembling LEGO parts. Right: situation- and context-dependent assistance is displayed on a transparent virtual plane in the user's field of view.
