ZiF Research Group

Cognitive Behavior of Humans, Animals, and Machines:

Situation Model Perspectives

October 2019 – July 2020

Convenors: Werner Schneider (Bielefeld, GER), Helge Ritter (Bielefeld, GER)

Robert Haschke

Associate Fellow

Neuroinformatics, Faculty of Technology, &
Center for Cognitive Interaction Technology (CITEC),
Bielefeld University, Germany
E-Mail: rhaschke@techfak.uni-bielefeld.de


CV

Robert Haschke received his diploma and Ph.D. in Computer Science from Bielefeld University, Germany, in 1999 and 2004, respectively, working on the theoretical analysis of oscillating recurrent neural networks. Since then, his work has shifted towards robotics, still employing neural and other learning methods whenever possible. Robert currently heads the Robotics Group within the Neuroinformatics Group, striving to enrich the dexterous manipulation skills of the group's two bimanual robot setups through interactive learning. He was involved in the EU projects WearHap and SaraFun as a work-package leader. He also served as scientific coordinator of the Research Institute for Cognition and Robotics (CoR-Lab) and is strongly engaged in open-source software, maintaining several important software projects in the robotics domain.

Current Main Research Interests

Robert's main research interest is cognitive bimanual robotics: developing autonomous grasping and manipulation algorithms for bimanual setups with anthropomorphic robot hands, involving tactile and visual feedback loops. This comprises tactile sensor development, sensor processing and fusion, robot control, planning, learning, and software integration. One main focus is the integration of tactile and visual feedback for control, developing new tactile sensor hardware and corresponding processing methods that allow, for example, for surface exploration, slip detection, or grasp force adaptation. With the rise of deep learning methods, end-to-end learning of complex manipulation skills has become a further focus, studying various deep reinforcement learning approaches to enable a robot to autonomously acquire grasping and manipulation skills. A key challenge here, particularly in robotics, is to increase data efficiency and thus enable learning from only a few examples. Applications can be found in service robotics, industrial assembly, and prosthetics.

Five selected publications with particular relevance to the Research Group
  • Görner, M., Haschke, R., Ritter, H., & Zhang, J. (2019). MoveIt! Task Constructor for task-level motion planning. IEEE International Conference on Robotics and Automation (ICRA), Montreal.
  • Walck, G., Haschke, R., Meier, M., & Ritter, H. J. (2017). Robot self-protection by virtual actuator fatigue: Application to tendon-driven dexterous hands during grasping. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2200-2205.
  • Bianchi, M., Haschke, R., Büscher, G., Ciotti, S., Carbonaro, N., & Tognetti, A. (2016). A multi-modal sensing glove for human manual-interaction studies. Electronics, 5(3), 42.
  • Meier, M., Walck, G., Haschke, R., & Ritter, H. J. (2016). Distinguishing sliding from slipping during object pushing. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 5579-5584. doi:10.1109/IROS.2016.7759820
  • Haschke, R. (2015). Grasping and manipulation of unknown objects based on visual and tactile feedback. In G. Carbone & F. Gomez-Bravo (Eds.), Mechanisms and Machine Science: Vol. 29. Motion and Operation Planning of Robotic Systems, Switzerland: Springer. doi:10.1007/978-3-319-14705-5_4