How can people understand the decisions made by machines? What do algorithmic approaches tell us? How can artificial intelligence (AI) be made comprehensible?
Technical explanations often presuppose knowledge of how AI works and are therefore difficult to follow. In the Collaborative Research Center/Transregio "Constructing Explainability", researchers are working on ways to involve users in the explanation process.
To this end, the interdisciplinary research team is examining the principles, mechanisms, and social practices of explaining, and how these can be taken into account in the design of AI systems. The project's goal is to make explanation processes understandable and to create comprehensible assistance systems.
A total of 22 project leaders, together with around 40 research assistants from linguistics, psychology, media studies, sociology, economics, and computer science at the Universities of Bielefeld and Paderborn, are investigating the co-construction of explanations.