Almost everyone knows at least one of the following situations: you want to bake a cake but cannot remember the exact recipe, you do not know how to operate a high-tech kitchen stove to prepare a delicious meal, or you are simply unsure how to repair a bicycle. In such everyday situations it can be helpful to receive unobtrusive, intuitive support from an adaptive technical system that accompanies the user largely unnoticed and without restricting their actions.

This project focuses on the development and testing of intelligent glasses. It combines techniques from memory research, eye tracking, vital-parameter measurement (such as pulse or heart rate), object and action recognition (computer vision), and Augmented Reality (AR) with modern diagnostic and corrective intervention techniques. The system will identify problems in ongoing action processes, react when mistakes are made, and display situation- and context-dependent assistance in textual, visual, or avatar-based form, superimposed on a transparent virtual plane in the user's field of view.

ADAMAAS aims to help people live a self-sufficient life in an age-appropriate way, in accordance with their mental and physical capabilities. To this end, the system will also be able to suggest new action options and induce selective learning processes. ADAMAAS is therefore not a stationary but a mobile assistance system.