Dynamically Generated Intelligent Interfaces

R.R. Penner and E.S. Steinmetz (USA)


Keywords: User interface automation, dynamic design


The user interface (UI) provides the mechanism for humans to interact with system components and with other humans in the system. The traditional static UI development process, however, can be time-consuming and may introduce errors. We believe that an automated, intelligent process can effectively overcome the problems introduced by static user interface design. We have been developing such a process for dynamic human-in-the-loop systems and are currently evaluating its applicability to the Joint Unmanned Combat Air System (J-UCAS) program. The Automated Interaction Design Engine (AIDE) is implemented as a model-based automated reasoner with three interdependent components. The first component generates semantically rich models of the human, the system, and the external environment in which they operate (a situation model). The second component reacts productively to human interaction needs (as represented in the situation model) by self-composing the interactions required to support those needs and by maintaining them dynamically as the situation evolves. The third component converts these interactions into real-time user interfaces on the devices available to the user. Our research into the applicability of AIDE to the J-UCAS program has produced significant advances in both the structure and content of the AIDE models. AIDE is also being used to investigate human requirements within J-UCAS, including the roles of humans in such complex, dynamic systems; the responsibilities and information requirements that support these roles; the domain entities and relationships that must be modeled to provide a unifying situation model; and the user interface structures and automated functions required to support the user. In addition, AIDE is proving useful as a quick-turnaround demonstration vehicle for other intelligent components.
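The three-component pipeline described above can be sketched in code. This is a minimal illustrative sketch, not the actual AIDE implementation: all class names, function names, and the example situation attributes (fuel warnings, threat detection) are hypothetical stand-ins for the paper's situation model, interaction composer, and UI generator.

```python
from dataclasses import dataclass

# Hypothetical sketch of AIDE's three-component architecture.
# All names and situation attributes here are illustrative assumptions.

@dataclass
class SituationModel:
    """Component 1: a semantically rich model of the human, the
    system, and the external environment."""
    operator_role: str
    system_state: dict
    environment: dict

@dataclass
class Interaction:
    """An interaction need derived from the situation model."""
    need: str
    priority: int  # lower value = more urgent

def compose_interactions(situation: SituationModel) -> list[Interaction]:
    """Component 2: self-compose the interactions required to support
    the human's current needs, re-derived as the situation evolves."""
    interactions = []
    if situation.system_state.get("fuel_low"):
        interactions.append(Interaction("acknowledge fuel warning", priority=1))
    if situation.environment.get("threat_detected"):
        interactions.append(Interaction("authorize evasive maneuver", priority=0))
    return sorted(interactions, key=lambda i: i.priority)

def render_ui(interactions: list[Interaction], device: str) -> list[str]:
    """Component 3: convert interaction needs into concrete UI
    elements on whatever device is available to the user."""
    widget = "button" if device == "touchscreen" else "menu item"
    return [f"{widget}: {i.need}" for i in interactions]

# Example: one pass through the pipeline for a single situation update.
situation = SituationModel(
    operator_role="UAV supervisor",
    system_state={"fuel_low": True},
    environment={"threat_detected": True},
)
ui = render_ui(compose_interactions(situation), device="touchscreen")
```

In a dynamic human-in-the-loop system, the compose/render pass would run repeatedly as the situation model is updated, so the interface is regenerated rather than designed statically in advance.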
