Abstract:
We have developed a model-based, distributed architecture that integrates diverse components in a system designed for lunar and planetary surface operations: an astronaut's space suit, cameras, rover/All-Terrain Vehicle (ATV), robotic assistant, other personnel in a local habitat, and a remote mission support team (with time delay). Software processes, called "agents," implemented in the Brahms language (Clancey et al. 1998; Sierhuis 2001), run on multiple, mobile platforms. These "mobile agents" interpret and transform available data to help people and robotic systems coordinate their actions, making operations safer and more efficient. The Brahms-based mobile agent architecture (MAA) uses a novel combination of agent types so that the software agents may understand and facilitate communications between people and between system components. A state-of-the-art spoken dialogue interface is integrated with Brahms models, supporting a speech-driven field observation record and rover command system (e.g., "return here later and bring this back to the habitat"). This combination of agents, rover, and model-based spoken dialogue interface constitutes a "personal assistant." An important aspect of the methodology involves first simulating the entire system in Brahms, then configuring the agents into a run-time system.