Proceedings:
Proceedings of the AAAI Conference on Artificial Intelligence
Volume:
13
Issue:
No. 1: Agents, AI in Art and Entertainment, Knowledge Representation, and Learning
Track:
Knowledge Bases & Context
Abstract:
Truth maintenance systems (TMSs) provide caches of beliefs and inferences that support explanations and search. Traditionally, the cost of using a TMS is monotonic growth in the size of this cache. In some applications this cost is too high; for example, intelligent learning environments may require students to explore many alternatives, which leads to unacceptable performance. This paper describes an algorithm for fact garbage collection that retains the explanation-generating capabilities of a TMS while eliminating the increased storage overhead. We describe the application context that motivated this work and the properties of applications that benefit from this technique. We present the algorithm, showing how to balance the tradeoff between maintaining a useful cache and reclaiming storage, and analyze its complexity. We demonstrate that this algorithm can eliminate monotonic storage growth, thus making it more practical to field large-scale TMS-based systems.
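To make the idea of fact garbage collection concrete, the sketch below is an illustration only, not the paper's algorithm: it assumes a toy justification-based TMS with hypothetical Fact, Justification, and TinyTMS classes, and uses a simple mark-and-sweep pass over the dependency graph that reclaims every fact not reachable from facts the application has "pinned" (premises and conclusions it still needs), so the cache stops growing monotonically while pinned facts keep their explanations.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Justification:
    """Records that `consequent` was derived from `antecedents` by rule `informant`."""
    informant: str
    antecedents: List["Fact"]
    consequent: "Fact"


@dataclass
class Fact:
    """A TMS node: a datum plus the justifications recorded for it."""
    datum: str
    justifications: List[Justification] = field(default_factory=list)
    pinned: bool = False  # facts the application still needs (hypothetical flag)


class TinyTMS:
    """Toy justification cache with a mark-and-sweep collector over the dependency graph."""

    def __init__(self) -> None:
        self.facts: Dict[str, Fact] = {}

    def assert_fact(self, datum: str, pinned: bool = False) -> Fact:
        fact = self.facts.setdefault(datum, Fact(datum))
        fact.pinned = fact.pinned or pinned
        return fact

    def justify(self, informant: str, antecedents: List[str], consequent: str) -> None:
        ants = [self.assert_fact(d) for d in antecedents]
        cons = self.assert_fact(consequent)
        cons.justifications.append(Justification(informant, ants, cons))

    def explain(self, datum: str) -> List[str]:
        """Follow one justification chain back to premises to build an explanation."""
        fact = self.facts.get(datum)
        if fact is None or not fact.justifications:
            return [datum]
        just = fact.justifications[0]
        steps: List[str] = []
        for ant in just.antecedents:
            steps.extend(self.explain(ant.datum))
        steps.append(f"{datum} via {just.informant}")
        return steps

    def collect_garbage(self) -> int:
        """Keep facts reachable (through antecedents) from pinned facts; drop the rest.

        Discarding unreachable cache entries is what stops monotonic growth;
        how much is pinned controls how much of the cache stays useful.
        """
        live: set = set()
        stack = [f for f in self.facts.values() if f.pinned]
        while stack:
            fact = stack.pop()
            if id(fact) in live:
                continue
            live.add(id(fact))
            for just in fact.justifications:
                stack.extend(just.antecedents)
        before = len(self.facts)
        self.facts = {d: f for d, f in self.facts.items() if id(f) in live}
        return before - len(self.facts)  # number of fact records reclaimed


if __name__ == "__main__":
    tms = TinyTMS()
    tms.justify("rule-1", ["battery-ok", "bulb-ok"], "light-on")
    tms.assert_fact("light-on", pinned=True)           # still needed for explanations
    tms.justify("rule-2", ["scratch-a"], "scratch-b")  # exploratory work, not pinned
    print(tms.explain("light-on"))   # ['battery-ok', 'bulb-ok', 'light-on via rule-1']
    print(tms.collect_garbage())     # 2 -- the scratch facts are reclaimed
```

In this sketch the pinning policy is where the cache-versus-storage tradeoff the abstract mentions shows up: pinning more facts preserves more cached inferences and explanations, pinning fewer reclaims more storage.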
AAAI
ISBN 978-0-262-51091-2