On Repairing Reasoning Reversals Via Representational Refinements

Alan Bundy, Fiona McNeill, Chris Walton

Representation is a fluent. A mismatch between the real world and an agent's representation of it can be signalled by unexpected failures (or successes) of the agent's reasoning. The 'real world' may include the ontologies of other agents. Such mismatches can be repaired by refining or abstracting an agent's ontology. These refinements or abstractions are not limited to changes of belief: they may also change the signature of the agent's ontology. We describe the implementation and successful evaluation of these ideas in the ORS system. ORS diagnoses failures in plan execution and then repairs the faulty ontologies. Our automated approach to dynamic ontology repair has been designed specifically to address real issues in multi-agent systems, for instance, as envisaged in the Semantic Web.
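The kind of mismatch described above can be made concrete with a toy sketch. This is not the ORS implementation; the action name, type names, and dictionary representation are all invented for illustration. It shows one signature-level repair: the agent's version of an action lacks an argument that another agent's ontology expects, the discrepancy is diagnosed when the plan step would fail, and the agent's signature is refined accordingly.

```python
# Toy sketch (hypothetical names, not ORS): signatures map an action name
# to the list of argument types it takes.
agent_ontology = {"book_flight": ["Person", "City"]}
# The service agent's ontology expects an extra Date argument.
service_ontology = {"book_flight": ["Person", "City", "Date"]}

def diagnose(agent_sig, service_sig):
    """Return argument types the service expects but the agent's signature lacks."""
    return [t for t in service_sig if t not in agent_sig]

def repair(ontology, action, missing):
    """Refine the agent's signature in place so the repaired plan can succeed."""
    ontology[action] = ontology[action] + missing

action = "book_flight"
missing = diagnose(agent_ontology[action], service_ontology[action])
if missing:  # the plan step would fail: signature mismatch detected
    repair(agent_ontology, action, missing)

print(agent_ontology[action])  # → ['Person', 'City', 'Date']
```

A repair of this sort changes the ontology's signature, not merely the agent's beliefs, which is the distinction the abstract draws between belief revision and representational refinement.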

Subjects: 11.2 Ontologies; 1.5 Diagnosis

Submitted: Feb 10, 2006
