Computational Learning Theory: A Bayesian Perspective

Tom Chavez

This paper introduces a reformulation of the model that resolves the preceding difficulties. We have been investigating a model of learning similar to COLT in its demands for efficiency and distribution-independence, yet one that also possesses the attractive features of Bayesianism, such as prior beliefs, continuous incremental updating, and subjective probability. Our approach covers the classes of learnable concepts covered by COLT, as well as other categories of facts and concepts that COLT does not currently support. The new methodology possesses other distinct advantages: mistaken prior beliefs wash out quickly; the exactitude of inference over individual attributes can be specified in advance; and the efficiency of learning does not depend on the size of the concept space.
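To make the "mistaken priors wash out quickly" property concrete, the following is a minimal sketch, not the paper's own algorithm: a standard conjugate Beta-Bernoulli update, in which a prior that badly misestimates a coin's bias is overwhelmed by a modest amount of data. All names and parameter values here are illustrative assumptions.

```python
# Minimal sketch (not the paper's method): conjugate Beta-Bernoulli
# updating, illustrating how a mistaken prior washes out as data arrive.

def beta_update(alpha, beta, observations):
    """Incrementally update a Beta(alpha, beta) prior on a coin's bias."""
    for x in observations:  # each x is 1 (success) or 0 (failure)
        alpha += x
        beta += 1 - x
    return alpha, beta

# Mistaken prior: mean 0.1 with moderate confidence, Beta(2, 18).
alpha, beta = 2.0, 18.0
# Suppose the true bias is 0.9; observe 100 draws (90 ones, 10 zeros).
data = [1] * 90 + [0] * 10
alpha, beta = beta_update(alpha, beta, data)
posterior_mean = alpha / (alpha + beta)  # 92/120, already pulled near 0.9
```

After only 100 observations the posterior mean has moved from 0.1 to roughly 0.77, and it converges to the true bias as more data arrive; the incremental loop also shows the continuous, example-by-example updating the abstract attributes to the Bayesian view.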
