Decomposing Local Probability Distributions in Bayesian Networks for Improved Inference and Parameter Learning

Adam Zagorecki, Mark Voortman, Marek J. Druzdzel

A major difficulty in building Bayesian network models is the size of conditional probability tables (CPTs), which grows exponentially in the number of parents. One way of dealing with this problem is through parametric conditional probability distributions, which usually require a number of parameters that is only linear in the number of parents. In this paper we introduce a new class of parametric models, the pICI models, that aim at lowering the number of parameters required to specify local probability distributions while remaining capable of modeling a variety of interactions. A subset of the pICI models is decomposable, which leads to significantly faster inference compared with models that cannot be decomposed. We also show that the pICI models are especially useful for parameter learning from small data sets, leading to higher accuracy than learning CPTs directly.
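To make the exponential-versus-linear contrast concrete, the following minimal Python sketch compares a full CPT with the noisy-OR gate, the best-known independence-of-causal-influences (ICI) model, which the pICI family described in the paper generalizes. The function names and the 0.2 inhibition probabilities are illustrative assumptions, not the paper's notation or method.

    def cpt_size(n_parents, arity=2):
        # Free parameters in a full CPT for a binary child:
        # one P(child = 1 | parent configuration) per configuration.
        return arity ** n_parents

    def noisy_or(parent_states, inhibitions, leak=0.0):
        # P(child = 1 | parents) under a noisy-OR gate: each active
        # parent i independently causes the child to be 1 with
        # probability 1 - inhibitions[i]; the child is 0 only if
        # every active cause (and the leak) fails.
        p_all_fail = 1.0 - leak
        for state, q in zip(parent_states, inhibitions):
            if state:
                p_all_fail *= q
        return 1.0 - p_all_fail

    # Example: a binary child with 10 binary parents.
    n = 10
    inhibitions = [0.2] * n                       # n parameters, linear in n
    print("Full CPT parameters :", cpt_size(n))   # 2**10 = 1024
    print("Noisy-OR parameters :", n)             # 10 (+1 with a leak term)
    print("P(child=1 | all on) :", noisy_or([1] * n, inhibitions))

With 10 parents, the full CPT needs 1024 probabilities while the noisy-OR needs only 10, which illustrates why such parametric local distributions are attractive both for elicitation and for learning from small data sets.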

Subjects: 3.4 Probabilistic Reasoning

Submitted: Feb 13, 2006
