KSL-93-01

Additive Belief-Network Models

Reference: Dagum, P. & Galper, A. Additive Belief-Network Models. Washington, D.C., 1993.

Abstract: The intractability of available probabilistic inference algorithms hinders belief network applications to large domains. Researchers have shown that both exact and approximate probabilistic inference is NP-hard, and therefore we cannot hope to find tractable solutions to inference in large applications. The intractability of inference, long known implicitly to designers of large applications and later established by formal complexity proofs, motivated alternative research directions in search of tractable solutions to this impasse. From this work arose, for example, the noisy OR-gates used in QMR-DT and probabilistic similarity networks. Motivated by recent developments in belief network models for time-series analysis and forecasting, we define "additive belief network models" (ABNM). We (1) discuss the nature and implications of the approximations made by an additive decomposition of a belief network, (2) prove greater efficiency in the induction of additive models when available data is scarce, (3) generalize the Lauritzen-Spiegelhalter inference algorithm to exploit the additive decomposition of ABNMs, (4) prove greater efficiency of inference, and (5) present implementation results on induction and inference of belief networks.
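As an illustration of the additive decomposition discussed in item (1), one plausible form, offered here only as a sketch and not as the report's exact definition, approximates a node's full conditional distribution by a convex combination of its single-parent conditionals:

    P(x \mid u_1, \dots, u_n) \;\approx\; \sum_{i=1}^{n} \alpha_i \, P(x \mid u_i),
    \qquad \alpha_i \ge 0, \quad \sum_{i=1}^{n} \alpha_i = 1.

Under this assumed form, induction and inference can operate on the n small tables P(x \mid u_i) and the weights \alpha_i, rather than on a conditional table that grows exponentially in the number of parents.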

