Posted on 2006-01-01. Authored by Ole J. Mengshoel, David C. Wilkins, Dan Roth.
This article presents and analyzes algorithms that systematically generate random Bayesian networks of varying difficulty levels, with respect to inference using tree clustering. The results are relevant to research on efficient Bayesian network inference, such as computing a most probable explanation or belief updating, since they allow controlled experimentation to determine the impact of improvements to inference algorithms. The results are also relevant to research on machine learning of Bayesian networks, since they support controlled generation of a large number of data sets at a given difficulty level. Our generation algorithms, called BPART and MPART, support controlled but random construction of bipartite and multipartite Bayesian networks. The Bayesian network parameters that we vary are the total number of nodes, the degree of connectivity, the ratio of the number of non-root nodes to the number of root nodes, the regularity of the underlying graph, and the characteristics of the conditional probability tables. The main dependent parameter is the size of the maximal clique generated by tree clustering. This article presents extensive empirical analysis using the HUGIN tree clustering approach as well as theoretical analysis related to the random generation of Bayesian networks using BPART and MPART.
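To make the idea of controlled random generation concrete, the following is a minimal illustrative sketch of generating a bipartite Bayesian network over binary variables with a fixed root-node count, non-root-to-root ratio, and per-node connectivity. It is not the authors' BPART algorithm; the function name, parameter names, and the choice to fill conditional probability tables uniformly at random are assumptions made for illustration only.

```python
import itertools
import random


def generate_bipartite_bn(n_roots, ratio, k_parents, seed=None):
    """Sketch of controlled random generation of a bipartite Bayesian network.

    n_roots   -- number of root (parentless) nodes
    ratio     -- ratio of non-root nodes to root nodes (assumed interpretation)
    k_parents -- number of root parents per non-root node (connectivity)
    Returns (parents, cpts): parents maps each node to its parent list;
    cpts maps each node to a table of P(node = 1 | parent assignment).
    """
    rng = random.Random(seed)
    roots = [f"V{i}" for i in range(n_roots)]
    nonroots = [f"C{j}" for j in range(int(round(ratio * n_roots)))]

    parents = {r: [] for r in roots}
    cpts = {}

    # Root nodes carry a single random prior probability P(root = 1).
    for r in roots:
        cpts[r] = {(): rng.random()}

    # Each non-root node draws k distinct root parents uniformly at random
    # and receives a conditional probability table with random entries.
    for c in nonroots:
        pa = rng.sample(roots, k_parents)
        parents[c] = pa
        cpts[c] = {
            assignment: rng.random()  # P(c = 1 | parents = assignment)
            for assignment in itertools.product([0, 1], repeat=k_parents)
        }

    return parents, cpts


if __name__ == "__main__":
    parents, cpts = generate_bipartite_bn(n_roots=10, ratio=2.0, k_parents=3, seed=42)
    print(len(parents), "nodes;", sum(len(p) for p in parents.values()), "edges")
```

In a study of the kind the abstract describes, one would sweep parameters such as n_roots, ratio, and k_parents, then measure the size of the maximal clique produced by a tree clustering (junction tree) compiler on each sampled network; the independent parameters governing regularity and CPT characteristics are not modeled in this sketch.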