Bayesian network

A Bayesian network (or Bayesian belief network) is a directed acyclic graph whose nodes represent variables and whose arcs represent dependence relations among the variables. If there is an arc from node A to node B, then A is called a parent of B. A node with a known value is said to be an evidence node. A node can represent any kind of variable, be it an observed measurement, a parameter, a latent variable, or a hypothesis. Nodes are not restricted to representing random variables; this is what is "Bayesian" about a Bayesian network.
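The graph itself can be stored very simply. A minimal sketch (the node names and arcs here are hypothetical examples, not drawn from any particular model):

```python
# A directed graph stored as a mapping from each node to the list of its
# parents; an arc Rain -> Sprinkler makes Rain a parent of Sprinkler.
parents = {
    "Rain": [],                        # no parents: a root node
    "Sprinkler": ["Rain"],
    "GrassWet": ["Rain", "Sprinkler"],
}

# Evidence nodes are simply nodes whose value is known.
evidence = {"GrassWet": True}

def is_acyclic(parents):
    """Verify the 'directed acyclic' requirement by depth-first search."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {node: WHITE for node in parents}

    def visit(node):
        color[node] = GRAY
        for p in parents[node]:
            # A GRAY parent means we closed a directed cycle.
            if color[p] == GRAY or (color[p] == WHITE and not visit(p)):
                return False
        color[node] = BLACK
        return True

    return all(visit(n) for n in parents if color[n] == WHITE)
```

The acyclicity check matters because the factorization described below is only valid for a graph with no directed cycles.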

A Bayesian network is a representation of the joint distribution over all the variables represented by nodes in the graph. Let the variables be X(1), ..., X(n), and let parents(A) denote the parents of node A. The joint distribution then factorizes as a product of local conditional distributions:

  p(X(1), ..., X(n)) = p(X(1) | parents(X(1))) × ... × p(X(n) | parents(X(n))).

If X(i) has no parents, its local probability distribution is said to be unconditional; otherwise it is conditional.
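The factorization can be sketched concretely for a two-node network Rain → WetGrass; the probabilities below are hypothetical:

```python
# Unconditional distribution for Rain (a node with no parents).
p_rain = {True: 0.2, False: 0.8}

# Conditional distribution for WetGrass given its parent Rain.
p_wet_given_rain = {
    True:  {True: 0.9, False: 0.1},   # P(WetGrass | Rain=True)
    False: {True: 0.1, False: 0.9},   # P(WetGrass | Rain=False)
}

def joint(rain, wet):
    """Joint probability as the product of each node's local conditional."""
    return p_rain[rain] * p_wet_given_rain[rain][wet]

# The factorized joint sums to 1 over all assignments, as any
# joint distribution must.
total = sum(joint(r, w) for r in (True, False) for w in (True, False))
```

Each factor involves only a node and its parents, which is what makes the representation compact compared with a full joint probability table.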

Questions about dependence among variables can be answered by studying the graph alone. It can be shown that the graphical notion called d-separation corresponds to the notion of conditional independence: if nodes X and Y are d-separated (given specified evidence nodes), then variables X and Y are independent given the evidence variables.
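As a numerical illustration (not a general d-separation algorithm), consider the chain A → B → C: node B d-separates A and C, so C should be independent of A once B is given. The probabilities below are hypothetical:

```python
# Local distributions for the chain A -> B -> C.
p_a = {0: 0.3, 1: 0.7}
p_b_given_a = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.4, 1: 0.6}}
p_c_given_b = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.25, 1: 0.75}}

def joint(a, b, c):
    return p_a[a] * p_b_given_a[a][b] * p_c_given_b[b][c]

def p_c_given_ab(c, a, b):
    """P(C=c | A=a, B=b) by direct normalization of the joint."""
    den = sum(joint(a, b, cc) for cc in (0, 1))
    return joint(a, b, c) / den

# Conditioning additionally on A does not change the distribution of C,
# exactly as d-separation predicts.
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert abs(p_c_given_ab(c, a, b) - p_c_given_b[b][c]) < 1e-12
```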

In order to carry out numerical calculations, it is necessary to further specify for each node X the probability distribution for X conditional on its parents. The distribution of X given its parents may have any form. However, it is common to work with discrete or Gaussian distributions, since that simplifies calculations.

The goal of inference is typically to find the conditional distribution of a subset of the variables, given known values for some other subset (the evidence), while marginalizing out any remaining variables. Thus a Bayesian network can be considered a mechanism for automatically constructing extensions of Bayes' theorem to more complex problems.
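A minimal sketch of inference by enumeration, assuming the classic three-node sprinkler network (Rain → Sprinkler, Rain → GrassWet, Sprinkler → GrassWet) with hypothetical probabilities:

```python
p_rain = {True: 0.2, False: 0.8}
p_sprinkler_given_rain = {True:  {True: 0.01, False: 0.99},
                          False: {True: 0.4,  False: 0.6}}
p_wet = {  # P(GrassWet | Sprinkler, Rain)
    (True, True):   {True: 0.99, False: 0.01},
    (True, False):  {True: 0.90, False: 0.10},
    (False, True):  {True: 0.80, False: 0.20},
    (False, False): {True: 0.00, False: 1.00},
}

def joint(rain, sprinkler, wet):
    """The factorized joint over all three variables."""
    return (p_rain[rain]
            * p_sprinkler_given_rain[rain][sprinkler]
            * p_wet[(sprinkler, rain)][wet])

def query_rain_given_wet(wet=True):
    """P(Rain | GrassWet=wet): sum the joint over the hidden variable
    Sprinkler, then normalize -- Bayes' theorem carried out mechanically."""
    unnorm = {rain: sum(joint(rain, s, wet) for s in (True, False))
              for rain in (True, False)}
    z = sum(unnorm.values())
    return {rain: v / z for rain, v in unnorm.items()}

posterior = query_rain_given_wet(True)
```

Enumeration is exponential in the number of hidden variables; practical systems use algorithms such as variable elimination or belief propagation, but the quantity computed is the same.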

Bayesian networks are used for modelling knowledge in gene regulatory networks, medicine, engineering, text analysis, image processing, and decision support systems.

Learning the structure of a Bayesian network is an important problem in machine learning. Assuming the data are generated by a Bayesian network and that all variables are observed in every sample, the following methods can be used to learn both the structure of the acyclic graph and the conditional probability tables associated with it. The elements of a structure-finding algorithm are a scoring function and a search strategy. An exhaustive search that returns the structure maximizing the score is one implementation, but its cost is superexponential in the number of variables. A local search algorithm makes incremental changes aimed at improving the score of the structure. A global search algorithm such as Markov chain Monte Carlo can avoid becoming trapped in local minima. Friedman et al. propose using the mutual information between variables and finding a structure that maximizes it; they restrict the candidate parent set of each node to k nodes and search within it exhaustively.
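One ingredient of such mutual-information-based structure learning can be sketched: estimating I(X;Y) between two discrete variables from fully observed data. The sample data below are hypothetical:

```python
from collections import Counter
from math import log2

def mutual_information(samples):
    """I(X;Y) in bits, estimated from a list of (x, y) pairs using
    empirical frequencies."""
    n = len(samples)
    joint = Counter(samples)
    px = Counter(x for x, _ in samples)
    py = Counter(y for _, y in samples)
    mi = 0.0
    for (x, y), c in joint.items():
        pxy = c / n
        mi += pxy * log2(pxy / ((px[x] / n) * (py[y] / n)))
    return mi

# Perfectly correlated binary variables carry one full bit of information...
dependent = [(0, 0), (1, 1)] * 50
# ...while independent variables carry none.
independent = [(0, 0), (0, 1), (1, 0), (1, 1)] * 25
```

Ranking candidate parents by estimated mutual information, then searching exhaustively within the top-k candidates, is the restriction strategy described above.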

References

  • Kevin Murphy. An introduction to graphical models. 2001. http://www.ai.mit.edu/~murphyk/Papers/intro_gm.pdf
  • Judea Pearl, Stuart Russell. Bayesian Networks. UCLA Cognitive Systems Laboratory, Technical Report (R-277), November 2000.
  • Judea Pearl, Stuart Russell. Bayesian Networks, in M. A. Arbib (Ed.), Handbook of Brain Theory and Neural Networks, pp. 157–160, Cambridge, MA: MIT Press, 2003, ISBN 0262011972.
  • Enrique Castillo, José Manuel Gutiérrez, and Ali S. Hadi. Expert Systems and Probabilistic Network Models. Springer-Verlag, New York, 1997. ISBN 0-387-94858-9
  • Judea Pearl. Fusion, propagation, and structuring in belief networks. Artificial Intelligence 29 (3): 241–288, 1986.
  • J.W. Comley and D.L. Dowe, "Minimum Message Length, MDL and Generalised Bayesian Networks with Asymmetric Languages", chapter 11 in P. Grunwald, M.A. Pitt and I.J. Myung (eds)., Advances in Minimum Description Length: Theory and Applications, M.I.T. Press, April 2005, ISBN 0262072629.
The contents of this article are licensed from Wikipedia.org under the GNU Free Documentation License.