3.3 Bayesian Networks
Representation
Nodes represent variables in the Bayesian sense:
observable quantities, hidden variables or hypotheses.
Edges represent conditional dependencies.
Nodes that are not connected by any path represent variables that are conditionally
independent of each other.
e.g., Sprinkler and Rain in the accompanying example are conditionally independent given Cloudy:
P(Sprinkler | Cloudy, Rain) = P(Sprinkler | Cloudy)
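This equality can be checked numerically. The sketch below assumes the classic Cloudy → {Sprinkler, Rain} fragment; the conditional probability values are illustrative, not from the lecture:

```python
# Hypothetical CPTs for the Cloudy -> {Sprinkler, Rain} fragment
# (numbers are illustrative, not from the lecture).
P_C = {True: 0.5, False: 0.5}
P_S_given_C = {True: 0.1, False: 0.5}   # P(Sprinkler=T | Cloudy)
P_R_given_C = {True: 0.8, False: 0.2}   # P(Rain=T | Cloudy)

def joint(c, s, r):
    """P(C=c, S=s, R=r) from the factorisation P(C) P(S|C) P(R|C)."""
    ps = P_S_given_C[c] if s else 1 - P_S_given_C[c]
    pr = P_R_given_C[c] if r else 1 - P_R_given_C[c]
    return P_C[c] * ps * pr

# P(S=T | C=T, R=T): condition on both Cloudy and Rain
num = joint(True, True, True)
den = sum(joint(True, s, True) for s in (True, False))
p_s_given_c_r = num / den

# P(S=T | C=T): condition on Cloudy alone
num2 = sum(joint(True, True, r) for r in (True, False))
den2 = sum(joint(True, s, r) for s in (True, False) for r in (True, False))
p_s_given_c = num2 / den2

# Both come out equal, as the independence statement predicts.
assert abs(p_s_given_c_r - p_s_given_c) < 1e-9
```

Because the joint distribution factorises as P(C) P(S|C) P(R|C), conditioning on Rain adds no information about Sprinkler once Cloudy is known.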
[Figure: the three basic connection patterns over nodes A, B, C — Sequence (serial connection), Convergence (common effect), and Divergence (common cause).]
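The convergence (common-effect) pattern behaves differently from the other two: the parents are marginally independent but become dependent once the common effect is observed ("explaining away"). A minimal numeric sketch, assuming independent coin-flip parents B and C and a deterministic OR for the child A (all values illustrative):

```python
from itertools import product

# Hypothetical converging connection B -> A <- C (common effect).
# B and C are independent fair coins; A is true iff B or C.
def joint(b, c, a):
    p = 0.5 * 0.5                        # P(B=b) * P(C=c), independent priors
    return p if a == (b or c) else 0.0   # deterministic OR for P(A | B, C)

def prob(query, evidence):
    """P(query | evidence); both are dicts over {'B', 'C', 'A'}."""
    num = den = 0.0
    for b, c, a in product([True, False], repeat=3):
        world = {"B": b, "C": c, "A": a}
        p = joint(b, c, a)
        if all(world[k] == v for k, v in evidence.items()):
            den += p
            if all(world[k] == v for k, v in query.items()):
                num += p
    return num / den

# Marginally independent: observing C tells us nothing about B.
assert prob({"B": True}, {"C": True}) == 0.5
# Dependent given the common effect A: learning C=T "explains away" B.
print(prob({"B": True}, {"A": True}))              # 2/3
print(prob({"B": True}, {"A": True, "C": True}))   # 1/2
```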
Problem Solving for this Representation
A Bayesian network is a complete probabilistic model of the variables and their relationships, describing effects in terms of causes.
Inference typically aims to update our beliefs concerning causes (H) in the light of new evidence (E).
Bayesian inference derives the posterior probability of the subset of hypothesis variables through systematic use of Bayes' theorem:
P(H|E) = P(E|H) * P(H) / P(E)
Inference of this kind can be applied recursively throughout the whole structure.
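The update can be sketched in a few lines. The hypothesis names and probability values below are illustrative, not from the lecture:

```python
# Minimal Bayes update: posterior P(H | E) from a prior P(H) and
# likelihoods P(E | H). Hypotheses and numbers are made up.
def posterior(prior, likelihood):
    """prior: {h: P(h)}; likelihood: {h: P(E | h)} for the observed E."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(unnorm.values())            # P(E), the normalising evidence term
    return {h: p / z for h, p in unnorm.items()}

p = posterior({"flu": 0.1, "cold": 0.9}, {"flu": 0.9, "cold": 0.2})
print(p)   # flu: 1/3, cold: 2/3
```

The recursive character of the method comes from reuse: the posterior after one observation serves as the prior for the next, so evidence can be absorbed piece by piece throughout the network.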
Learning for this Representation
PARAMETER learning
• Learning parameters (conditional probabilities)
given a fixed variable structure
STRUCTURE learning
• Learning the variable structure
• Learning hidden variables (non-observable)
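Parameter learning for a fixed structure can be as simple as maximum-likelihood counting. The sketch below estimates P(Rain | Cloudy) from (cloudy, rain) samples; the data is made up for illustration:

```python
from collections import Counter

# Maximum-likelihood parameter learning for a fixed structure:
# estimate the CPT entry P(Rain=T | Cloudy=c) by relative frequency.
# The (cloudy, rain) samples below are illustrative, not real data.
data = [(True, True), (True, True), (True, False),
        (False, False), (False, False), (False, True)]

counts = Counter(data)

def p_rain_given_cloudy(c):
    n_c = sum(v for (ci, _), v in counts.items() if ci == c)  # count of C=c
    n_cr = counts[(c, True)]                                  # count of C=c, R=T
    return n_cr / n_c

print(p_rain_given_cloudy(True))    # 2/3
print(p_rain_given_cloudy(False))   # 1/3
```

Structure learning and learning with hidden variables are harder: they typically involve searching over graph structures or using EM-style algorithms, since the counts above are no longer directly observable.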
Neural Networks