CS 188 Sp07 Discussion Note Week 8 – Discrete Bayesian Network Inference
by Nuttapong Chentanez
Independence in BN: Are two nodes conditionally independent given certain evidence?
Causal Chain: X → Y → Z
Are X and Z always independent?
Are X and Z independent given Y?
Common Cause: X ← Y → Z
X = Newsgroup busy, Y = Project due, Z = Lab full
Are X and Z independent given Y?
Common Effect: X → Y ← Z
X: CS188 project due, Z: CS184 project due, Y: Lack of sleep
Are X and Z independent?
Are X and Z independent given Y?
General Case:
Bayes Ball Algorithm:
Example on board:
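Since the Bayes Ball check amounts to a reachability test over active triples, here is a minimal Python sketch of a d-separation checker (a sketch only: the function names and the parent-list graph encoding are assumptions, not part of the original note):

```python
# Minimal d-separation checker in the spirit of the Bayes Ball algorithm.
# A network is a dict mapping each node to the list of its parents.

def children(graph, node):
    """Nodes that have `node` as a parent."""
    return [n for n, parents in graph.items() if node in parents]

def descendants(graph, node):
    """All descendants of `node` (not including `node` itself)."""
    result, stack = set(), [node]
    while stack:
        for c in children(graph, stack.pop()):
            if c not in result:
                result.add(c)
                stack.append(c)
    return result

def path_is_active(graph, path, evidence):
    """A path is active given the evidence if every consecutive triple is active."""
    for a, b, c in zip(path, path[1:], path[2:]):
        collider = a in graph[b] and c in graph[b]        # a -> b <- c (common effect)
        if collider:
            # Active only if b or one of b's descendants is observed.
            if b not in evidence and not (descendants(graph, b) & evidence):
                return False
        else:
            # Chain or common cause: blocked if the middle node is observed.
            if b in evidence:
                return False
    return True

def d_separated(graph, x, z, evidence):
    """True if x and z are conditionally independent given evidence
    (no active undirected path from x to z)."""
    def dfs(path):
        node = path[-1]
        if node == z:
            return path_is_active(graph, path, evidence)
        for nbr in set(graph[node]) | set(children(graph, node)):
            if nbr not in path and dfs(path + [nbr]):
                return True
        return False
    return not dfs([x])

# Causal chain X -> Y -> Z.
chain = {'X': [], 'Y': ['X'], 'Z': ['Y']}
print(d_separated(chain, 'X', 'Z', set()))     # False: dependent in general
print(d_separated(chain, 'X', 'Z', {'Y'}))     # True: independent given Y

# Common effect X -> Y <- Z.
collider = {'X': [], 'Z': [], 'Y': ['X', 'Z']}
print(d_separated(collider, 'X', 'Z', set()))  # True: independent
print(d_separated(collider, 'X', 'Z', {'Y'}))  # False: dependent given Y
```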
Exact Inference
Inference by enumeration
Write the query as a sum, over the hidden variables, of products of CPT entries; we can then use the numbers from the CPTs to evaluate this expression.
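As an illustration, the sketch below enumerates the full joint of the Cloudy/Sprinkler/Rain/WetGrass network used in the sampling section of this note; the CPT numbers are the usual textbook values and, like the helper names, are assumptions rather than part of the note:

```python
from itertools import product

# Assumed textbook CPTs for the Cloudy/Sprinkler/Rain/WetGrass network.
P_C = {True: 0.5, False: 0.5}
P_S = {True: {True: 0.1, False: 0.9}, False: {True: 0.5, False: 0.5}}  # P_S[c][s] = P(S=s | C=c)
P_R = {True: {True: 0.8, False: 0.2}, False: {True: 0.2, False: 0.8}}  # P_R[c][r] = P(R=r | C=c)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.0}                       # P(W=true | S=s, R=r)

def joint(c, s, r, w):
    """P(c, s, r, w) as a product of CPT entries (the BN chain rule)."""
    pw = P_W[(s, r)] if w else 1.0 - P_W[(s, r)]
    return P_C[c] * P_S[c][s] * P_R[c][r] * pw

def query_C_given(evidence):
    """P(C | evidence) by summing the joint over all hidden variables."""
    dist = {}
    for c in (True, False):
        total = 0.0
        for s, r, w in product((True, False), repeat=3):
            assignment = {'C': c, 'S': s, 'R': r, 'W': w}
            if all(assignment[v] == val for v, val in evidence.items()):
                total += joint(c, s, r, w)
        dist[c] = total
    z = sum(dist.values())                       # normalize
    return {c: p / z for c, p in dist.items()}

print(query_C_given({'R': True}))                # P(C | r)
print(query_C_given({'R': True, 'W': True}))     # P(C | r, w)
```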
A slightly better idea
Push the summations as far inside as possible; this reduces the number of multiplications.
The evaluation tree below shows how we evaluate the expression.
Notice that some subexpressions in the tree are evaluated twice.
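For instance, with the Cloudy/Sprinkler/Rain/WetGrass network and the query P(w), pushing the sum over C inside gives (an illustrative rewriting, not the handout's original example):

P(w) = Σ_c Σ_s Σ_r P(c) P(s|c) P(r|c) P(w|s,r)
     = Σ_s Σ_r P(w|s,r) Σ_c P(c) P(s|c) P(r|c)

Each value of P(w|s,r) is now used once, instead of being re-evaluated in both the c and ¬c branches of the tree, and the inner sum over c is computed once per (s, r) pair.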
Variable Elimination
- Reduces redundant work.
- Track intermediate objects called factors.
- Subscripts denote which variables were used to produce the factor.
- Subscripts with a bar indicate the variables that have been summed over.
- The arguments of a factor are its free variables.
- The product of factors is a pointwise product; summing a variable out of the product eliminates that argument.
- Similar to a database join.
Detailed example on board.
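A rough sketch of the two factor operations (pointwise product and summing out) in Python, with hypothetical helper names and the same assumed textbook CPT numbers as in the enumeration sketch above:

```python
from itertools import product

# A factor is (variables, table): table maps a tuple of values for those
# variables to a number.

def make_factor(variables, table):
    return (tuple(variables), dict(table))

def pointwise_product(f1, f2):
    """Join two factors on their shared variables (like a database join)."""
    vars1, t1 = f1
    vars2, t2 = f2
    out_vars = tuple(dict.fromkeys(vars1 + vars2))   # union, order-preserving
    out = {}
    for vals in product((True, False), repeat=len(out_vars)):
        assign = dict(zip(out_vars, vals))
        v1 = t1[tuple(assign[v] for v in vars1)]
        v2 = t2[tuple(assign[v] for v in vars2)]
        out[vals] = v1 * v2
    return (out_vars, out)

def sum_out(var, factor):
    """Eliminate `var` by summing it out of the factor."""
    variables, table = factor
    out_vars = tuple(v for v in variables if v != var)
    out = {}
    for vals, p in table.items():
        assign = dict(zip(variables, vals))
        key = tuple(assign[v] for v in out_vars)
        out[key] = out.get(key, 0.0) + p
    return (out_vars, out)

# Example: eliminate C from P(C), P(S|C), P(R|C) to get a factor over S and R.
f_C  = make_factor(['C'], {(True,): 0.5, (False,): 0.5})
f_SC = make_factor(['S', 'C'], {(True, True): 0.1, (False, True): 0.9,
                                (True, False): 0.5, (False, False): 0.5})
f_RC = make_factor(['R', 'C'], {(True, True): 0.8, (False, True): 0.2,
                                (True, False): 0.2, (False, False): 0.8})

f_join = pointwise_product(pointwise_product(f_C, f_SC), f_RC)
f_SR = sum_out('C', f_join)   # factor over S and R with C eliminated
print(f_SR)
```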
Approximate Inference
Sampling
- Draw N samples from a sampling distribution
- Compute an approximate posterior probability
- Show that it converges to the true probability for large N
Prior Sampling
Sample Cloudy first, then sample Sprinkler and Rain depending on the sampled value of Cloudy, then sample WetGrass depending on the sampled values of Sprinkler and Rain.
Probability that a particular sample (x1, ..., xn) is generated by this method:
S_PS(x1, ..., xn) = Π_i P(xi | Parents(Xi)) = P(x1, ..., xn)
Probability of an observation, based on the experiment (the fraction of the N samples that match):
N_PS(x1, ..., xn) / N
As N → ∞, the two are equal.
Therefore, we can use the experimental (sample) probability to estimate the true prior probability.
Example: suppose prior sampling produces the five samples
c, s, r, w
c, s, r, w
c, s, r, w
c, s, r, w
c, s, r, w
Estimate P(W)?  P(C | r)?  P(C | r, w)?
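A small Python sketch of prior sampling for this network (again with the assumed textbook CPTs; prior_sample is a hypothetical name):

```python
import random

def prior_sample():
    """Draw one sample in topological order, conditioning on sampled parents."""
    c = random.random() < 0.5                        # C ~ P(C)
    s = random.random() < (0.1 if c else 0.5)        # S ~ P(S | c)
    r = random.random() < (0.8 if c else 0.2)        # R ~ P(R | c)
    p_w = {(True, True): 0.99, (True, False): 0.90,
           (False, True): 0.90, (False, False): 0.0}[(s, r)]
    w = random.random() < p_w                        # W ~ P(W | s, r)
    return c, s, r, w

# Estimate P(W) by counting, as in the example above.
N = 100000
samples = [prior_sample() for _ in range(N)]
print(sum(w for _, _, _, w in samples) / N)
```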
Rejection Sampling (when we have evidence)
To find P(C), we don't need to keep all the samples; we just need the counts.
For P(C | r), count only those samples in which r holds.
Problem: it may take a long time to get any sample consistent with the evidence.
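A sketch of rejection sampling for P(C | r), reusing the hypothetical prior_sample() from the sketch above:

```python
def rejection_sample_C_given_r(n):
    """Draw prior samples, discard those with R = false, count C among the rest."""
    kept = [c for c, s, r, w in (prior_sample() for _ in range(n)) if r]
    if not kept:
        return None            # can happen when the evidence is unlikely
    p_c = sum(kept) / len(kept)
    return {True: p_c, False: 1.0 - p_c}

print(rejection_sample_C_given_r(100000))   # estimate of P(C | r)
```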
Likelihood weighting
Generate only samples that are consistent with the evidence, and weight them so that the estimate still reflects the true probability.
E.g., P(C | s):
c, s, r, w    weight = 0.1
¬c, s, r, w   weight = 0.5
P(C | s) = normalize(<1 x 0.1, 1 x 0.5>) = (1/6, 5/6)
Sampling distribution if z is sampled and e is fixed:
S_WS(z, e) = Π_i P(zi | Parents(Zi))
Samples have weights:
w(z, e) = Π_j P(ej | Parents(Ej))
Together, the weighted sampling distribution is consistent:
S_WS(z, e) · w(z, e) = Π_i P(zi | Parents(Zi)) · Π_j P(ej | Parents(Ej)) = P(z, e)
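A sketch of likelihood weighting for P(C | s) on the same network (assumed CPTs and hypothetical helper names); note that the per-sample weights 0.1 and 0.5 match the example above:

```python
import random

def weighted_sample_given_s():
    """Fix the evidence S = true, sample the rest, and accumulate the weight."""
    weight = 1.0
    c = random.random() < 0.5                     # sample C ~ P(C)
    s = True                                      # evidence: S = true
    weight *= 0.1 if c else 0.5                   # multiply by P(s | c)
    r = random.random() < (0.8 if c else 0.2)     # sample R ~ P(R | c)
    p_w = {(True, True): 0.99, (True, False): 0.90,
           (False, True): 0.90, (False, False): 0.0}[(s, r)]
    w = random.random() < p_w                     # sample W ~ P(W | s, r)
    return (c, s, r, w), weight

def lw_estimate_C_given_s(n):
    totals = {True: 0.0, False: 0.0}
    for _ in range(n):
        (c, _, _, _), weight = weighted_sample_given_s()
        totals[c] += weight
    z = totals[True] + totals[False]
    return {c: t / z for c, t in totals.items()}  # normalize the weighted counts

print(lw_estimate_C_given_s(100000))              # estimate of P(C | s)
```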
Exercise
1.
A simple Bayes net with Boolean variables I = Intelligent, H = Honest, P = Popular, L = LotsOfCampaignFunds, E = Elected.
a. Which of the following are asserted by the network structure (ignoring the CPTs)?
b. Calculate P(i, h, ~l, p, ~e)
c. Calculate the probability that someone is intelligent given that they are honest, have few campaign funds, and are elected.
d. True/False If there are two candidates in the race, then making two copies of the network will correctly represent the joint distribution over the two sets of variables.
2.
Is X2 ⊥ X3 | {X1, X6}? How about X1 ⊥ X6 | {X2, X3}?