Bayesian Networks II
Problem 2.1¶
Consider the following joint distribution $p(a, b, c)$ for three binary random variables $a$, $b$, and $c$:
| b | a | c | p(a, b, c) |
|---|---|---|---|
| 0 | 0 | 0 | 0.192 |
| 0 | 0 | 1 | 0.144 |
| 0 | 1 | 0 | 0.048 |
| 0 | 1 | 1 | 0.216 |
| 1 | 0 | 0 | 0.192 |
| 1 | 0 | 1 | 0.064 |
| 1 | 1 | 0 | 0.048 |
| 1 | 1 | 1 | 0.096 |
(a) Show that $a$ and $b$ are dependent, i.e. $p(a, b) \neq p(a)\, p(b)$. (b) Show that $a$ and $b$ are conditionally independent given $c$, i.e. $p(a, b \mid c) = p(a \mid c)\, p(b \mid c)$.
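Both claims can be checked numerically. Below is a minimal sketch using numpy; the array simply re-encodes the joint table above, indexed as p[b, a, c] with state 0 first:

```python
import numpy as np

# p[b, a, c]: the joint table above.
p = np.array([[[0.192, 0.144],    # b=0, a=0, c=0/1
               [0.048, 0.216]],   # b=0, a=1, c=0/1
              [[0.192, 0.064],    # b=1, a=0, c=0/1
               [0.048, 0.096]]])  # b=1, a=1, c=0/1

# (a) Marginal dependence: p(a, b) != p(a) p(b)
p_ab = p.sum(axis=2)                                  # p(b, a)
p_a, p_b = p.sum(axis=(0, 2)), p.sum(axis=(1, 2))     # p(a), p(b)
print(np.allclose(p_ab, np.outer(p_b, p_a)))          # False -> dependent

# (b) Conditional independence given c: p(a, b | c) = p(a | c) p(b | c)
p_c = p.sum(axis=(0, 1))                              # p(c)
p_ab_given_c = p / p_c                                # p(b, a | c)
p_a_given_c = p.sum(axis=0) / p_c                     # p(a | c)
p_b_given_c = p.sum(axis=1) / p_c                     # p(b | c)
print(np.allclose(p_ab_given_c,
                  p_b_given_c[:, None, :] * p_a_given_c[None, :, :]))  # True
```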
Problem 2.2¶
Conditional independence properties entailed by a Bayesian Network can be read directly from the graph, using the notion of d-separation. Given the following Bayesian Network (Graph nodes: A, B, C, D, E, F, G, H, I, J), which of the following conditional independence statements hold?
a) b) c) d) e) f) g)
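Such statements can also be checked mechanically with pgmpy. A minimal sketch follows; the edge list below is a hypothetical placeholder, since the actual graph is only given in the accompanying figure:

```python
from pgmpy.base import DAG

# Hypothetical edges; replace them with the edges of the exercise graph.
g = DAG([("A", "C"), ("B", "C"), ("C", "E"), ("D", "E")])

def d_separated(x, y, given=()):
    """X and Y are d-separated given Z iff there is no active trail between them."""
    return not g.is_dconnected(x, y, observed=list(given) or None)

print(d_separated("A", "B"))          # True: the path A -> C <- B is blocked at the collider C
print(d_separated("A", "B", ["C"]))   # False: conditioning on the collider C opens the path
```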
Problem 2.3¶
In the Cyanobacteria example, we have the variables $T, F, C, M, W$, where
Temperature: $T \in \{\text{cold}, \text{hot}\}$
Presence of fertilizer in water: $F \in \{\text{yes}, \text{no}\}$
Presence of cyanobacteria in water: $C \in \{\text{yes}, \text{no}\}$
Fish mortality: $M \in \{\text{yes}, \text{no}\}$
Water color: $W \in \{\text{clear}, \text{green}\}$
The joint factorization is
$$p(T, F, C, M, W) = p(T)\, p(F)\, p(C \mid T, F)\, p(M \mid C)\, p(W \mid C).$$
Assume the prior probability distributions are given by $p(T = \text{cold}) = 0.4$, $p(T = \text{hot}) = 0.6$, $p(F = \text{yes}) = 0.2$, and $p(F = \text{no}) = 0.8$.
The conditional probability table (CPT) for $p(C \mid T, F)$ is:

|  | T = cold, F = yes | T = cold, F = no | T = hot, F = yes | T = hot, F = no |
|---|---|---|---|---|
| C = yes | 0.50 | 0.05 | 0.95 | 0.80 |
| C = no | 0.50 | 0.95 | 0.05 | 0.20 |
The conditional probability table (CPT) for $p(M \mid C)$ is:

|  | C = yes | C = no |
|---|---|---|
| M = yes | 0.6 | 0.1 |
| M = no | 0.4 | 0.9 |
The conditional probability table (CPT) for $p(W \mid C)$ is:

|  | C = yes | C = no |
|---|---|---|
| W = clear | 0.7 | 0.2 |
| W = green | 0.3 | 0.8 |
We want to compute the posterior probability of fish mortality given colored water, $p(M \mid W = \text{green})$.
a) Eliminate variables sequentially by working with the chain-rule factorization of the joint probability,
$$p(T, F, C, M, W) = p(T)\, p(F)\, p(C \mid T, F)\, p(M \mid C)\, p(W \mid C),$$
where computing $p(M \mid W = \text{green})$ corresponds to
$$p(M \mid W = \text{green}) = \frac{\sum_{T, F, C} p(T)\, p(F)\, p(C \mid T, F)\, p(M \mid C)\, p(W = \text{green} \mid C)}{\sum_{M, T, F, C} p(T)\, p(F)\, p(C \mid T, F)\, p(M \mid C)\, p(W = \text{green} \mid C)}.$$
Describe how to compute $p(M \mid W = \text{green})$ by indicating the dimensions of the intermediate conditional probability tables.
b) Compute the conditional probability table (CPT) for . c) Compute the conditional probability table (CPT) for . d) Compute the conditional probability table (CPT) for . e) Compute the probability . f) Compute the conditional probability table (CPT) for . g) Compute the conditional probability table (CPT) for . h) Compute the probability table for . i) Compute . j) Compute the conditional probability . k) Compute the conditional probability by means of pgmpy.
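For parts b) through j), the intermediate results can be cross-checked numerically. The following is a minimal numpy sketch (not the requested hand derivation); the arrays simply re-encode the priors and CPTs given above, with index 0 corresponding to the first listed state of each variable:

```python
import numpy as np

# Priors and CPTs from the tables above.
p_T = np.array([0.4, 0.6])                 # p(T): [cold, hot]
p_F = np.array([0.2, 0.8])                 # p(F): [yes, no]
p_C_given_TF = np.array([                  # p(C | T, F), shape (T, F, C)
    [[0.50, 0.50], [0.05, 0.95]],          # T = cold: F = yes / F = no
    [[0.95, 0.05], [0.80, 0.20]],          # T = hot:  F = yes / F = no
])
p_M_given_C = np.array([[0.6, 0.4],        # p(M | C = yes): [yes, no]
                        [0.1, 0.9]])       # p(M | C = no)
p_W_given_C = np.array([[0.7, 0.3],        # p(W | C = yes): [clear, green]
                        [0.2, 0.8]])       # p(W | C = no)

# Eliminate T and F: p(C) = sum_{T,F} p(T) p(F) p(C | T, F)
p_C = np.einsum('t,f,tfc->c', p_T, p_F, p_C_given_TF)

# Eliminate C: p(M, W) = sum_C p(C) p(M | C) p(W | C)
p_MW = np.einsum('c,cm,cw->mw', p_C, p_M_given_C, p_W_given_C)

# Condition on W = green (index 1) and normalize.
p_M_given_green = p_MW[:, 1] / p_MW[:, 1].sum()
print(p_M_given_green)   # approximately [0.259, 0.741]
```

The printed values should agree with the hand computation in j) and the pgmpy result in k).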
l) MLE and MAP (Laplace smoothing): For the conditional probability : (i) give the maximum-likelihood estimator (MLE) in terms of counts, and (ii) give the MAP estimator using Laplace smoothing (add-one).
m) Handling missing entries - expected counts for MAP: Given the dataset with missing entries (observations ), write the symbolic expression for the expected count used in the expectation-maximization / MAP step. Explain briefly how this expectation is computed for: (i) an observation with no missing entries, (ii) an observation with some missing variables (show one short example).
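For reference, the general form of the two estimators asked for in l), written here for $p(M \mid C)$ purely as an illustrative choice (the CPT actually named in the problem statement is not legible above): with $n(\cdot)$ denoting the number of training cases matching a configuration,

$$\hat{p}_{\text{MLE}}(M = m \mid C = c) = \frac{n(M = m, C = c)}{\sum_{m'} n(M = m', C = c)},
\qquad
\hat{p}_{\text{Laplace}}(M = m \mid C = c) = \frac{n(M = m, C = c) + 1}{\sum_{m'} \big( n(M = m', C = c) + 1 \big)}.$$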
Problem 2.4¶
Suppose you wish to perform variable elimination on the Bayesian Network shown in the graph (Nodes: A, B, C, D, E, F, G, H, I, J). Consider the following variable elimination ordering: A, B, C, D, E, F, G, H, I, J.
For each iteration of the algorithm (i.e., for each variable in the ordering), determine which factors are removed and which new factors are introduced. As an example, in the first iteration (eliminating variable A), the factors and are removed, and a new factor is introduced.
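The bookkeeping required here can also be scripted. A minimal sketch in plain Python follows; the factor scopes in the example call are hypothetical placeholders, since the actual graph is only shown in the accompanying figure:

```python
def eliminate(scopes, order):
    """Report, for each eliminated variable, the removed factor scopes and
    the scope of the newly introduced factor."""
    scopes = [frozenset(s) for s in scopes]
    for var in order:
        removed = [s for s in scopes if var in s]
        introduced = frozenset().union(*removed) - {var}  # scope of the new factor
        scopes = [s for s in scopes if var not in s] + [introduced]
        print(f"eliminate {var}: remove {[set(s) for s in removed]}, "
              f"introduce {set(introduced)}")

# Hypothetical scopes, e.g. p(A), p(B | A), p(C | B):
eliminate([{"A"}, {"A", "B"}, {"B", "C"}], order=["A", "B", "C"])
```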
Problem 2.5¶
In this exercise, you will use the variable elimination algorithm to perform inference on a Bayesian Network. Consider the network with nodes A, B, C, D, E, F and the corresponding CPTs:
(a) :
| A | C | |
|---|---|---|
| f | f | 0.2 |
| f | t | 0.8 |
| t | f | 0.3 |
| t | t | 0.5 |
(b) :
| C | |
|---|---|
| f | 0.9 |
| t | 0.75 |
(c) :
| B | |
|---|---|
| f | 0.2 |
| t | 0.4 |
(d) :
| D | E | |
|---|---|---|
| f | f | 0.95 |
| f | t | 1.00 |
| t | f | 0.00 |
| t | t | 0.25 |
Assuming a query on A with evidence for B and D, i.e. computing , use the variable elimination algorithm to answer the following queries: a) b) c)
Consider now the variable elimination ordering: C, E, F, D, B, A. Apply the variable elimination algorithm again and write down the intermediate factors, this time without computing their probability tables. Discuss whether this ordering is better or worse than the one used previously, and explain why.
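pgmpy can run variable elimination with a user-specified ordering, which is useful for checking hand-derived factors against the library. A minimal sketch follows; the network below is a hypothetical placeholder (the structure of the exercise network is only given in the figure), and the `elimination_order` argument is assumed to accept the list of variables that get summed out:

```python
from pgmpy.models import DiscreteBayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Hypothetical toy chain A -> C -> E, standing in for the exercise network.
model = DiscreteBayesianNetwork([("A", "C"), ("C", "E")])
model.add_cpds(
    TabularCPD("A", 2, [[0.6], [0.4]]),
    TabularCPD("C", 2, [[0.2, 0.8], [0.8, 0.2]], evidence=["A"], evidence_card=[2]),
    TabularCPD("E", 2, [[0.9, 0.75], [0.1, 0.25]], evidence=["C"], evidence_card=[2]),
)
assert model.check_model()

infer = VariableElimination(model)
# elimination_order: the variables to sum out (everything except query and evidence).
q = infer.query(variables=["A"], evidence={"E": 1}, elimination_order=["C"])
print(q)
```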
Problem 2.6¶
In the Cyanobacteria example (variables $T, F, C, M, W$), the joint factorization is:
$$p(T, F, C, M, W) = p(T)\, p(F)\, p(C \mid T, F)\, p(M \mid C)\, p(W \mid C).$$
a) For the conditional probability : (i) give the maximum-likelihood estimator (MLE) in terms of counts. (ii) give the MAP estimator using Laplace smoothing (add-one).
b) We have a partially observed Bayesian Network. We want to estimate . Estimate the count by the expected count . (i) For , compute . (ii) For , compute . (iii) For , compute . (iv) Indicate in symbolic notation the MAP using the expected counts obtained.
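As general background for b) (the exact quantities in the problem statement are not legible above, so the count below is only an illustrative choice): in the E-step, a count over partially observed cases is replaced by its expectation under the current parameters $\theta$, e.g.

$$\mathbb{E}\big[n(C = \text{yes}, T = \text{hot})\big] = \sum_{i=1}^{N} p\big(C = \text{yes}, T = \text{hot} \mid \text{observed variables of case } i;\ \theta\big),$$

where a case with both $C$ and $T$ observed contributes exactly 1 or 0 to the sum, while a case with, say, $C$ missing and $T = \text{hot}$ observed contributes $p(C = \text{yes} \mid \text{observed variables of case } i;\ \theta)$.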
c) Read in cyanobacteria_data.csv and learn the parameters using pgmpy.
(i) Learn parameters without smoothing.
(ii) Learn parameters with smoothing.
Predict $p(M \mid W = \text{green})$.
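A minimal pgmpy sketch for c); the file layout (one column per variable T, F, C, M, W) and the state labels used in the evidence are assumptions about cyanobacteria_data.csv:

```python
import pandas as pd
from pgmpy.models import DiscreteBayesianNetwork
from pgmpy.estimators import MaximumLikelihoodEstimator, BayesianEstimator
from pgmpy.inference import VariableElimination

edges = [('T', 'C'), ('F', 'C'), ('C', 'M'), ('C', 'W')]
data = pd.read_csv("cyanobacteria_data.csv")  # assumed columns: T, F, C, M, W

# (i) Maximum-likelihood estimates (no smoothing).
model_mle = DiscreteBayesianNetwork(edges)
model_mle.fit(data, estimator=MaximumLikelihoodEstimator)

# (ii) Add-one (Laplace) smoothing: the K2 prior is a Dirichlet prior with
# all pseudo-counts equal to 1.
model_laplace = DiscreteBayesianNetwork(edges)
model_laplace.fit(data, estimator=BayesianEstimator, prior_type="K2")

# Predict p(M | W = green), assuming the CSV encodes water color as 'clear'/'green'.
infer = VariableElimination(model_laplace)
print(infer.query(variables=['M'], evidence={'W': 'green'}))
```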
d) Assume a dataset with unobserved fertilizer (F): cyanobacteria_unobserved_fertilizer.csv. Using Expectation Maximization (EM), estimate the values of the unobserved fertilizer variable using pgmpy. Then predict $p(M \mid W = \text{green})$.
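A minimal EM sketch for d); declaring F as a latent variable, the `latents` and `latent_card` arguments, and the CSV layout are assumptions about the data file and the installed pgmpy version:

```python
import pandas as pd
from pgmpy.models import DiscreteBayesianNetwork
from pgmpy.estimators import ExpectationMaximization
from pgmpy.inference import VariableElimination

# Assumed layout: columns T, C, M, W are observed; fertilizer F is missing entirely.
data = pd.read_csv("cyanobacteria_unobserved_fertilizer.csv")

# Declare F as latent so EM marginalizes over it in the E-step.
model = DiscreteBayesianNetwork(
    [('T', 'C'), ('F', 'C'), ('C', 'M'), ('C', 'W')], latents={'F'}
)

em = ExpectationMaximization(model, data)
model.add_cpds(*em.get_parameters(latent_card={'F': 2}))  # F assumed binary
assert model.check_model()

# Predict p(M | W = green), assuming water color is encoded as 'clear'/'green'.
print(VariableElimination(model).query(variables=['M'], evidence={'W': 'green'}))
```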
Problem 2.7¶
Consider the following causal DAG: $U$ (Genotype) $\to$ $X$ (Smoking), $U$ (Genotype) $\to$ $Y$ (Lung Cancer), $X$ (Smoking) $\to$ $Z$ (Tar Deposits), $Z$ (Tar Deposits) $\to$ $Y$ (Lung Cancer).
We define the joint distribution (marginalizing over $U$ implicitly in the provided CPTs for the exercise):
$$p(X, Z, Y) = p(X)\, p(Z \mid X)\, p(Y \mid Z).$$
$p(Z \mid X)$:
|  | Z=0 | Z=1 |
|---|---|---|
| X=0 | 0.95 | 0.05 |
| X=1 | 0.05 | 0.95 |
$p(Y \mid Z)$:
|  | Y=0 | Y=1 |
|---|---|---|
| Z=0 | 0.14 | 0.86 |
| Z=1 | 0.24 | 0.76 |
a) Construct the Bayesian network in pgmpy. b) Learn the parameters from the given CPTs. c) Compute the probabilistic (associational) query $p(Y = 1 \mid X = 1)$. d) (Optional) Explain briefly why this differs from the causal effect of smoking on lung cancer.
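As a sanity check for c): in the reduced model $X \to Z \to Y$, $Y$ is independent of $X$ given $Z$, so the associational query follows by summing over $Z$:

$$p(Y = 1 \mid X = 1) = \sum_{z} p(Y = 1 \mid Z = z)\, p(Z = z \mid X = 1) = 0.86 \cdot 0.05 + 0.76 \cdot 0.95 = 0.765.$$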
Solutions¶
Solution 2.1 a) & b) Calculations provided showing $p(a, b) \neq p(a)\, p(b)$ but $p(a, b \mid c) = p(a \mid c)\, p(b \mid c)$.
Solution 2.2 a) No, b) Yes, c) No, d) Yes, e) Yes, f) No, g) No.
Solution 2.3 (a - k) Calculations for Variable Elimination steps provided. Result for j: $p(M = \text{yes} \mid W = \text{green}) \approx 0.259$, $p(M = \text{no} \mid W = \text{green}) \approx 0.741$.
Solution 2.6 (Python Code Snippet)

```python
# Compute p(m | w = "green") using pgmpy
from pgmpy.models import DiscreteBayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination
# 1. Define the Bayesian Network structure
model = DiscreteBayesianNetwork([
('T', 'C'),
('F', 'C'),
('C', 'M'),
('C', 'W')
])
# 2. Define the Conditional Probability Tables (CPTs)
cpd_T = TabularCPD(variable='T', variable_card=2, values=[[0.4], [0.6]], state_names={'T': ['cold', 'hot']})
cpd_F = TabularCPD(variable='F', variable_card=2, values=[[0.2], [0.8]], state_names={'F': ['yes', 'no']})
cpd_C = TabularCPD(variable='C', variable_card=2,
values=[[0.5, 0.05, 0.95, 0.8], [0.5, 0.95, 0.05, 0.2]],
evidence=['T', 'F'], evidence_card=[2, 2],
state_names={'C': ['yes', 'no'], 'T': ['cold', 'hot'], 'F': ['yes', 'no']})
cpd_M = TabularCPD(variable='M', variable_card=2,
values=[[0.6, 0.1], [0.4, 0.9]],
evidence=['C'], evidence_card=[2],
state_names={'M': ['yes', 'no'], 'C': ['yes', 'no']})
cpd_W = TabularCPD(variable='W', variable_card=2,
values=[[0.7, 0.2], [0.3, 0.8]],
evidence=['C'], evidence_card=[2],
state_names={'W': ['clear', 'green'], 'C': ['yes', 'no']})
# 3. Add CPTs to the model
model.add_cpds(cpd_T, cpd_F, cpd_C, cpd_M, cpd_W)
assert model.check_model()
# 4. Perform inference
infer = VariableElimination(model)
posterior = infer.query(variables=['M'], evidence={'W': 'green'})
print(posterior)
```

Solution 2.7 (Python Code Snippet)

```python
from pgmpy.models import DiscreteBayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination
# Define structure
model = DiscreteBayesianNetwork([("X", "Z"), ("Z", "Y")])
# Define CPTS
cpd_x = TabularCPD("X", 2, [[0.5], [0.5]])
cpd_z = TabularCPD("Z", 2, [[0.95, 0.05], [0.05, 0.95]], evidence=["X"], evidence_card=[2])
cpd_y = TabularCPD("Y", 2, [[0.14, 0.24], [0.86, 0.76]], evidence=["Z"], evidence_card=[2])
# Add to model
model.add_cpds(cpd_x, cpd_z, cpd_y)
model.check_model()
# Inference
infer = VariableElimination(model)
q = infer.query(variables=["Y"], evidence={"X": 1})
print(q)
# Output: P(Y=1 | X=1) ≈ 0.765
```