Inference in Bayesian Networks | Bayesian Networks: Directed Models
Probabilistic Graphical Models Essentials

Inference in Bayesian Networks

In Bayesian networks, the main goal of inference is to compute marginal or conditional probabilities for variables of interest, possibly given some observed evidence. For example, you may want to know the probability that the grass is wet, or that it rained, given that the sprinkler was on. These queries allow you to reason about the likelihood of events in the world, based on the structure and parameters of your network.
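These queries reduce to sums over the joint distribution. For the Rain-Sprinkler-WetGrass network used below, the joint factorizes along the graph structure, and a marginal such as P(WetGrass = true) is obtained by summing out the unobserved variables. A sketch in LaTeX notation, matching the network structure assumed in this lesson:

```latex
P(R, S, W) = P(R)\, P(S \mid R)\, P(W \mid S, R)

P(W{=}\mathrm{true}) = \sum_{r \in \{t,f\}} \sum_{s \in \{t,f\}} P(r)\, P(s \mid r)\, P(W{=}\mathrm{true} \mid s, r)
```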

# Brute-force enumeration for P(WetGrass=True) in the Rain-Sprinkler-WetGrass network
# Variables: Rain (R), Sprinkler (S), WetGrass (W)
# Structure: Rain -> Sprinkler, Rain -> WetGrass, Sprinkler -> WetGrass

# Conditional Probability Tables (CPTs)
P_Rain = {True: 0.2, False: 0.8}
P_Sprinkler_given_Rain = {(True,): 0.01, (False,): 0.4}
P_WetGrass_given_Sprinkler_Rain = {
    (True, True): 0.99,
    (True, False): 0.9,
    (False, True): 0.8,
    (False, False): 0.0
}

def joint_prob(r, s, w):
    p_r = P_Rain[r]
    p_s = P_Sprinkler_given_Rain[(r,)] if s else 1 - P_Sprinkler_given_Rain[(r,)]
    p_w = P_WetGrass_given_Sprinkler_Rain[(s, r)] if w else 1 - P_WetGrass_given_Sprinkler_Rain[(s, r)]
    return p_r * p_s * p_w

# Compute P(WetGrass=True) by summing over all hidden variables (Rain, Sprinkler)
prob = 0.0
for r in [True, False]:
    for s in [True, False]:
        w = True
        prob += joint_prob(r, s, w)

print(f"P(WetGrass=True) = {prob:.4f}")

This brute-force approach works by summing over all possible assignments of the hidden variables, in this case Rain and Sprinkler, while fixing WetGrass=True. For each possible combination of Rain and Sprinkler, the code computes the joint probability using the chain rule as defined by the network's structure: first the prior for Rain, then the probability of Sprinkler given Rain, and finally the probability of WetGrass given both Sprinkler and Rain. The sum of these joint probabilities gives the desired marginal probability for WetGrass=True.
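The same enumeration also handles conditional queries, such as the probability of rain given evidence mentioned at the start of the lesson. A minimal sketch, reusing the lesson's CPTs and `joint_prob` definition, that computes the illustrative query P(Rain=True | WetGrass=True) by summing the joint over the hidden variable and normalizing by the evidence probability:

```python
# Conditional query by enumeration: P(Rain=True | WetGrass=True)
# = P(Rain=True, WetGrass=True) / P(WetGrass=True)

# Same CPTs as in the brute-force marginal example above
P_Rain = {True: 0.2, False: 0.8}
P_Sprinkler_given_Rain = {(True,): 0.01, (False,): 0.4}
P_WetGrass_given_Sprinkler_Rain = {
    (True, True): 0.99,
    (True, False): 0.9,
    (False, True): 0.8,
    (False, False): 0.0
}

def joint_prob(r, s, w):
    p_r = P_Rain[r]
    p_s = P_Sprinkler_given_Rain[(r,)] if s else 1 - P_Sprinkler_given_Rain[(r,)]
    p_w = P_WetGrass_given_Sprinkler_Rain[(s, r)] if w else 1 - P_WetGrass_given_Sprinkler_Rain[(s, r)]
    return p_r * p_s * p_w

# Numerator: sum the joint over Sprinkler, fixing Rain=True and WetGrass=True
numerator = sum(joint_prob(True, s, True) for s in [True, False])

# Denominator: sum the joint over Rain and Sprinkler, fixing WetGrass=True
denominator = sum(joint_prob(r, s, True)
                  for r in [True, False] for s in [True, False])

posterior = numerator / denominator
print(f"P(Rain=True | WetGrass=True) = {posterior:.4f}")
```

Dividing by the evidence probability is exactly the normalization step of Bayes' rule; note that the denominator is the same marginal P(WetGrass=True) computed in the example above.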


What best describes how the brute-force approach works for computing marginal probabilities in a Bayesian network?



Section 2. Chapter 3
