[Week 1-12] NPTEL An Introduction to Artificial Intelligence Assignment Answers 2022

Week 9

This document contains a series of questions and answers on Bayesian networks and sampling methods, focusing on rejection sampling, likelihood weighting, and MCMC with Gibbs sampling. It includes probability calculations, properties of Bayesian networks, and methods for learning their structure, together with the numerical answer to each problem.

Q1. Consider the following Bayesian network with binary variables.

Calculate the probability P(a | d, e) using rejection sampling, given the following
samples. Return the answer as a decimal rounded to 2 decimal points (for example, if
it is 0.333, return 0.33).

Accepted Answers:
(Type: Numeric) 0.75
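
As a minimal sketch of the computation (the network and sample table are only shown as images in the original question, so the samples below are a hypothetical stand-in):

```python
# Rejection-sampling estimate of P(a | d, e): discard samples inconsistent
# with the evidence, then take the fraction of survivors with A = true.
# HYPOTHETICAL samples; the real table is an image in the original question.
samples = [
    {"A": True,  "B": False, "C": True,  "D": True,  "E": True},
    {"A": True,  "B": True,  "C": False, "D": True,  "E": True},
    {"A": False, "B": False, "C": True,  "D": True,  "E": True},
    {"A": True,  "B": False, "C": False, "D": False, "E": True},
]

accepted = [s for s in samples if s["D"] and s["E"]]   # keep only d, e samples
p_a = sum(s["A"] for s in accepted) / len(accepted)    # fraction with A = true
print(round(p_a, 2))  # 0.67 for this made-up data; the actual samples give 0.75
```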
Q2. Which of the following statements are true?
Rejection sampling samples from the prior distribution
Rejection sampling samples from the posterior distribution
Likelihood sampling samples from the prior distribution
Likelihood sampling samples from the posterior distribution

Accepted Answers:
Rejection sampling samples from the prior distribution
Q3. Which of the following properties are valid for the environment of the Turing
Test?
Fully observable
Multi-Agent
Dynamic
Stochastic

Accepted Answers:
Multi-Agent
Dynamic
Stochastic
Q4. Consider the following Bayesian Network. Suppose you are doing likelihood
weighting to determine P(s | ¬w, c).

What is the weight of the sample (c, s, r, ¬w)?

Return the answer as a decimal rounded to 3 decimal places (for example, if it is
0.1234, return 0.123).
Accepted Answers:
(Type: Numeric) 0.005
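
In likelihood weighting the weight of a sample is the product, over the evidence variables, of the probability of the observed evidence value given its parents' values in the sample. The figure is an image, but the accepted answer is consistent with the textbook cloudy/sprinkler/rain/wet-grass CPTs, assumed in this sketch:

```python
# Likelihood-weighting weight of the sample (c, s, r, ¬w) with evidence
# {C = true, W = false}. CPT values assumed from the textbook network.
P_c = 0.5             # P(C = true); C has no parents
P_w_given_s_r = 0.99  # P(W = true | S = true, R = true)

# Weight = product over evidence variables of P(observed value | parents):
weight = P_c * (1 - P_w_given_s_r)
print(round(weight, 3))  # 0.005
```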
Q5. Suppose we use MCMC with Gibbs sampling to determine P(s|w) in the above
problem. Which of the following are correct statements in this case?
We might need to calculate P(w|¬s,c,r) during the sampling process.
The relative frequency of reaching the states with S assigned true after
sufficiently many steps will provide an estimate of P(s|w)
We can get a reliable estimate of probability by using the first few samples
only.
Sampling using MCMC is asymptotically equivalent to sampling from the
prior probability distribution.

Accepted Answers:
The relative frequency of reaching the states with S assigned true after
sufficiently many steps will provide an estimate of P(s|w)
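
A minimal sketch of the Gibbs loop: fix the evidence W = true, repeatedly resample each non-evidence variable given the rest of the state, and count how often S is true. The CPT numbers are again the assumed textbook values:

```python
import random

# Gibbs-sampling sketch for estimating P(s | w). The CPTs are the textbook
# cloudy/sprinkler/rain/wet-grass values (assumed here, since the figure is
# an image; they are consistent with the 0.005 weight in Q4).
P_C = 0.5
P_S = {True: 0.1, False: 0.5}                    # P(S = true | C)
P_R = {True: 0.8, False: 0.2}                    # P(R = true | C)
P_W = {(True, True): 0.99, (True, False): 0.90,  # P(W = true | S, R)
       (False, True): 0.90, (False, False): 0.00}

def joint(s):
    """Full-joint probability of one complete assignment."""
    p = P_C if s["C"] else 1 - P_C
    p *= P_S[s["C"]] if s["S"] else 1 - P_S[s["C"]]
    p *= P_R[s["C"]] if s["R"] else 1 - P_R[s["C"]]
    pw = P_W[(s["S"], s["R"])]
    return p * (pw if s["W"] else 1 - pw)

def gibbs(steps=200_000, burn_in=2_000):
    state = {"C": True, "S": True, "R": True, "W": True}  # W = true is evidence, kept fixed
    hits = 0
    for t in range(steps):
        for var in ("C", "S", "R"):               # resample non-evidence variables only
            state[var] = True;  p1 = joint(state)
            state[var] = False; p0 = joint(state)
            state[var] = random.random() < p1 / (p1 + p0)
        if t >= burn_in and state["S"]:
            hits += 1
    # The relative frequency of S = true after many steps estimates P(s | w).
    return hits / (steps - burn_in)

print(round(gibbs(), 2))  # ≈ 0.43 with these assumed CPTs
```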
Q6. Consider the following Bayesian Network. What is the Markov Blanket of C?
Return the answer as a lexicographically sorted string (for example, if the blanket
consists of the nodes A, D and C return ACD)

Accepted Answers:
(Type: String) ABDEFG
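
The Markov blanket of a node consists of its parents, its children, and its children's other parents. The real network is only shown as an image, so the parent map in this sketch is hypothetical, chosen to be consistent with the accepted answer:

```python
# Sketch: computing a Markov blanket (parents + children + children's other
# parents) from a parent map. The graph below is HYPOTHETICAL; the actual
# network is only shown as an image in the original question.
parents = {
    "A": [], "B": [], "C": ["A", "B"],
    "D": ["C"], "E": ["C", "F"], "F": [], "G": ["C"],
}

def markov_blanket(node):
    children = [v for v, ps in parents.items() if node in ps]
    blanket = set(parents[node]) | set(children)
    for child in children:                 # add each child's other parents
        blanket |= set(parents[child])
    blanket.discard(node)
    return "".join(sorted(blanket))        # lexicographically sorted string

print(markov_blanket("C"))  # ABDEFG for this hypothetical graph
```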
Q7. Which of the following provides a plausible way to learn the structure of Bayesian
networks from data?
Bayesian learning
Local search in the space of possible structures
MAP
MLE

Accepted Answers:
Local search in the space of possible structures
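
As a sketch of that approach, hill climbing over candidate structures: propose single-edge changes (add, delete, or reverse), score each candidate against the data, and greedily keep improvements. Here `score` stands in for a data-dependent scoring function such as BIC, and `is_dag` for an acyclicity check; both are assumed to be supplied by the caller:

```python
import itertools

# Sketch of greedy local search (hill climbing) in the space of structures.
def hill_climb(nodes, score, is_dag):
    edges = set()                                 # start from the empty graph
    improved = True
    while improved:
        improved = False
        best, best_score = None, score(edges)
        for u, v in itertools.permutations(nodes, 2):
            if (u, v) in edges:                   # neighbours: delete or reverse the edge
                candidates = [edges - {(u, v)}, (edges - {(u, v)}) | {(v, u)}]
            else:                                 # neighbour: add the edge
                candidates = [edges | {(u, v)}]
            for cand in candidates:
                if is_dag(cand) and score(cand) > best_score:
                    best, best_score = cand, score(cand)
        if best is not None:
            edges, improved = best, True
    return edges                                  # locally optimal structure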
Q8.
Consider the following Bayesian Network, where each variable is binary.

We have the following training examples for the above Bayesian net, where two
examples contain unobserved values (denoted by ?). All of the parameters of the
Bayesian network are initially set to 0.5, except for P(b) and P(c|¬a,¬b), which are
initialised to 0.8. What is the value of P(c|a,b) after simulating the second M step of
the simple (hard) EM algorithm? If the answer is the fraction m/n, where m and n have
no common factors, return m+n (e.g., 3 if the answer is 2/4).

Accepted Answers:
(Type: Numeric) 2
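
Note that since m and n share no common factors and m + n = 2, the fraction can only be 1/1; that is, after the second hard-EM M step every completed example with a and b true also has c true, giving P(c|a,b) = 1.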


Q9. Ram is given a possibly biased coin and is supposed to estimate the
probability of it turning up heads. Ram tosses the coin 5 times and gets heads 3 times.
Suppose that Ram uses maximum likelihood estimation as the learning algorithm for this task.
What, according to him, is the probability of getting heads for the coin?
Give the answer rounded off to 1 decimal place.

Accepted Answers:
(Type: Numeric) 0.6
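
Worked out: the maximum-likelihood estimate of a Bernoulli parameter is the observed frequency of heads, 3/5 = 0.6.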
Q10. Suppose that Ram has a prior that the probability of the coin turning up heads is one
of 0.4 (case 1), 0.5 (case 2), or 0.6 (case 3), with probability 1/3 each. Ram tosses the coin
once and gets heads. What is the posterior probability of case 1 given this
observation?

Give the answer rounded off to 2 decimal places.


Accepted Answers:
(Type: Numeric) 0.27
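
Worked out: by Bayes' rule with the uniform 1/3 prior, P(case 1 | heads) = (1/3 × 0.4) / (1/3 × (0.4 + 0.5 + 0.6)) = 0.4 / 1.5 ≈ 0.27.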
