
Exercises for the course Machine Learning 2
Summer semester 2021

Abteilung Maschinelles Lernen
Institut für Softwaretechnik und theoretische Informatik
Fakultät IV, Technische Universität Berlin
Prof. Dr. Klaus-Robert Müller
Email: [email protected]

Exercise Sheet 6
Exercise 1: Markov Model Forward Problem (20 P)
A Markov Model can be seen as a joint distribution over the states $q_1, \ldots, q_T$ at each time step, where $q_t \in \{S_1, \ldots, S_N\}$, and where the probability distribution has the factored structure:

$$P(q_1, \ldots, q_T) = P(q_1) \cdot \prod_{t=2}^{T} P(q_t \mid q_{t-1})$$

The factors are the probability of the initial state and the conditional (transition) distributions at every subsequent time step.
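For intuition, the factorization can be evaluated directly, term by term. Below is a minimal NumPy sketch; the names `pi` (initial distribution) and `A` (row-stochastic transition matrix) are illustrative assumptions, not notation from the sheet:

```python
import numpy as np

def joint_probability(states, pi, A):
    """Evaluate P(q_1, ..., q_T) = P(q_1) * prod_{t=2..T} P(q_t | q_{t-1}).

    states : sequence of 0-based state indices (q_1, ..., q_T)
    pi     : initial distribution, pi[i] = P(q_1 = S_i)
    A      : transition matrix, A[i, j] = P(q_{t+1} = S_j | q_t = S_i)
    """
    p = pi[states[0]]
    for prev, curr in zip(states[:-1], states[1:]):
        p *= A[prev, curr]
    return p

# Example with N = 2 states
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])
print(joint_probability([0, 1, 1], pi, A))  # 0.6 * 0.3 * 0.8 = 0.144
```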

(a) Show that the following relation holds:


$$P(q_{t+1} = S_j) = \sum_{i=1}^{N} P(q_t = S_i)\, P(q_{t+1} = S_j \mid q_t = S_i)$$

for $t \in \{1, \ldots, T-1\}$ and $j \in \{1, \ldots, N\}$.
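In matrix notation this relation is a single vector-matrix product, which makes it easy to check numerically. A minimal sketch, again assuming an illustrative row-stochastic transition matrix `A`:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 3

# Random distribution over states at time t, random row-stochastic transitions
p_t = rng.random(N); p_t /= p_t.sum()                       # p_t[i] = P(q_t = S_i)
A = rng.random((N, N)); A /= A.sum(axis=1, keepdims=True)   # A[i, j] = P(q_{t+1} = S_j | q_t = S_i)

# P(q_{t+1} = S_j) = sum_i P(q_t = S_i) P(q_{t+1} = S_j | q_t = S_i)
p_next = p_t @ A
print(p_next.sum())  # sums to 1, i.e. a valid distribution over q_{t+1}
```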

Exercise 2: Hidden Markov Model Forward Problem (20 P)


A Hidden Markov Model (HMM) can be seen as a joint distribution over the hidden states $q_1, \ldots, q_T$ at each time step and the corresponding observations $O_1, \ldots, O_T$. As for the Markov Model, we have $q_t \in \{S_1, \ldots, S_N\}$. The probability distribution of the HMM has the factored structure:

$$P(q_1, \ldots, q_T, O_1, \ldots, O_T) = P(q_1) \cdot \prod_{t=2}^{T} P(q_t \mid q_{t-1}) \cdot \prod_{t=1}^{T} P(O_t \mid q_t)$$

The factors are the probability of the initial state, the transition distributions, and the emission distributions $P(O_t \mid q_t)$ at every time step.
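Analogously to Exercise 1, the HMM joint can be evaluated factor by factor. A minimal sketch, assuming discrete observation symbols with an illustrative emission matrix `B`, where `B[j, o] = P(O_t = o | q_t = S_j)` (this parameterization is an assumption, not given on the sheet):

```python
import numpy as np

def hmm_joint_probability(states, observations, pi, A, B):
    """Evaluate P(q_1, ..., q_T, O_1, ..., O_T)
    = P(q_1) * prod_{t=2..T} P(q_t | q_{t-1}) * prod_{t=1..T} P(O_t | q_t)."""
    p = pi[states[0]] * B[states[0], observations[0]]
    for t in range(1, len(states)):
        p *= A[states[t - 1], states[t]] * B[states[t], observations[t]]
    return p

# Example: 2 states, 2 observation symbols
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.2, 0.8]])
B = np.array([[0.9, 0.1], [0.4, 0.6]])
print(hmm_joint_probability([0, 1], [0, 1], pi, A, B))  # 0.6*0.9 * 0.3*0.6
```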

(a) Show that the following relation holds:


$$P(O_1, \ldots, O_t, O_{t+1}, q_{t+1} = S_j) = \sum_{i=1}^{N} P(O_1, \ldots, O_t, q_t = S_i)\, P(q_{t+1} = S_j \mid q_t = S_i)\, P(O_{t+1} \mid q_{t+1} = S_j)$$

for $t \in \{1, \ldots, T-1\}$ and $j \in \{1, \ldots, N\}$.
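The quantity on the left is the classical forward variable $\alpha_{t+1}(j) = P(O_1, \ldots, O_{t+1}, q_{t+1} = S_j)$, and the relation is exactly its recursive update. A minimal NumPy sketch of the resulting forward algorithm, using the same illustrative parameterization `pi`, `A`, `B` assumed above:

```python
import numpy as np

def forward(observations, pi, A, B):
    """Forward algorithm: row t (0-based) holds alpha_{t+1}(j) = P(O_1, ..., O_{t+1}, q_{t+1} = S_j)."""
    T, N = len(observations), len(pi)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, observations[0]]                      # base case: P(O_1, q_1 = S_j)
    for t in range(T - 1):
        # alpha_{t+1}(j) = (sum_i alpha_t(i) * A[i, j]) * B[j, O_{t+1}]
        alpha[t + 1] = (alpha[t] @ A) * B[:, observations[t + 1]]
    return alpha

# The likelihood of the full observation sequence is then alpha[-1].sum()
```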

Exercise 3: Programming (60 P)


Download the programming files from ISIS and follow the instructions.
