
SIMULATION AND MODELLING

Sujan Karki
Email: [email protected]
Contact No.: 9819387234
Master’s in Information System Engineering (MscIne) *Ongoing
Bachelor in Computer Engineering – Purwanchal Campus, IOE
UNIT 5

Markov Chain

5. Markov chains
1. Key features of Markov chains
2. Markov process with example
3. Application of Markov chain

Markov Process
o If the future outcome of a process depends only on the present outcome and not on past outcomes, then the process is said to be a Markov process.
o Markov process models are useful in studying the evolution of systems over repeated trials, or a sequence of time periods or stages, where the systems exhibit stochastic (random) behavior and state transitions.

o Examples of Markov Process:


1. Airplane at Airport
2. Rainfall
3. Behavior of Business
4. Flow of traffic

Markov Chains
o Since the system changes randomly, it is generally impossible to predict the exact state of the system in the future.
o However, the statistical properties of the system’s future can be predicted.
o In many applications, it is these statistical properties that make Markov chains important.
o Markov chains are used to analyze trends and predict the future (weather, stock market, genetics, product success, etc.).
o A Markov chain consists of states and transition probabilities. Each transition probability is the probability of moving from one state to another in one step.
o The probability that j is the next state of the chain, given that the current state is i, is called the transition probability from i to j.
o These transition probabilities are independent of the past and depend only on the two states involved.
o A Markov chain has a network structure much like that of a website: each node in the network is called a state, and to each link a transition probability is attached, which denotes the probability of moving from the source state of the link to its destination state.
Key Features of Markov Chain

A sequence of trials of an experiment is a Markov chain if:

1. The outcome of each trial is one of a set of discrete states.
2. The outcome of each trial depends only on the present state and not on any past states.
3. The transition probabilities remain constant from one transition to the next.
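
To see these three features concretely, here is a minimal simulation sketch (not from the slides; the two-state weather chain and all of its probability values are illustrative assumptions). Each step samples the next state using only the current state's row of constant transition probabilities, so past states play no role.

import random

# Illustrative two-state chain: a discrete set of states and constant
# transition probabilities. Each row sums to 1.
P = {
    "Sunny": {"Sunny": 0.8, "Rainy": 0.2},
    "Rainy": {"Sunny": 0.4, "Rainy": 0.6},
}

def step(current):
    """Sample the next state using only the current state (memorylessness)."""
    r = random.random()
    cumulative = 0.0
    for state, prob in P[current].items():
        cumulative += prob
        if r < cumulative:
            return state
    return state  # guard against floating-point round-off

def simulate(start, n_steps):
    """Run the chain; the trajectory is a sequence of discrete states."""
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1]))
    return path

print(simulate("Sunny", 10))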

Internet Application of Markov Chain
o The PageRank of a webpage, as used by Google, is defined by a Markov chain.
o Google’s PageRank (PR) is a method of ranking web pages for placement on a Search Engine Results Page (SERP).
o PageRank is a mathematical formula (algorithm) that Google uses to calculate the importance of a particular web page/URL based on its incoming links.
o The PageRank algorithm assigns each web page a relevancy score.
o It is used to measure the relative importance of a website within its set of hyperlinked pages.
o If a page ranks better in organic search, it should get more website traffic from search engines.

Internet Application of Markov Chain
o The PageRank Algorithm
The original PageRank algorithm was described by Lawrence Page and Sergey Brin in several publications.
o It is given by
PR(A) = (1 - d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
where
• PR(A) is the PageRank of page A,
• PR(Ti) is the PageRank of a page Ti that links to page A,
• C(Ti) is the number of outbound links on page Ti, and
• d is a damping factor, which can be set between 0 and 1.

Consider a small web consisting of three pages A, B and C, where page A links to pages B and C, page B links to page C, and page C links to page A.
According to Page and Brin, the damping factor d is usually set to 0.85, but to keep the calculation simple we set it to 0.5. The exact value of the damping factor d admittedly has an effect on PageRank, but it does not influence the fundamental principles of PageRank.
Initially, let the PageRank of each page be 1. Calculate iteratively and conclude which page has the highest score.
Using the formula for PageRank:
PR(A) = 0.5 + 0.5 (PR(C)/1)
PR(B) = 0.5 + 0.5 (PR(A)/2)
PR(C) = 0.5 + 0.5 (PR(A)/2 + PR(B)/1)
These equations can easily be solved. We get the following PageRank values for the single pages:
PR(A) = 14/13 = 1.07692308
PR(B) = 10/13 = 0.76923077
PR(C) = 15/13 = 1.15384615
The sum of all pages’ PageRanks is 3 and thus equals the total number of web pages. Page C has the highest score.
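
The iterative calculation is easy to verify in code. The following is a minimal sketch (not part of the original slides) that repeatedly applies the PR formula above until the values stabilize; the link structure and d = 0.5 match the three-page example.

# Three-page web from the example: A -> B, A -> C, B -> C, C -> A.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
d = 0.5  # damping factor, as in the example

pr = {page: 1.0 for page in links}  # initial PageRank of every page is 1

for _ in range(50):  # iterate until the values converge
    new_pr = {}
    for page in links:
        # Sum PR(Ti)/C(Ti) over all pages Ti that link to this page.
        incoming = sum(pr[t] / len(links[t]) for t in links if page in links[t])
        new_pr[page] = (1 - d) + d * incoming
    pr = new_pr

print(pr)  # ~{'A': 1.0769, 'B': 0.7692, 'C': 1.1538}, i.e. 14/13, 10/13, 15/13

After a few dozen iterations the values agree with the closed-form solution above.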
Transition Matrix
o The matrix of transition probabilities is called the transition matrix. It is a square matrix and is represented by P.
o The transition probability matrix shows the probabilities of moving from one state to another.
o So the transition matrix for the whole Markov chain can be represented as:

    | P11  P12  ...  P1n |
P = | P21  P22  ...  P2n |
    | ...  ...  ...  ... |
    | Pn1  Pn2  ...  Pnn |

where P11, P12, ..., Pnn are the transition probabilities.

All the entries of the matrix lie between 0 and 1. The sum of the entries of any row is equal to 1.
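
These two properties are easy to check in code. Here is a small sketch (illustrative, not from the slides) that validates a candidate transition matrix:

def is_transition_matrix(P, tol=1e-9):
    """Check that P is square, entries lie in [0, 1], and each row sums to 1."""
    n = len(P)
    if any(len(row) != n for row in P):
        return False  # not square
    for row in P:
        if any(p < 0 or p > 1 for p in row):
            return False  # entry outside [0, 1]
        if abs(sum(row) - 1.0) > tol:
            return False  # row does not sum to 1
    return True

print(is_transition_matrix([[0.9, 0.1], [0.2, 0.8]]))  # True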
Numerical
Given that a person’s last cola purchase was Coke, there is a 90% chance that his next cola purchase will also be Coke. If a person’s last cola purchase was Pepsi, there is an 80% chance that his next cola purchase will also be Pepsi.
a. Draw a graph of the above problem and construct a transition matrix.
b. Given that a person is currently a Pepsi purchaser, what is the probability that he will purchase Coke two purchases from now?
c. Given that a person is currently a Coke purchaser, what is the probability that he will purchase Pepsi three purchases from now?
d. Assume each person makes one cola purchase per week. Suppose 60% of all people now drink Coke and 40% drink Pepsi. What fraction of people will be drinking Coke three weeks from now?

Markov Process
Coke vs. Pepsi Example
• Given that a person’s last cola purchase was Coke, there is a 90% chance that his next cola purchase will also be Coke.
• If a person’s last cola purchase was Pepsi, there is an 80% chance that his next cola purchase will also be Pepsi.

Transition graph: Coke stays Coke with probability 0.9 and moves to Pepsi with 0.1; Pepsi stays Pepsi with probability 0.8 and moves to Coke with 0.2.

Transition matrix (row/column order: Coke, Pepsi):

P = | 0.9  0.1 |
    | 0.2  0.8 |
Markov Process
Coke vs. Pepsi Example (cont)

Given that a person is currently a Pepsi purchaser, what is the probability that he will purchase Coke two purchases from now?

Pr[ Pepsi → ? → Coke ]
= Pr[ Pepsi → Coke → Coke ] + Pr[ Pepsi → Pepsi → Coke ]
= 0.2 * 0.9 + 0.8 * 0.2 = 0.34

P² = | 0.9  0.1 | * | 0.9  0.1 | = | 0.83  0.17 |
     | 0.2  0.8 |   | 0.2  0.8 |   | 0.34  0.66 |

The required probability is the (Pepsi, Coke) entry of P², i.e. 0.34.
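
The same answer can be checked by squaring the transition matrix with NumPy (a small illustrative sketch, not from the slides); entry [1, 0] is the Pepsi-to-Coke two-step probability.

import numpy as np

# Transition matrix; row/column order is [Coke, Pepsi].
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

P2 = np.linalg.matrix_power(P, 2)  # two-step transition probabilities
print(P2)        # [[0.83 0.17]
                 #  [0.34 0.66]]
print(P2[1, 0])  # 0.34: Pepsi now -> Coke two purchases from now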
Markov Process
Coke vs. Pepsi Example (cont)

Given that a person is currently a Coke purchaser, what is the probability that he will purchase Pepsi three purchases from now?

P³ = P * P² = | 0.9  0.1 | * | 0.83  0.17 | = | 0.781  0.219 |
              | 0.2  0.8 |   | 0.34  0.66 |   | 0.438  0.562 |

The required probability is the (Coke, Pepsi) entry of P³, i.e. 0.219.

Markov Process
Coke vs. Pepsi Example (cont)
• Assume each person makes one cola purchase per week.
• Suppose 60% of all people now drink Coke, and 40% drink Pepsi.
• What fraction of people will be drinking Coke three weeks from now?

P = | 0.9  0.1 |      P³ = | 0.781  0.219 |
    | 0.2  0.8 |           | 0.438  0.562 |

Pr[X3 = Coke] = 0.6 * 0.781 + 0.4 * 0.438 = 0.6438

Qi - the distribution in week i
Q0 = (0.6, 0.4) - the initial distribution
Q3 = Q0 * P³ = (0.6438, 0.3562)
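
The week-3 distribution can be computed directly as Q3 = Q0 * P³ (a short illustrative sketch, not from the slides):

import numpy as np

P = np.array([[0.9, 0.1],   # row/column order: [Coke, Pepsi]
              [0.2, 0.8]])
Q0 = np.array([0.6, 0.4])   # initial distribution: 60% Coke, 40% Pepsi

Q3 = Q0 @ np.linalg.matrix_power(P, 3)  # distribution three weeks from now
print(Q3)  # [0.6438 0.3562] -> 64.38% of people will be drinking Coke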
Numerical

1. Given that the chance of a Ford car user buying a Ford car at the next purchase is 70% and the chance that his next purchase will be a Scorpio is 30%, while the chance of a Scorpio car user buying a Scorpio car at the next purchase is 80% and the chance that his next purchase will be a Ford car is 20%: What is the probability that a current Ford user will buy a Scorpio car after three purchases? If 70% of users use a Ford car today, what percentage of users will use a Scorpio after three purchases?

2. Given that the chance of a Honda bike user buying a Honda bike at the next purchase is 70% and the chance that his next purchase will be a Yamaha is 30%, while the chance of a Yamaha bike user buying a Yamaha bike at the next purchase is 80% and the chance that his next purchase will be a Honda is 20%: What is the probability that a current Honda bike user will buy a Yamaha bike after three purchases?

Applications of Markov Chain
• Used in pattern recognition.
• Used in routing and page rank algorithm.
• Used in weather forecasting.
• Used in stock market prediction.
• Used in forecasting product success.
• Used to study web navigation behavior of internet users.

Assignments
1. Explain Markov chains and transition probability. What are the features of Markov chains? Explain with an example. [3+5]
2. Explain the applications of Markov chains with an example. [6]
3. Given that the chance of a Sony user buying Sony at the next purchase is 80% and the chance that his next purchase will be Samsung is 20%, while the chance of a Samsung user buying Samsung at the next purchase is 85% and the chance that his next purchase will be Sony is 15%: What is the probability that a current Samsung user will buy Sony after three purchases? If 60% of users use Sony today, what percentage of users will use Samsung after three purchases? [4+2]

. . . to be continued !!!
