
Unit VI Stochastic Processes

Dr. Nita V. Patil


Date: 27/July/2021
Outline of the unit
• Definitions and classifications of Stochastic Processes,
• discrete and continuous Markov models,
• Chapman-Kolmogorov equation
Stochastic Processes
Definition:
A stochastic process (or random process) is a collection of random variables
ordered by an index set.

Examples
1. X(t) = the number of customers in line at the post office at time t.
2. X(t) = the price of IBM stock at time t.
☛ Example 1. Random variables X0, X1, X2, . . . form a stochastic process
ordered by the discrete index set {0, 1, 2, . . . }.
Notation: {Xn : n = 0, 1, 2, . . . }.

☛ Example 2. Stochastic process {Yt : t ≥ 0} with continuous index set {t : t ≥ 0}.

The indices n and t are often referred to as "time", so that Xn is a discrete-time
process and Yt is a continuous-time process.

The range (possible values) of the random variables in a stochastic process is
called the state space of the process.
Examples
3. Xn : the type of transaction a person submits to an on-line database
service
Notation: {Xn : n = 0, 1, 2, . . . }, where
the state space of Xn is {1, 2, 3, 4}, representing the four types of transactions
a person submits to an on-line database service,
and time n corresponds to the number of transactions submitted.
4. Xn : whether an electronic component is acceptable or defective
{ Xn : n = 0, 1, 2, . . . }, where
the state space of Xn is {1, 2} representing whether an electronic
component is acceptable or defective,
and time n corresponds to the number of components produced.
☛ 5. Yt : the number of accidents that have occurred at an intersection
{ Yt : t ≥ 0}, where
the state space of Yt is {0, 1, 2, . . . } representing the number of
accidents that have occurred at an intersection,
and time t corresponds to weeks.
6. Yt : representing the number of copies of a software product in inventory
{Yt : t ≥ 0}, where
the state space of Yt is {0, 1, 2, . . . , s} representing the number of copies of a
software product in inventory,
and time t corresponds to days.
7. Yt : the number of cars parked in a parking garage at a shopping mall
{Yt : t ≥ 0}, where
the state space of Yt is {0, 1, 2, . . . } representing the number of cars parked
in a parking garage at a shopping mall,
and time t corresponds to hours.
Classification of stochastic processes

Stochastic Processes with

• Discrete Parameter and State Spaces

• Continuous Parameter and Discrete State Space

• Discrete Parameter and Continuous State Space


• Continuous Parameter and State Spaces
Examples
1. A Brand-Switching Model for Consumer Behavior
Before introducing a new brand of coffee, a manufacturer wants to study
consumer behavior relative to the brands already available in the market.
Suppose there are three brands on sale, say A, B, C.
In a survey, conducted over a period of time, suppose the estimates obtained
for the consumer brand-switching behavior are as follows:
a. Out of those who buy A in one month, during the next month 60% buy A
again, 30% switch to brand B, and 10% switch to brand C.
b. For brand B the figures are: B to A 50%, B to B 30%, B to C 20%;
c. and for brand C: C to A 40%, C to B 40%, C to C 20%.
1. If we are interested in the number of people who buy a certain brand of
coffee, then that number could be represented as a stochastic process.
2. The behavior of the consumer can also be considered a stochastic process
that can enter three different states A, B, C.
Some of the questions that arise are:
a. What is the expected number of months that a consumer stays with one
specific brand?
b. What are the mean and variance of the number using a particular brand
after a certain number of months?
c. Which is the product preferred most by the customers in the long run?
Suppose, consumer preferences are observed on a monthly basis.
Then what are the states?
States: the brands A, B, C (with movements such as A to A, A to B, etc.)
What is the time index?
Time: monthly basis (one month, two months, etc.)

Then we have a discrete-time, discrete-state stochastic process.
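To make question (c) concrete, here is a minimal Python sketch (an illustration added to these notes, not part of the original slides) that builds the transition matrix from the survey percentages above and iterates it to estimate the long-run brand shares:

```python
# Brand-switching transition matrix from the survey estimates:
# rows = brand bought this month (A, B, C), columns = next month's brand.
P = [
    [0.6, 0.3, 0.1],  # A -> A, B, C
    [0.5, 0.3, 0.2],  # B -> A, B, C
    [0.4, 0.4, 0.2],  # C -> A, B, C
]

def step(dist, P):
    """One month of brand switching: row vector times transition matrix."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# The starting shares here are an assumption (equal shares); for a regular
# chain like this one the long-run result does not depend on the start.
dist = [1 / 3, 1 / 3, 1 / 3]
for _ in range(100):
    dist = step(dist, P)

print([round(x, 4) for x in dist])  # approximate long-run shares of A, B, C
```

For these survey numbers the iteration settles at roughly 54%, 31%, and 15% for A, B, and C, so brand A is preferred most in the long run.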


2. A Queueing Problem:
A bus taking students back and forth between the dormitory complex and
the student union arrives at the student union several times during the day.
The bus has a capacity to seat K persons.
If there are K or fewer waiting persons when the bus arrives, it departs with
the available number.
If there are more than K waiting, the driver allows the first K to board, and
the remaining persons must wait for the next bus.
The university administrators would like to know
at each time period during the day how many students are left behind when
a bus departs.
Is this a stochastic process?
The number waiting at the bus stop is a stochastic process dependent
on the arrival process (e.g., with some probability distribution).
Some desirable characteristics of the system are:
• the reduction in the number waiting at any time,
• minimizing the time a student has to wait for a ride,
• and minimum operational cost.
Consider the number of students waiting at the time of arrival of a bus,
when time is counted by the number of times the bus arrives.
State: number of students waiting (e.g. 10, 12, etc.)
Time: the number of times the bus arrives (e.g. 2, 3, etc.)
Then it is a discrete-time, discrete-state stochastic process.
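As a small aside (not in the original notes), the "students left behind" process can be sketched in Python. The recursion is waiting_next = max(waiting + arrivals − K, 0); the arrival model below is an assumed stand-in, since the notes only say arrivals follow some probability distribution:

```python
import random

K = 40  # bus capacity (an assumed value for illustration)

def left_behind_process(buses, seed=0):
    """Number of students left behind at each bus departure."""
    random.seed(seed)
    waiting = 0
    history = []
    for _ in range(buses):
        # Stand-in arrival model: Binomial(70, 0.5), mean 35 arrivals
        # between buses; the real distribution is unspecified in the notes.
        arrivals = sum(random.random() < 0.5 for _ in range(70))
        waiting = max(waiting + arrivals - K, 0)
        history.append(waiting)
    return history

print(left_behind_process(10))
```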
In the queueing problem, consider instead the number of students waiting for a bus at any time of day.
State: number of students waiting (e.g. 10, 12, etc.)
Time: any time of day (e.g. between 9:00 am and 3:00 pm)

Then it is a continuous-time, discrete-state stochastic process.


Example
A Population Growth: Consider the size of a population at a given time.
Then we have again a continuous-time, discrete-state stochastic process
(the population is finite, so the state space is discrete).
Consider the values of the Dow-Jones Index at the end of the nth week.
Then we have a discrete-time stochastic process with the continuous
state space (0, ∞).
Jobs of varied length come to a computing center from various sources.
The number of jobs arriving, as well as their length, can be said to follow certain
distributions.
Under these conditions the number of jobs waiting at any time and the time a job
has to spend in the system can be represented by stochastic processes.
Under a strictly first-come, first-served policy, there is a good chance of a long job
delaying a much more important shorter job over a long period of time.
For the efficient operation of the system, in addition to minimizing the number of
jobs waiting and the total delay, it may be necessary to adopt a different service
policy.
A round-robin policy in which the service is performed on a single job only for a
certain length of time, say 3 or 5 sec, and those jobs that need more service are put
back in the queue, is one of the common practices adopted under these conditions.
In this example, consider the waiting time of an arriving job until it gets into
service, with the arrival time of the job now the parameter.

This is a stochastic process with continuous parameter and state spaces.


Stochastic Process
Background
A random variable X is a variable whose value is defined as the outcome of a random phenomenon,
e.g. the outcome of rolling a die (a number),
or the output of flipping a coin (0 for heads and 1 for tails).
• the space of possible outcomes of a random variable can be discrete or continuous.
• a random variable assigns a number to each outcome s in a sample space S.

A stochastic process or random process is a collection of random variables indexed by a set t that represents different instants
of time (giving a discrete-time or continuous-time random process).

e.g. flipping a coin every day defines a discrete time random process

the price of a stock market option varying continuously defines a continuous time random process.

The random variables at different instants of time can be independent of each other (coin-flipping example) or dependent in
some way (stock-price example), and they can have a continuous or discrete state space (the space of possible outcomes at
each instant of time).
Markov Process/ Markov Chain
• A sequence of events in which an event depends upon the immediately preceding
event only is called a Markov process or Markov chain,
e.g. 1. the market share of a product from month to month,
2. the condition of the machine to be used for production each week.

• A Markov chain is a stochastic model describing a sequence of possible events in
which the probability of each event depends only on the state attained in the
previous event.

• A Markov chain is a Markov process with discrete time and discrete state space.
In continuous time, it is known as a Markov process.


Each Markov chain/process consists of possible states.
e.g. Consider the Markov chain
“condition of the machine to be used for production each week”
Here,
‘working’, ‘fairly working’, ‘not working’ conditions represent the states
of the machine.
The states are finite in number and are collectively exhaustive and
mutually exclusive.
Markov process analysis is used to study the probabilities corresponding to
the states at any given time period, considering the movements from one
state to another.
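As an illustration (the notes name the machine states but give no numbers, so the probabilities below are assumptions), a short Python sketch of sampling such a chain, where next week's condition depends only on this week's:

```python
import random

states = ["working", "fairly working", "not working"]
# Assumed weekly transition probabilities, one row per current state.
P = {
    "working":        [0.80, 0.15, 0.05],
    "fairly working": [0.30, 0.50, 0.20],
    "not working":    [0.60, 0.00, 0.40],  # e.g. repaired or left idle
}

def next_state(current):
    """Sample next week's condition using only the current one (Markov property)."""
    return random.choices(states, weights=P[current])[0]

random.seed(1)
condition = "working"
for week in range(1, 9):
    condition = next_state(condition)
    print("week", week, ":", condition)
```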
Transition Probabilities: The probability that the system changes from state
i to state j is called the transition probability.
Transition Matrix (P): The matrix representing the states in one period (rows)
and the states in the next period (columns), along with the transition
probabilities between them, is called the transition matrix (P).
Suppose that over time it is found that 70% of the customers using brand A continue to use it
next year, while 20% shift to brand B and 10% to C. Similarly, 60% of customers using
brand B continue to use it, while 25% change to A and 15% shift to C; and for C,
75% are retained while 20% are lost to A and 5% to B.
→ Transition probability from A to A (i.e. retention probability) = 0.70
Transition probability from A to B = 0.20
Transition probability from A to C = 0.10
Transition probability from B to A = 0.25
Transition probability from B to B (i.e. retention probability) = 0.60
Transition probability from B to C = 0.15
Transition probability from C to A = 0.20
Transition probability from C to B = 0.05
Transition probability from C to C (i.e. retention probability) = 0.75
Transition Matrix
        A      B      C
A     0.70   0.20   0.10
B     0.25   0.60   0.15
C     0.20   0.05   0.75
(The transition diagram and probability tree diagram from the original slides are not reproduced.)
Initial Condition
It represents the probabilities for the various states for the initial
period of time.
e.g. If initially, the market shares of three brands of soft drinks A, B and
C are 60%, 30%, and 10% respectively, then the initial condition for the
period n = 0 is given by
R0 = [0.6  0.3  0.1]
If the state probabilities at a given period of time depend only upon
those of the immediately preceding period, it is a first-order Markov chain;
the state probabilities then evolve as R(n+1) = R(n) P.
Uses of Markov Chain
1. Marketing
2. Finance
3. Production
4. Personnel
Ex: There are three competing daily newspapers in a small city.
No other daily has any effect on the current market. A survey
of the last month reveals the following data.
Newspaper   Customers on Day 1   Gains (month)   Losses (month)   Customers on Last Day
TIME                800               250             350                  700
EXPRESS             400               275             200                  475
DAWN                500               150             125                  525

Further analysis resulted in the gain-loss summary as follows

                  Gains from                     Losses to
Newspaper    TIME   EXPRESS   DAWN        TIME   EXPRESS   DAWN
TIME           0      150      100          0      250      100
EXPRESS      250        0       25        150        0       50
DAWN         100       50        0        100       25        0
Find the current rate of gains and losses for newspapers.
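One way to answer this (a sketch, under the reading that "rates" means the monthly transition probabilities): divide each paper's retained and lost readers by its day-1 readership, as below.

```python
# Day-1 readership and the losses-to table from the survey.
day1 = {"TIME": 800, "EXPRESS": 400, "DAWN": 500}
losses_to = {
    "TIME":    {"EXPRESS": 250, "DAWN": 100},
    "EXPRESS": {"TIME": 150, "DAWN": 50},
    "DAWN":    {"TIME": 100, "EXPRESS": 25},
}

papers = ["TIME", "EXPRESS", "DAWN"]
for p in papers:
    row = {}
    retained = day1[p] - sum(losses_to[p].values())  # readers who stayed
    for q in papers:
        row[q] = (retained if p == q else losses_to[p][q]) / day1[p]
    print(p, {q: round(v, 4) for q, v in row.items()})
```

For example, TIME retains 450 of its 800 day-1 readers, a retention rate of 0.5625, while losing readers at rates 0.3125 (to EXPRESS) and 0.125 (to DAWN).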
Consider that the market shares of three brands of soft drinks A, B and C are 60%, 30%, and 10%
respectively. Also let their transition probability matrix for a year be given (the matrix from the
original slide is not reproduced here).

1. What is their market share after one year?

2. What is their market share after two years?
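Since the slide's transition matrix is not available here, the sketch below uses a purely hypothetical matrix (the values are placeholders, not the slide's) just to show the mechanics: the share vector is multiplied by P once per year.

```python
# Hypothetical transition matrix; the original slide's values are not
# reproduced, so these entries are placeholders for illustration only.
P = [
    [0.7, 0.2, 0.1],  # A -> A, B, C
    [0.1, 0.8, 0.1],  # B -> A, B, C
    [0.2, 0.2, 0.6],  # C -> A, B, C
]
R = [0.6, 0.3, 0.1]  # given initial market shares of A, B, C

for year in (1, 2):
    R = [sum(R[i] * P[i][j] for i in range(3)) for j in range(3)]
    print("share after year", year, ":", [round(x, 4) for x in R])
```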
H. W.
Suppose that an orange juice company controls 20% of the orange juice
market. Suppose they hire a market research company to predict the effects
of an aggressive ad campaign. Suppose they conclude
- Someone using Brand A will stay with Brand A with 90% probability
- Someone not using Brand A will switch to brand A with 70% probability.
If people buy orange juice once a week, find the probability transition matrix,
draw a transition diagram, and
calculate the probability that someone uses Brand A after
1. One week
2. Two weeks
3. Three weeks
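A minimal sketch of this homework computation (an added illustration): the matrix follows directly from the stated 90% and 70% probabilities, with state 0 = uses Brand A and state 1 = does not.

```python
P = [
    [0.9, 0.1],  # Brand A user: stays with A / leaves A
    [0.7, 0.3],  # non-user: switches to A / stays away
]
R = [0.2, 0.8]  # the company initially controls 20% of the market

for week in (1, 2, 3):
    R = [sum(R[i] * P[i][j] for i in range(2)) for j in range(2)]
    print("week", week, ": P(uses Brand A) =", round(R[0], 4))
```

Running this gives approximately 0.74, 0.848, and 0.8696 for weeks one, two, and three.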
Chapman-Kolmogorov Equations
We know the one-step transition probability matrix consists of the
probabilities P_ij of transition of a process from state i to state j.

The Chapman-Kolmogorov equations provide a method for computing
these n-step transition probabilities.
These equations are

    P_ij^(n+m) = Σ_k P_ik^(n) · P_kj^(m)        ...(1)

where P_ik^(n) · P_kj^(m) represents the probability that,
starting in i, the process will go to state j
in n + m transitions
through a path which takes it into state k at the nth transition.

Hence, summing over all intermediate states k yields the probability
that the process will be in state j after n + m transitions.
Example: (Forecasting the Weather):
Suppose that the chance of rain tomorrow depends on previous weather
conditions only through whether or not it is raining today and not on past weather
conditions.
Suppose also that if it rains today, then it will rain tomorrow with probability α; and
if it does not rain today, then it will rain tomorrow with probability β.
If we say that the process is in state 0 when it rains and state 1 when it does not
rain, then the above is a two-state Markov chain whose transition probabilities are
given by

        P = | α     1 − α |
            | β     1 − β |
• Consider this example in which the weather is a two-state Markov
chain. If α = 0.7 and β = 0.4, then calculate the probability that it
will rain four days from today given that it is raining today.
Solution:
The one-step transition probability matrix is

        P = | 0.7   0.3 |
            | 0.4   0.6 |

Hence the two-step matrix is

        P(2) = P² = | 0.61   0.39 |
                    | 0.52   0.48 |

and the four-step matrix is

        P(4) = P(2) · P(2) = | 0.5749   0.4251 |
                             | 0.5668   0.4332 |

The probability that it will rain four days from today given that it is raining today is 0.5749.
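A short sketch (an added illustration) that reproduces this number and, along the way, uses the Chapman-Kolmogorov identity P(4) = P(2) · P(2):

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[0.7, 0.3],   # state 0 = rain:    stays rainy / clears up
     [0.4, 0.6]]   # state 1 = no rain: turns rainy / stays dry

P2 = matmul(P, P)
P4 = matmul(P2, P2)          # Chapman-Kolmogorov: P^(4) = P^(2) P^(2)
print(round(P4[0][0], 4))    # P(rain in 4 days | rain today) -> 0.5749
```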
Example:
Suppose that whether or not it rains today depends on previous weather conditions through the last two days.
Specifically, suppose that
• if it has rained for the past two days, then it will rain tomorrow with probability 0.7;
• if it rained today but not yesterday, then it will rain tomorrow with probability 0.5;
• if it rained yesterday but not today, then it will rain tomorrow with probability 0.4;
• if it has not rained in the past two days, then it will rain tomorrow with probability 0.2.
Consider that the state at any time is determined by the weather conditions during both that day and the
previous day.
In other words,
the process is in state 0 if it rained both today and yesterday,
state 1 if it rained today but not yesterday,
state 2 if it rained yesterday but not today,
state 3 if it did not rain either yesterday or today.
Considering this example: given that it rained on Monday and Tuesday, what is the probability that it will rain
on Thursday?
Then it is represented as a four-state Markov chain, where the process is in
state 0 if it rained both today and yesterday,
state 1 if it rained today but not yesterday,
state 2 if it rained yesterday but not today,
state 3 if it did not rain either yesterday or today,
with one-step transition probability matrix

            0     1     2     3
    0     0.7    0    0.3    0
    1     0.5    0    0.5    0
    2      0    0.4    0    0.6
    3      0    0.2    0    0.8
Solution: The two-step transition matrix is

                      0      1      2      3
    P(2) = P² =  0  0.49   0.12   0.21   0.18
                 1  0.35   0.20   0.15   0.30
                 2  0.20   0.12   0.20   0.48
                 3  0.10   0.16   0.10   0.64

Since rain on Thursday means the chain is in state 0 or state 1 on Thursday, the
desired probability is P²_00 + P²_01 = 0.49 + 0.12 = 0.61.
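The same two-step computation can be checked with a few lines of Python (an added sketch, not part of the slides):

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# States: 0 = rain today & yesterday, 1 = rain today only,
#         2 = rain yesterday only,   3 = no rain either day.
P = [[0.7, 0.0, 0.3, 0.0],
     [0.5, 0.0, 0.5, 0.0],
     [0.0, 0.4, 0.0, 0.6],
     [0.0, 0.2, 0.0, 0.8]]

P2 = matmul(P, P)
# Rained Monday and Tuesday -> state 0 on Tuesday; rain on Thursday
# means being in state 0 or state 1 two steps later.
print(round(P2[0][0] + P2[0][1], 2))  # -> 0.61
```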
End of the Syllabus
