
Simulation of a Random Variable

Monte Carlo Simulation

Vineet Sahula

Professor, Department of ECE
Malaviya National Institute of Technology Jaipur
vsahula.ece@mnit.ac.in

ECT-524 Modeling, Optimization & Machine Intelligence (2022)


Today's Lecture: December 01, 2021


Outline
Schedule: 8:00 – 9:45 every Wednesday
1 Simulation of a Random Variable
   What is Monte Carlo simulation?
2 General techniques for simulating continuous RVs
   Types of Random variables
   Inverse transformation method
   The rejection method
   Hazard rate method
3 Special techniques for simulating continuous RVs
4 Simulating from discrete distributions
   Binomial Distribution
   Poisson distribution
   Alias method
5 Multivariate distributions & Stochastic Processes
6 Variance reduction techniques
   Use of antithetic variables
7 Determining optimal number of runs
• Refer to the course site for details: https://canvas.instructure.com/
Simulating an RV

Estimating E[g(X)]
• Let X = (X_1, X_2, . . . , X_n) denote a random vector
• having density function f(x_1, x_2, . . . , x_n)
• Objective is to find the expected value of g(X):
  E[g(X)] = ∫ · · · ∫ g(x_1, x_2, . . . , x_n) f(x_1, x_2, . . . , x_n) dx_1 dx_2 · · · dx_n
• Analytical OR numerical integration is tedious
• Alternately, approximate E[g(X)] by means of SIMULATION


Approximating E[g(X)]

Monte Carlo Simulation

• Generate a random vector
  • X^(1) = (X_1^(1), X_2^(1), . . . , X_n^(1)) having density function f(x_1, x_2, . . . , x_n)
  • compute Y^(1) = g(X^(1))
• Generate a second random vector
  • X^(2) = (X_1^(2), X_2^(2), . . . , X_n^(2))
  • compute Y^(2) = g(X^(2))
• Repeat this r number of times, generating i.i.d. RVs
  • Y^(i) = g(X^(i)), i = 1, 2, . . . , r
• By the strong law of large numbers,
  lim_{r→∞} [Y^(1) + Y^(2) + · · · + Y^(r)] / r = E[Y^(i)] = E[g(X)]
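A minimal Python sketch of this estimator: draw r i.i.d. samples of X and average g over them. The sampler, the function g and the run count r below are illustrative choices.

    import math
    import random

    def monte_carlo_mean(g, sample_x, r=100_000):
        # Estimate E[g(X)] as the sample mean of g over r i.i.d. draws of X.
        total = 0.0
        for _ in range(r):
            x = sample_x()        # draw one random vector X^(i)
            total += g(x)         # accumulate Y^(i) = g(X^(i))
        return total / r

    # Illustrative choice: X = (U1, U2) uniform on the unit square, g(x) = exp((x1 + x2)^2).
    sample_x = lambda: (random.random(), random.random())
    g = lambda x: math.exp((x[0] + x[1]) ** 2)
    print(monte_carlo_mean(g, sample_x))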


How to generate a random vector X^(r)

Generating random vectors X having a specified joint distribution

• Objective:
  • Generate a sequence of random vectors X^(r) having a particular distribution
• As a first step, we need to be able to generate random variables from the uniform distribution on (0,1)


Generating a random variable UNIFORM on (0,1)

Generating a Uniform RV on (0,1), MANUALLY

• Take 10 identical slips, containing the numbers 0..9
• Successively select n slips, with replacement
• This is equivalent to generating a string of n digits with a decimal point in front; thus it can be regarded as the value of a uniform (0,1) RV rounded off to the nearest (1/10)^n


Generating a random variable UNIFORM on (0,1)

Generating a Uniform RV on (0,1), on COMPUTER

• Pseudo-random instead of truly random
• Start with an initial value X_0 called the seed
• Then recursively compute values using constants a, c and m:
  X_{n+1} = (a X_n + c) modulo m
• Thus, each X_n is a number in 0, 1, . . . , (m − 1)
• X_n / m is taken as an approximation to a UNIFORM (0,1) RV
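A minimal Python sketch of this linear congruential recursion. The constants a, c and m below are one commonly used choice and are purely illustrative; any seed X_0 works.

    def lcg(seed, a=1664525, c=1013904223, m=2**32):
        # Yield pseudo-random numbers in [0, 1) via X_{n+1} = (a*X_n + c) mod m.
        x = seed
        while True:
            x = (a * x + c) % m
            yield x / m           # X_n / m approximates a UNIFORM (0,1) RV

    gen = lcg(seed=12345)
    print([round(next(gen), 4) for _ in range(5)])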


Today's Lecture: 12 October 2022


Estimating the number of distinct entries in a large list

# of distinct entries
• Consider a list of n entries, where n is very large
• Objective is to compute d, the number of distinct entries
• Let m_i be the number of times the element in position i appears on the list
• Then, d = Σ_{i=1}^{n} 1/m_i


Estimating the number of distinct entries ...

# of distinct entries
• To estimate d, generate a random position X, uniform over {1, 2, . . . , n}, by taking X = ⌊nU⌋ + 1
• Let m(X) denote the number of times the element in position X appears in the list
• E[1/m(X)] = Σ_{i=1}^{n} (1/m_i)(1/n) = d/n
• If we generate k such random variables X_1, X_2, . . . , X_k, the estimate is
  d ≈ (n/k) Σ_{i=1}^{k} 1/m(X_i)
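A minimal Python sketch of this estimator. The call lst.count(x) merely stands in for however m(X) is obtained in practice, and the data and probe count k are illustrative.

    import random

    def estimate_distinct(lst, k=2000):
        # Estimate d, the number of distinct entries of lst, from k random probes.
        n = len(lst)
        total = 0.0
        for _ in range(k):
            x = lst[int(n * random.random())]   # element at a uniformly chosen position X
            total += 1.0 / lst.count(x)         # 1/m(X); count() stands in for however m(X) is obtained
        return n * total / k                    # d is approximated by (n/k) * sum of 1/m(X_i)

    data = [random.randint(1, 300) for _ in range(5000)]
    print(estimate_distinct(data), "true:", len(set(data)))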


Estimating the SUM of values of distinct entries in a large list

SUM of distinct values

• Suppose each i-th item has a value attached to it, v(i)
• The sum of the values of the distinct items, v, can be expressed as
  v = Σ_{i=1}^{n} v(i)/m(i)
• For X = ⌊nU⌋ + 1, with U a random number on (0,1),
  E[v(X)/m(X)] = Σ_{i=1}^{n} (v(i)/m(i)) (1/n) = v/n
• Hence, v can be estimated by generating X_1, X_2, . . . , X_k and setting
  v ≈ (n/k) Σ_{i=1}^{k} v(X_i)/m(X_i)
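A minimal Python sketch of the same idea for the sum of distinct values; the value function and the data below are illustrative.

    import random

    def estimate_distinct_sum(items, value, k=2000):
        # Estimate v = sum of value(item) over the DISTINCT items of the list,
        # using v approximately equal to (n/k) * sum of value(X_i)/m(X_i).
        n = len(items)
        total = 0.0
        for _ in range(k):
            x = items[int(n * random.random())]   # item at a uniformly chosen position X
            total += value(x) / items.count(x)    # value(X) / m(X)
        return n * total / k

    data = [random.randint(1, 300) for _ in range(5000)]
    print(estimate_distinct_sum(data, value=lambda v: v), "true:", sum(set(data)))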


Example: Estimating the SUM ...

SUM of distinct values

• Let there be s events,
  A_i = {a_{i,1}, . . . , a_{i,n}}, with i = 1, . . . , s
• Objective is to estimate P(∪_{i=1}^{s} A_i)
• P(∪_{i=1}^{s} A_i) = Σ_{a ∈ ∪_i A_i} P(a) = Σ_i Σ_j P(a_{i,j})/m(a_{i,j})
• here, m(a_{i,j}) is the number of events to which the point a_{i,j} belongs


Discrete RVs

Discrete RV distributions, X

• Uniform
  • p(i) = 1/n for ∀ i ∈ {1, . . . , n}
• Bernoulli
  • p(success) = p; p(failure) = 1 − p
• Binomial
  • p(i) = C_i^n p^i (1 − p)^{n−i}
• Geometric
  • p(n) = P{X = n} = (1 − p)^{n−1} p
• Poisson
  • p(i) = P{X = i} = e^{−λ} λ^i / i!


Continuous RVs

Continuous RV distributions

• Uniform, f(x) = 1 for 0 < x ⩽ 1, and 0 otherwise
• Gaussian, f(x) = (1/(σ√(2π))) e^{−(x−µ)²/(2σ²)}
• Exponential, f(x) = λ e^{−λx} for x ⩾ 0, and 0 for x < 0


Inverse transform method

Proposition for the inverse transform method

• Let U be a UNIFORM (0,1) random variable
• For any continuous distribution function F, if we define the random variable X = F^{−1}(U), then X has distribution F
• F^{−1}(u) is defined as that value of x for which F(x) = u


Inverse transform method - Proof

Proof for the inverse transform method

• F_X(a) = P{X ⩽ a} = P{F^{−1}(U) ⩽ a}
• Now, since F(x) is a monotone function, it follows that F^{−1}(U) ⩽ a iff U ⩽ F(a)
• F_X(a) = P{U ⩽ F(a)} = F(a)
• Hence, we can simulate an RV X from a continuous distribution F, when F^{−1} is computable,
• by simulating an RV U and then setting X = F^{−1}(U)


Simulating an EXPONENTIAL RV

F(x) = 1 − e^{−x}
• For F(x) = 1 − e^{−x},
• F^{−1}(u) is that value of x such that F(x) = u, i.e.
  1 − e^{−x} = u
  x = − log(1 − u)


Simulating an EXPONENTIAL RV ...

• Hence, if U is a UNIFORM (0,1) variable, then the following random variable is EXPONENTIALLY distributed:
  F^{−1}(U) = − log(1 − U)
• Since 1 − U is also UNIFORM on (0,1),
  • − log U is EXPONENTIAL with mean 1
  • −c log U is EXPONENTIAL with mean c, as cX has mean c if X has mean 1
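A minimal Python sketch of this inverse-transform recipe; the mean and sample size below are illustrative.

    import math
    import random

    def exponential_inverse_transform(mean=1.0):
        # Simulate an Exponential RV with the given mean as -mean * log(U).
        u = 1.0 - random.random()     # lies in (0, 1], avoiding log(0); 1-U is also UNIFORM (0,1)
        return -mean * math.log(u)

    samples = [exponential_inverse_transform(mean=2.0) for _ in range(100_000)]
    print(sum(samples) / len(samples))   # should be close to 2.0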


The Rejection Method

Simulating from distribution f(x), knowing a method for g(x)

• Suppose we have a method for simulating an RV having density function g(x)
• Using this as a basis, let's simulate from f(x)
• First, we simulate Y from g
• Then, we ACCEPT this simulated value with a probability proportional to f(Y)/g(Y)


The Rejection Method ...

Technique for simulating from distribution f(x)

• If f(y)/g(y) ⩽ c for all y:
  Step_1 Simulate Y having density g
  Step_2 Simulate a random number U
  Step_3 If U ⩽ f(Y)/(c g(Y)), set X = Y; otherwise RETURN to Step_1
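A minimal Python sketch of Step_1 to Step_3. The target f (a Beta(2,2) density), the Uniform(0,1) proposal g and the bound c = 1.5 are illustrative assumptions.

    import random

    def rejection_sample(f, sample_g, g, c):
        # Draw one sample from density f, given a proposal density g with f(y) <= c*g(y) for all y.
        while True:
            y = sample_g()                 # Step_1: simulate Y from g
            u = random.random()            # Step_2: simulate U
            if u <= f(y) / (c * g(y)):     # Step_3: accept with probability f(Y)/(c*g(Y))
                return y

    # Illustrative target: Beta(2,2) density f(x) = 6x(1-x) on (0,1),
    # proposal g = Uniform(0,1), so f(y)/g(y) <= 6/4 = 1.5 = c.
    f = lambda x: 6.0 * x * (1.0 - x)
    g = lambda x: 1.0
    samples = [rejection_sample(f, random.random, g, c=1.5) for _ in range(50_000)]
    print(sum(samples) / len(samples))     # mean of Beta(2,2) is 0.5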


Simulating a Normal RV

Simulating from the Normal distribution

• A standard normal RV Z has mean µ = 0 and variance σ² = 1; the absolute value of Z has density
  f(x) = (2/√(2π)) e^{−x²/2},  x > 0
• Given that X with g(x) = e^{−x}, x > 0 (exponential with rate 1), is available,
  f(x)/g(x) = √(2e/π) e^{−(x−1)²/2} ⩽ √(2e/π) = c


Simulating a Normal RV ...

Simulating from the Normal distribution - procedure

1 Generate independent random variables Y and U, Y being exponential with rate 1 and U being uniform on (0,1)
2 If U ⩽ e^{−(Y−1)²/2}, or equivalently, if − log U ⩾ (Y−1)²/2, set X = Y; else return to 1.
• Having X, we can generate Z by letting Z be equally likely to be either X or −X


Algorithm: Simulating a Normal RV

Algorithm for a Normal RV

1 Generate Y_1, an exponential random variable with rate 1.
2 Generate Y_2, an exponential random variable with rate 1.
3 If Y_2 − (Y_1 − 1)²/2 > 0, set Y = Y_2 − (Y_1 − 1)²/2 and go to Step 4. Otherwise go to Step 1.
4 Generate a random number U and set
  Z = Y_1 if U ⩽ 1/2, and Z = −Y_1 if U > 1/2
• The random variables Z and Y generated by the above method are independent, with
  • Z being Normal N(0, 1) and
  • Y being Exponential with rate 1.
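A minimal Python sketch of this algorithm using only the standard library; the sample size used in the sanity check at the end is illustrative.

    import random

    def normal_via_rejection():
        # Generate Z ~ N(0,1), plus a leftover Y ~ Exponential(1), by the algorithm above.
        while True:
            y1 = random.expovariate(1.0)            # Step 1: Y_1 ~ Exponential with rate 1
            y2 = random.expovariate(1.0)            # Step 2: Y_2 ~ Exponential with rate 1
            excess = y2 - (y1 - 1.0) ** 2 / 2.0
            if excess > 0:                          # Step 3: accept, keep the leftover as Y
                break
        u = random.random()                         # Step 4: attach a random sign
        z = y1 if u <= 0.5 else -y1
        return z, excess

    zs = [normal_via_rejection()[0] for _ in range(100_000)]
    mean = sum(zs) / len(zs)
    var = sum(z * z for z in zs) / len(zs) - mean ** 2
    print(mean, var)    # should be close to 0 and 1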


Hazard rate method

Simulating an RV S having hazard rate λ(t)

• Let F be a continuous distribution function with F̄(0) = 1, where F̄ = 1 − F denotes the survival function
• Let λ be the hazard rate, given by λ(t) = f(t)/F̄(t), t > 0
• That is, λ(t) represents the instantaneous intensity that an item having life distribution F will fail at time t, given that it has survived to that time


Hazard rate method ...

Simulating an RV S having hazard rate λ(t)

• Let's simulate S having hazard rate function λ(t)
• To do so, let λ be a constant such that
  • λ(t) ⩽ λ for all t ⩾ 0, and ∫₀^∞ λ(t) dt = ∞
• Then, to simulate from λ(t) we will
  • simulate a Poisson process with rate λ, and
  • 'count' or 'accept' ONLY certain of these Poisson events:
  • COUNT an event that occurs at time t, independently of all else, with probability λ(t)/λ


Hazard rate method: Technique

Generating S having hazard rate λ_S(t) = λ(t)

• Let λ(t) ⩽ λ for all t ⩾ 0
• Generate pairs of RVs (U_i, X_i) for i ⩾ 1, with X_i exponential with rate λ and U_i uniform on (0,1)
• STOP at
  N = min{ n : U_n ⩽ λ(X_1 + · · · + X_n) / λ }
• Set S = Σ_{i=1}^{N} X_i
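A minimal Python sketch of this thinning procedure; the hazard function λ(t) = t/(1 + t) and the bound λ = 1 below are illustrative assumptions.

    import random

    def simulate_from_hazard(hazard, lam_bar):
        # Simulate S with hazard rate hazard(t), assuming hazard(t) <= lam_bar for all t.
        t = 0.0
        while True:
            t += random.expovariate(lam_bar)              # next event of a rate lam_bar Poisson process
            if random.random() <= hazard(t) / lam_bar:    # 'count' this event with prob hazard(t)/lam_bar
                return t                                  # S = X_1 + ... + X_N

    # Illustrative hazard: lambda(t) = t/(1+t), bounded above by lam_bar = 1.
    samples = [simulate_from_hazard(lambda t: t / (1.0 + t), lam_bar=1.0) for _ in range(50_000)]
    print(sum(samples) / len(samples))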


[Hazard rate method:] Definition

• To compute E[N] we need the result (Wald's equation), which states that if X_1, X_2, . . . are i.i.d. RVs that are observed in sequence up to some stopping time N, then
  E[ Σ_{i=1}^{N} X_i ] = E[N] E[X]

Stopping Time

Definition
An integer-valued RV N is said to be a stopping time for the sequence X_1, X_2, . . ., if the event {N = n} is independent of X_{n+1}, X_{n+2}, . . . for all n = 1, 2, ...


[Hazard rate method:] Example

No. of trials until 10 successes

• Let X_n, n = 1, 2, . . . be independent and such that
  P{X_n = 0} = P{X_n = 1} = 1/2, n = 1, 2, . . .
• If we let N = min{n : X_1 + · · · + X_n = 10},
• then N is a stopping time.
• e.g. N may be regarded as the stopping time of an EXPERIMENT that successively flips a fair coin and stops when the number of HEADs reaches 10.


Special Techniques - CRVs

• Normal distribution, joint density of two independent unit normals: f(x, y) = (1/2π) e^{−(x²+y²)/2}
• Cauchy distribution, f(x) = 1/(π(1 + x²)), −∞ < x < ∞
• Gamma distribution ?
• Chi-squared distribution, χ²_n = Z_1² + Z_2² + · · · + Z_n² (chi-squared with n degrees of freedom)
  • here, Z_i, i = 1, 2, . . . , n are independent unit Normals
• Beta(n, m) distribution,
  f(x) = ((n + m − 1)! / ((n − 1)!(m − 1)!)) x^{n−1} (1 − x)^{m−1}, 0 < x < 1
• Exponential - the Von Neumann algorithm ?


Outline
1 Simulation of a Random Variable
What is Monte Carlo simulation?
2 General techniques for simulating continuous RVs
Types of Random variables
Inverse transformation method
The rejection method
Hazard rate method
3 Special techniques for simulating continuous RVs
4 Simulating from discrete distributions
Binomial Distribution
Poisson distribution
Alias method
5 Multivariate distributions & Stochastic Processes
6 Variance reduction techniques
Use of antithetic variables
7 Determining optimal number of runs
V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 39 / 72
Simulating from discrete distributions

Simulating X having probability mass function Pj



• Let X be a discrete RV, i.e. P{X = x_j} = P_j for all j = 0, 1, . . ., where Σ_j P_j = 1
• Inverse transform analogue; Let U be uniform (0,1)

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 40 / 72


Simulating from discrete distributions ...

Simulating X having probability mass function Pj


• X can be set as following:

      X =  x_1   if U < P_1
           x_2   if P_1 < U < P_1 + P_2
           ...
           x_j   if Σ_{i=1}^{j−1} P_i < U < Σ_{i=1}^{j} P_i
           ...

• As

      P{X = x_j} = P{ Σ_{i=1}^{j−1} P_i < U < Σ_{i=1}^{j} P_i } = P_j

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 41 / 72
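A minimal Python sketch of this inverse transform rule for a discrete pmf, assuming numpy; the function and argument names (values, probs) are my own.

import numpy as np

def discrete_inverse_transform(values, probs, seed=None):
    """Sample one value x_j with P{X = x_j} = probs[j] by the inverse
    transform method: return the first x_j whose cumulative sum exceeds U."""
    rng = np.random.default_rng(seed)
    u = rng.random()
    cumulative = 0.0
    for x, p in zip(values, probs):
        cumulative += p
        if u < cumulative:
            return x
    return values[-1]          # guard against round-off when sum(probs) is ~ 1

# Example: P{X=1}=0.2, P{X=2}=0.5, P{X=3}=0.3
sample = discrete_inverse_transform([1, 2, 3], [0.2, 0.5, 0.3])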


Simulating from Geometric distribution

Simulating X having GEOMETRIC probability mass function


• Let’s simulate X having Geometric pmf , i.e. P{X = i} = p(1 − p)^{i−1}
• Σ_{i=1}^{j−1} P{X = i} = 1 − P{X > j − 1} = 1 − (1 − p)^{j−1}
• We start with generating U, and then setting X to that value of j for
  which

      1 − (1 − p)^{j−1} < U < 1 − (1 − p)^{j}
      (1 − p)^{j−1} > 1 − U > (1 − p)^{j}

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 42 / 72


Simulating from Geometric distribution ...

Simulating X having GEOMETRIC probability mass function


• As U has same distribution as 1 − U, X can also be defined as

      X = min{ j : (1 − p)^{j} < U } = min{ j : j > log U / log(1 − p) }
        = 1 + [ log U / log(1 − p) ]

  where [·] denotes the integer part

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 43 / 72
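A minimal sketch of this one-line geometric generator in Python; the function name and the use of numpy are my own assumptions.

import math
import numpy as np

def geometric_sample(p, seed=None):
    """Simulate X ~ Geometric(p) on {1, 2, ...} as X = 1 + floor(log U / log(1-p))."""
    rng = np.random.default_rng(seed)
    u = rng.random()
    while u == 0.0:            # avoid log(0); rng.random() is uniform on [0, 1)
        u = rng.random()
    return 1 + math.floor(math.log(u) / math.log(1.0 - p))

samples = [geometric_sample(0.3) for _ in range(5)]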


Outline
1 Simulation of a Random Variable
What is Monte Carlo simulation?
2 General techniques for simulating continuous RVs
Types of Random variables
Inverse transformation method
The rejection method
Hazard rate method
3 Special techniques for simulating continuous RVs
4 Simulating from discrete distributions
Binomial Distribution
Poisson distribution
Alias method
5 Multivariate distributions & Stochastic Processes
6 Variance reduction techniques
Use of antithetic variables
7 Determining optimal number of runs
V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 44 / 72
Simulating a Binomial RV

Binomial discrete RV
• A binomial RV (n, p ) can be most easily simulated considering that
• it can be expressed as the sum of n independent Bernoulli RVs
• e.g., if U_1, U_2, . . . , U_n are independent Uniform (0,1) RVs, then letting

      X_i = 1 if U_i < p, and X_i = 0 otherwise

• then X = Σ_{i=1}^{n} X_i is a binomial RV with parameters (n, p)

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 45 / 72


Simulating a Binomial RV: Procedure

Improvements
• The previous method requires generation of n random numbers
• Instead of using the value of U itself, the previous algorithm only uses
  whether U_i < p or not
• The conditional distribution of U given that U < p is uniform in (0, p), &
  the conditional distribution of U given that U > p is uniform in (p, 1)

Procedure
Step_1 Let α = 1/p and β = 1/(1 − p)
Step_2 Set k = 0
Step_3 Generate a uniform RV U
Step_4 If k = n stop, else RESET k = k + 1
Step_5 If U ⩽ p set X_k = 1 and RESET U to αU. If U > p set X_k = 0 and
       RESET U to β(U − p). RETURN to Step_4

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 46 / 72
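A Python sketch of the single-U procedure above, assuming numpy; the naive sum-of-Bernoullis method of the previous slide appears as a one-line comment for comparison. Names are my own.

import numpy as np

def binomial_single_u(n, p, seed=None):
    """Simulate a Binomial(n, p) RV from ONE Uniform(0,1) number by repeatedly
    rescaling U with its conditional distribution (uniform on (0,p) or (p,1))."""
    rng = np.random.default_rng(seed)
    alpha, beta = 1.0 / p, 1.0 / (1.0 - p)   # Step 1
    total = 0                                 # Step 2 analogue: success counter
    u = rng.random()                          # Step 3: one uniform for the whole run
    for _ in range(n):                        # Steps 4-5, repeated n times
        if u <= p:
            total += 1                        # X_k = 1
            u = alpha * u                     # U | U <= p is uniform on (0, p): rescale to (0, 1)
        else:
            u = beta * (u - p)                # U | U > p is uniform on (p, 1): rescale to (0, 1)
    return total

# Naive alternative using n uniforms: X = (rng.random(n) < p).sum()
x = binomial_single_u(10, 0.3)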


Outline
1 Simulation of a Random Variable
What is Monte Carlo simulation?
2 General techniques for simulating continuous RVs
Types of Random variables
Inverse transformation method
The rejection method
Hazard rate method
3 Special techniques for simulating continuous RVs
4 Simulating from discrete distributions
Binomial Distribution
Poisson distribution
Alias method
5 Multivariate distributions & Stochastic Processes
6 Variance reduction techniques
Use of antithetic variables
7 Determining optimal number of runs
V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 47 / 72
Outline
1 Simulation of a Random Variable
What is Monte Carlo simulation?
2 General techniques for simulating continuous RVs
Types of Random variables
Inverse transformation method
The rejection method
Hazard rate method
3 Special techniques for simulating continuous RVs
4 Simulating from discrete distributions
Binomial Distribution
Poisson distribution
Alias method
5 Multivariate distributions & Stochastic Processes
6 Variance reduction techniques
Use of antithetic variables
7 Determining optimal number of runs
V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 48 / 72
Alias Method

Alias Method
• Let quantities, P, P(k ) , Q(k ) , k ⩽ n − 1 represent pmf on 1, 2, . . . , n
i.e. they are n-vectors of non-negative numbers summing to 1
• Additionally, each of P(k ) will have at most k non-zero components
• Each of Q(k ) will have at most 2 non-zero components

Lemma
Let P = {P_i , i = 1, 2, . . . , n} denote a pmf , then
(a) there exists an i, 1 ⩽ i ⩽ n, such that P_i < 1/(n − 1), and
(b) for this i, there exists a j, j ̸= i, such that P_i + P_j ⩾ 1/(n − 1)

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 49 / 72


Alias Method ...

Alias Method ...


• Any pmf P can be represented as an equally weighted mixture of
n − 1 pmf Q, i.e. for suitably defined Q (1) , Q (2) , . . . Q (n−1)

      P = (1/(n − 1)) Σ_{k=1}^{n−1} Q^(k)

Lemma
Let P = {Pi , i = 1, 2, . . . n} denote a probability mass function, then
1 There exists an i, 1 ⩽ i ⩽ n, such that Pi < 1/(n − 1), and
2 for this i there exists a j, j ̸= i, such that Pi + Pj ⩾ 1/(n − 1)

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 50 / 72


Example- Alias method ... (1)

Example: Alias Method ...


• Let P_1 = 7/16, P_2 = 1/2, P_3 = 1/16 stand for P. Here n = 3 and hence
  k = 1, 2
• Let’s presume the following two 2-point mass-functions Q^(1) & Q^(2) to
  constitute P.
• Q^(1) : all weight on 3 and 2
• Q^(2) : may be derived using Q^(1) and P_j = (1/2) (Q_j^(1) + Q_j^(2)), j = 1, 2, 3

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 51 / 72


Example- Alias method ...(2)

Example: Alias Method ...


• ⇒ Q_3^(2) = 0; and
• Q_3^(1) = 2P_3 = 1/8 ; Q_2^(1) = 1 − Q_3^(1) = 7/8 ; Q_1^(1) = 0
• Q_3^(2) = 0; Q_2^(2) = 2P_2 − 7/8 = 1/8 ; Q_1^(2) = 2P_1 = 7/8
• A similar procedure may be followed for the case of a 4-point mass-function
  P; n = 4 , k = 3

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 52 / 72


Alias Method: General procedure

General procedure: Alias Method


• This method outlines the procedure by which any n-point pmf P can be written as follows;
  we presume the corollary/Lemma above for i and j;

      P = (1/(n − 1)) Σ_{k=1}^{n−1} Q^(k)

• Let’s define Q^(1) concentrating on points i and j
• which will contain all of the mass for point i, by noting that in the representation
  above, Q_i^(k) = 0 for k = 2, 3, . . . , n − 1
• i.e. Q_i^(1) = (n − 1)P_i and so Q_j^(1) = 1 − (n − 1)P_i

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 53 / 72


Alias Method: General procedure ... (2)

General procedure : Alias Method


• Writing

      P = (1/(n − 1)) Q^(1) + ((n − 2)/(n − 1)) P^(n−1)

• here, P^(n−1) represents the remaining mass, hence

      P_i^(n−1) = 0
      P_j^(n−1) = ((n − 1)/(n − 2)) (P_j − (1/(n − 1)) Q_j^(1)) = ((n − 1)/(n − 2)) (P_i + P_j − 1/(n − 1))
      P_k^(n−1) = ((n − 1)/(n − 2)) P_k ,   k ̸= i or j

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 54 / 72


Alias Method: General procedure ... (3)

General procedure : Alias Method


• Repeating to expand P^(n−1) likewise,

      P^(n−1) = (1/(n − 2)) Q^(2) + ((n − 3)/(n − 2)) P^(n−2)

• Hence, the full expansion,

      P = (1/(n − 1)) Q^(1) + (1/(n − 1)) Q^(2) + ((n − 3)/(n − 1)) P^(n−2)

      P = (1/(n − 1)) (Q^(1) + Q^(2) + . . . + Q^(n−1))

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 55 / 72


Algorithm for Alias method

Algorithm
The P can now be simulated as follows
• Generate a random integer N equally likely to be any of 1, 2, . . . , (n − 1)
• If N is such that Q^(N) puts positive weight only on the points i_N and j_N ,
• then we can set X equal to i_N if a second random number is less than Q_{i_N}^(N),
  and equal to j_N otherwise:
Step_1 Generate U_1 and set N = 1 + [(n − 1)U_1 ]
Step_2 Generate U_2 and set

      X = i_N   if U_2 < Q_{i_N}^(N)
          j_N   otherwise

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 56 / 72
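A Python sketch of the alias decomposition and the two-step sampling above, assuming numpy. The particular choice of i (smallest remaining mass, with a fallback for a degenerate case) and j (largest remaining mass) is one convenient way to satisfy the Lemma, not necessarily the choice intended on the slides; all names are mine.

import numpy as np

def build_alias_tables(P, tol=1e-12):
    """Decompose an n-point pmf P into n-1 two-point pmfs Q^(1), ..., Q^(n-1)
    so that P = (1/(n-1)) * sum_k Q^(k), following the recursion on the slides.
    Returns lists (i_tab, j_tab, q_tab): Q^(N) puts mass q_tab[N] on point
    i_tab[N] and mass 1 - q_tab[N] on point j_tab[N] (points are 0-based)."""
    residual = np.asarray(P, dtype=float).copy()
    n = len(residual)
    i_tab, j_tab, q_tab = [], [], []
    for m in range(n, 2, -1):                        # stages m = n, n-1, ..., 3
        nz = np.flatnonzero(residual > tol)
        i = int(nz[np.argmin(residual[nz])])         # smallest remaining mass (Lemma (a))
        if (m - 1) * residual[i] > 1.0 + tol:        # degenerate case: every remaining mass
            i = int(np.flatnonzero(residual <= tol)[0])   # exceeds 1/(m-1); reuse an exhausted point
        masked = residual.copy()
        masked[i] = -1.0
        j = int(np.argmax(masked))                   # Lemma (b): P_i + P_j >= 1/(m-1)
        q = (m - 1) * residual[i]                    # Q_i = (m-1) P_i and Q_j = 1 - Q_i
        i_tab.append(i); j_tab.append(j); q_tab.append(float(q))
        # residual pmf P^(m-1): point i drops out, j keeps the leftover, rest rescaled
        residual[j] = (m - 1) / (m - 2) * (residual[i] + residual[j] - 1.0 / (m - 1))
        residual[i] = 0.0
        others = np.ones(n, dtype=bool)
        others[[i, j]] = False
        residual[others] *= (m - 1) / (m - 2)
    order = np.argsort(residual)                     # last stage: Q^(n-1) is the residual itself
    i_tab.append(int(order[-1])); j_tab.append(int(order[-2]))
    q_tab.append(float(residual[order[-1]]))
    return i_tab, j_tab, q_tab

def alias_sample(i_tab, j_tab, q_tab, rng):
    """Step 1: N = 1 + [(n-1)U1] picks one of the two-point pmfs uniformly.
       Step 2: return i_N if U2 < Q_{i_N}^(N), otherwise j_N."""
    N = int(len(i_tab) * rng.random())
    return i_tab[N] if rng.random() < q_tab[N] else j_tab[N]

# Example from the slides: P = (7/16, 1/2, 1/16); samples take values 0, 1 or 2
rng = np.random.default_rng()
tables = build_alias_tables([7/16, 1/2, 1/16])
x = alias_sample(*tables, rng)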


Outline
1 Simulation of a Random Variable
What is Monte Carlo simulation?
2 General techniques for simulating continuous RVs
Types of Random variables
Inverse transformation method
The rejection method
Hazard rate method
3 Special techniques for simulating continuous RVs
4 Simulating from discrete distributions
Binomial Distribution
Poisson distribution
Alias method
5 Multivariate distributions & Stochastic Processes
6 Variance reduction techniques
Use of antithetic variables
7 Determining optimal number of runs
V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 57 / 72
Multivariate Distributions

• Suppose that X_1, X_2, . . . , X_n have a given joint probability mass function

      P{X_1 = x_1, X_2 = x_2, . . . , X_n = x_n} = P{X_1 = x_1} P{X_2 = x_2 | X_1 = x_1} P{X_3 = x_3 | X_1 = x_1, X_2 = x_2} · · ·
                                                   P{X_n = x_n | X_{n−1} = x_{n−1}, . . . , X_2 = x_2, X_1 = x_1}

• Simulate a random vector having the above distribution by sequentially simulating
  from the CONDITIONAL distributions

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 58 / 72


Multivariate Distributions ...

Example-1
From a set of n elements– numbered 1, 2, . . . , n– we wish to simulate the
choice of a random SUBSET of size k, chosen such that each
of the nCk subsets is equally likely to be chosen.
Aliter-1 Generate U_1, U_2, . . . , U_n , and then
select the indices of the k smallest (or largest) of the n generated values

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 59 / 72
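A short Python sketch of Aliter-1 (argsort of n uniforms), assuming numpy; the function name is my own.

import numpy as np

def random_subset_by_sorting(n, k, seed=None):
    """Return a uniformly random size-k subset of {1, ..., n}: the indices
    of the k smallest of n independent Uniform(0,1) values."""
    rng = np.random.default_rng(seed)
    u = rng.random(n)
    return set(np.argsort(u)[:k] + 1)     # +1 so elements are numbered 1..n

subset = random_subset_by_sorting(10, 3)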


Multivariate Distributions ...

Example-2
Simulate discrete random variables, which are equally likely to take on the
values 1, 2, . . . , n (such a random variable is [nU] + 1),
stopping when k distinct values have been obtained.
This is convenient for small values of k

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 60 / 72


Multivariate Distributions ...

Example-3
Simulating a random subset of size k is equivalent to simulating a random vector I_1, I_2, . . . , I_n
such that P{I_{i_1} = I_{i_2} = . . . = I_{i_k} = 1, I_j = 0 otherwise} = 1 / nCk
After simulating the random vector, we choose our subset as all i for which I_i = 1, as follows

      P{I_1 = 1} = k/n

      P{I_2 = 1 | I_1} = k/(n − 1)        if I_1 = 0
                         (k − 1)/(n − 1)  if I_1 = 1
                       = (k − I_1)/(n − 1)

      P{I_j = 1 | I_1, I_2, . . . , I_{j−1}} = (k − Σ_{i=1}^{j−1} I_i) / (n − j + 1),   j = 2, . . . , n

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 61 / 72


Multivariate Distributions ...
Example-3 contd.
Hence, we can simulate a random subset of size k as follows,

      I_1 = 1 if U_1 < k/n,                                                   0 otherwise
      I_2 = 1 if U_2 < (k − I_1)/(n − 1),                                     0 otherwise
      ...
      I_j = 1 if U_j < (k − I_1 − I_2 − I_3 − . . . − I_{j−1})/(n − j + 1),   0 otherwise

This process stops when I_1 + I_2 + . . . + I_j = k, and the random subset is

      R = {i : I_i = 1}

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 62 / 72
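A Python sketch of this sequential-conditioning scheme (names are mine); it stops as soon as k elements have been selected.

import numpy as np

def random_subset_sequential(n, k, seed=None):
    """Simulate a uniformly random size-k subset of {1, ..., n} by drawing the
    indicators I_1, I_2, ... from their conditional distributions."""
    rng = np.random.default_rng(seed)
    chosen = []
    for j in range(1, n + 1):
        remaining_needed = k - len(chosen)
        if remaining_needed == 0:                    # stop once k elements are selected
            break
        # P{I_j = 1 | I_1..I_{j-1}} = (k - sum of earlier I's) / (n - j + 1)
        if rng.random() < remaining_needed / (n - j + 1):
            chosen.append(j)
    return set(chosen)

subset = random_subset_sequential(10, 3)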


Outline
1 Simulation of a Random Variable
What is Monte Carlo simulation?
2 General techniques for simulating continuous RVs
Types of Random variables
Inverse transformation method
The rejection method
Hazard rate method
3 Special techniques for simulating continuous RVs
4 Simulating from discrete distributions
Binomial Distribution
Poisson distribution
Alias method
5 Multivariate distributions & Stochastic Processes
6 Variance reduction techniques
Use of antithetic variables
7 Determining optimal number of runs
V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 63 / 72
Error minimization

Minimizing error while generating X


• Let X1 , X2 , . . . , Xn have given distribution,
• having density function f (x1 , x2 . . . , xn )
• Objective is to find the expected value of g(X), i.e.
• θ = E[g(X_1, X_2, . . . , X_n)]

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 64 / 72


Error minimization ...

Minimizing error while generating X


 
• Generate (X_1^(1), X_2^(1), . . . , X_n^(1)) and then Y^(1) = g(X_1^(1), X_2^(1), . . . , X_n^(1)) &
• Generate Y^(2) = g(X_1^(2), X_2^(2), . . . , X_n^(2)), and so on up to Y^(k)
• Ȳ = Σ_{i=1}^{k} Y_i / k, and E[Ȳ] = θ, E[(Ȳ − θ)²] = Var(Ȳ)
• Ȳ is an estimator of θ; we wish to minimize E[(Ȳ − θ)²]

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 65 / 72
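A sketch of the basic estimator Ȳ in Python; the integrand g, the sampler, and the toy example are illustrative assumptions of mine.

import numpy as np

def mc_estimate(g, sample_x, k, seed=None):
    """Estimate theta = E[g(X1,...,Xn)] by the sample mean of k replications.
    `sample_x(rng)` must return one simulated vector (X1,...,Xn)."""
    rng = np.random.default_rng(seed)
    y = np.array([g(sample_x(rng)) for _ in range(k)])
    y_bar = y.mean()                        # estimator of theta
    std_err = y.std(ddof=1) / np.sqrt(k)    # estimated sqrt of Var(Y-bar)
    return y_bar, std_err

# Toy example: theta = E[max(U1, U2)] = 2/3
theta_hat, se = mc_estimate(g=lambda x: max(x),
                            sample_x=lambda rng: rng.random(2),
                            k=10_000)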


Outline
1 Simulation of a Random Variable
What is Monte Carlo simulation?
2 General techniques for simulating continuous RVs
Types of Random variables
Inverse transformation method
The rejection method
Hazard rate method
3 Special techniques for simulating continuous RVs
4 Simulating from discrete distributions
Binomial Distribution
Poisson distribution
Alias method
5 Multivariate distributions & Stochastic Processes
6 Variance reduction techniques
Use of antithetic variables
7 Determining optimal number of runs
V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 66 / 72
Use of Anti-thetic variables

Let Y_1 and Y_2 be identically distributed RVs, with mean θ

      Var( (Y_1 + Y_2)/2 ) = (1/4) (Var(Y_1) + Var(Y_2) + 2 Cov(Y_1, Y_2))
                           = Var(Y_1)/2 + Cov(Y_1, Y_2)/2

Variance reduction- Using Antithetic variables

• It would be more advantageous for Y_1 and Y_2 NOT to be independent,
  BUT negatively correlated
• X_1, X_2, . . . , X_n are independent and simulated by the inverse method from U
• X_i is simulated from F_i^{-1}(U)

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 67 / 72


Use of Anti-thetic variables ...

Variance reduction- Using Antithetic variables


• Y_1 = g(F_1^{-1}(U_1), F_2^{-1}(U_2), . . . , F_n^{-1}(U_n))
• Since (1 − U) is also uniform (0, 1) and is negatively correlated with U,
• Y_2 = g(F_1^{-1}(1 − U_1), F_2^{-1}(1 − U_2), . . . , F_n^{-1}(1 − U_n))
• Hence, if Y_1 and Y_2 are negatively correlated, then generating Y_2 by
  this method would lead to smaller variance

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 68 / 72
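A Python sketch comparing the plain and antithetic estimators of E[g(U)] for a monotone g; the example g(u) = e^u and the sample sizes are my own choices.

import numpy as np

def plain_and_antithetic(g, n_pairs, seed=0):
    """Compare the plain Monte Carlo estimate of E[g(U)] (2*n_pairs independent
    uniforms) with the antithetic estimate that pairs each U with 1 - U."""
    rng = np.random.default_rng(seed)
    u_all = rng.random(2 * n_pairs)                  # plain: 2*n_pairs evaluations
    plain = g(u_all).mean()
    u = rng.random(n_pairs)                          # antithetic: reuse each U as 1 - U
    antithetic = ((g(u) + g(1.0 - u)) / 2.0).mean()
    return plain, antithetic

# g monotone in U, e.g. g(u) = exp(u); the true value is e - 1 ~ 1.71828
plain, anti = plain_and_antithetic(np.exp, n_pairs=5_000)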


Outline
1 Simulation of a Random Variable
What is Monte Carlo simulation?
2 General techniques for simulating continuous RVs
Types of Random variables
Inverse transformation method
The rejection method
Hazard rate method
3 Special techniques for simulating continuous RVs
4 Simulating from discrete distributions
Binomial Distribution
Poisson distribution
Alias method
5 Multivariate distributions & Stochastic Processes
6 Variance reduction techniques
Use of antithetic variables
7 Determining optimal number of runs
V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 69 / 72
Determining the number of runs

Number of RUNs
• Let us use simulation to generate i.i.d. (independent, identically
  distributed) Y_1, Y_2, . . . , Y_r having mean µ and variance σ²
• Ȳ_r = (Y^(1) + Y^(2) + · · · + Y^(r)) / r ; we use Ȳ_r as an estimate of µ
• The precision of this estimate is Var(Ȳ_r) = E[(Ȳ_r − µ)²] = σ²/r

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 70 / 72


Determining the number of runs ...

Number of RUNs
• We wish to choose r sufficiently large so that σ²/r is acceptably small
• BUT σ² is not known in advance
• To get around this, we initially simulate k times to evaluate σ², and
  use the simulated Y^(1), Y^(2), . . . , Y^(k) to estimate σ² by the sample variance

      Σ_{i=1}^{k} (Y^(i) − Ȳ_k)² / (k − 1)

• Based on this estimate, the rest of the r − k values can be simulated

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 71 / 72
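A Python sketch of this two-stage rule; the pilot size k, the precision target, and the toy sampler are illustrative assumptions of mine.

import math
import numpy as np

def choose_number_of_runs(simulate_y, target_var, k=100, seed=None):
    """Two-stage run-length rule: a pilot of k runs estimates sigma^2 by the
    sample variance, then r = ceil(sigma_hat^2 / target_var) total runs."""
    rng = np.random.default_rng(seed)
    pilot = np.array([simulate_y(rng) for _ in range(k)])
    sigma2_hat = pilot.var(ddof=1)              # sample variance, divisor k-1
    r = max(k, math.ceil(sigma2_hat / target_var))
    extra = np.array([simulate_y(rng) for _ in range(r - k)])
    y_all = np.concatenate([pilot, extra])
    return y_all.mean(), r

# Illustration: Y = U^2, so mu = 1/3; ask for Var(Y_bar) <= 1e-5
mu_hat, r_used = choose_number_of_runs(lambda rng: rng.random() ** 2,
                                       target_var=1e-5)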


Thanks!

Best Wishes for professional growth!!

V. Sahula (MNIT Jaipur) Simulation of an Random Variable Modeling & Optimization 72 / 72
