
Random Processes

Saravanan Vijayakumaran
sarva@ee.iitb.ac.in

Department of Electrical Engineering


Indian Institute of Technology Bombay

April 10, 2015

Random Process
Definition
An indexed collection of random variables {Xt : t ∈ T }.

Discrete-time Random Process


A random process where the index set T = Z or {0, 1, 2, 3, . . .}.

Example: Random walk


T = {0, 1, 2, 3, . . .}, X0 = 0, Xn independent and equally likely to be ±1 for n ≥ 1

$$S_n = \sum_{i=0}^{n} X_i$$
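To build intuition, here is a minimal simulation sketch in Python/NumPy (the step count and seed are arbitrary choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

n_steps = 100
# X_0 = 0; X_n is +1 or -1 with equal probability for n >= 1
X = np.concatenate(([0], rng.choice([-1, 1], size=n_steps)))
# S_n is the running sum of X_0, ..., X_n
S = np.cumsum(X)

print(S[:10])  # positions of the walk at the first few times
```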

Continuous-time Random Process


A random process where the index set T = R or [0, ∞). The notation X (t) is
used to represent continuous-time random processes.

Example: Thermal Noise

Realization of a Random Process
• The outcome of an experiment is specified by a sample point ω in the
sample space Ω
• A realization of a random variable X is its value X (ω)
• A realization of a random process Xt is the function Xt (ω) of t
• A realization is also called a sample function of the random process.

Example
Consider Ω = [0, 1]. For each ω ∈ Ω, consider its dyadic expansion

$$\omega = \sum_{n=1}^{\infty} \frac{d_n(\omega)}{2^n} = 0.d_1(\omega)\, d_2(\omega)\, d_3(\omega) \cdots$$

where each dn (ω) is either 0 or 1.


An infinite sequence of coin tosses, with Heads recorded as 0 and Tails as 1, can be associated with each ω via

$$X_n(\omega) = d_n(\omega)$$
For each ω ∈ Ω, we get a realization of this random process.
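As a concrete sketch (illustrative code, not from the slides; `dyadic_digits` is a hypothetical helper), the digits dn(ω) can be extracted by repeated doubling. Note that a floating-point ω only yields finitely many reliable digits:

```python
def dyadic_digits(omega, n):
    """Return the first n dyadic digits d_1(omega), ..., d_n(omega) of omega in [0, 1)."""
    digits = []
    for _ in range(n):
        omega *= 2
        d = int(omega)   # next binary digit, 0 or 1
        digits.append(d)
        omega -= d
    return digits

# omega = 0.625 = 0.101 in binary, i.e. tosses T, H, T, H, H (0 = Heads, 1 = Tails)
print(dyadic_digits(0.625, 5))  # [1, 0, 1, 0, 0]
```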

Specification of a Random Process
• A random process is specified by the joint cumulative distribution of the
random variables
X (t1 ), X (t2 ), . . . , X (tn )
for any set of sample times {t1 , t2 , . . . , tn } and any n ∈ N

$$F_{X(t_1), X(t_2), \ldots, X(t_n)}(x_1, x_2, \ldots, x_n) = \Pr\left[X(t_1) \le x_1,\, X(t_2) \le x_2,\, \ldots,\, X(t_n) \le x_n\right]$$

• For continuous-time random processes, the joint probability density is sufficient
• For discrete-time random processes, the joint probability mass function is sufficient
• Without additional restrictions, this requires specifying a lot of joint
distributions
• One restriction which simplifies process specification is stationarity

Stationary Random Process
Definition
A random process X (t) is said to be stationary in the strict sense or strictly
stationary if the joint distribution of X (t1 ), X (t2 ), . . . , X (tk ) is the same as the
joint distribution of X (t1 + τ ), X (t2 + τ ), . . . , X (tk + τ ) for all time shifts τ , all k ,
and all observation instants t1 , . . . , tk .

$$F_{X(t_1), \ldots, X(t_k)}(x_1, \ldots, x_k) = F_{X(t_1 + \tau), \ldots, X(t_k + \tau)}(x_1, \ldots, x_k)$$

Properties
• A stationary random process is statistically indistinguishable from a
delayed version of itself.
• For k = 1, we have
$$F_{X(t)}(x) = F_{X(t+\tau)}(x)$$
for all t and τ . The first order distribution is independent of time.
• For k = 2 and τ = −t1 , we have
$$F_{X(t_1), X(t_2)}(x_1, x_2) = F_{X(0), X(t_2 - t_1)}(x_1, x_2)$$

for all t1 and t2 . The second order distribution depends only on t2 − t1 .


Mean Function
• The mean of a random process X (t) is the expectation of the random
variable obtained by observing the process at time t
$$\mu_X(t) = E[X(t)] = \int_{-\infty}^{\infty} x\, f_{X(t)}(x)\, dx$$

• For a strictly stationary random process X (t), the mean is a constant

$$\mu_X(t) = \mu \quad \text{for all } t$$

Example
X (t) = cos (2πft + Θ), Θ ∼ U[−π, π]. µX (t) =?

Example
Xn = Z1 + · · · + Zn , n = 1, 2, . . .
where Zi are i.i.d. with zero mean and variance σ². µX (n) = ?
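Both answers can be checked by Monte Carlo simulation. The sketch below (with assumed values f = 1, t = 0.3, n = 50, σ = 1 for illustration) should print values near 0 in both cases, since E[cos(2πft + Θ)] = 0 for Θ ∼ U[−π, π] and the Zi have zero mean:

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 200_000

# Example 1: X(t) = cos(2*pi*f*t + Theta), Theta ~ U[-pi, pi]
f, t = 1.0, 0.3
theta = rng.uniform(-np.pi, np.pi, size=n_trials)
print(np.mean(np.cos(2 * np.pi * f * t + theta)))  # ~ 0 for every t

# Example 2: X_n = Z_1 + ... + Z_n with Z_i i.i.d., zero mean, variance sigma^2
n, sigma = 50, 1.0
Z = rng.normal(0.0, sigma, size=(n_trials, n))
print(np.mean(Z.sum(axis=1)))                      # ~ 0 = mu_X(n)
```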

Autocorrelation Function
• The autocorrelation function of a random process X (t) is defined as
$$R_X(t_1, t_2) = E[X(t_1) X(t_2)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x_1 x_2\, f_{X(t_1), X(t_2)}(x_1, x_2)\, dx_1\, dx_2$$

• For a strictly stationary random process X (t), the autocorrelation function depends only on the time difference t2 − t1

$$R_X(t_1, t_2) = R_X(0, t_2 - t_1) \quad \text{for all } t_1, t_2$$

In this case, RX (0, t2 − t1 ) is simply written as RX (t2 − t1 )

Example
X (t) = cos (2πft + Θ), Θ ∼ U[−π, π]. RX (t1 , t2 ) =?

Example
Xn = Z1 + · · · + Zn , n = 1, 2, . . .
where Zi are i.i.d. with zero mean and variance σ². RX (n1 , n2 ) = ?
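As a Monte Carlo sketch (illustrative parameter values, not from the slides): for the sinusoid, averaging over Θ gives RX (t1 , t2 ) = ½ cos(2πf (t1 − t2 )), and for the sum process RX (n1 , n2 ) = σ² min(n1 , n2 ); the estimates below should match these closed forms:

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials = 200_000

# Sinusoid: R_X(t1, t2) = 0.5 * cos(2*pi*f*(t1 - t2))
f, t1, t2 = 1.0, 0.3, 0.8
theta = rng.uniform(-np.pi, np.pi, size=n_trials)
est = np.mean(np.cos(2 * np.pi * f * t1 + theta) *
              np.cos(2 * np.pi * f * t2 + theta))
print(est, 0.5 * np.cos(2 * np.pi * f * (t1 - t2)))

# Sum process: R_X(n1, n2) = sigma^2 * min(n1, n2)
n1, n2, sigma = 20, 50, 1.0
S = np.cumsum(rng.normal(0.0, sigma, size=(n_trials, n2)), axis=1)
print(np.mean(S[:, n1 - 1] * S[:, n2 - 1]), sigma**2 * min(n1, n2))
```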

Wide-Sense Stationary Random Process
Definition
A random process X (t) is said to be wide-sense stationary or weakly
stationary or second-order stationary if

$$\mu_X(t) = \mu_X(0) \quad \text{for all } t, \qquad \text{and} \qquad R_X(t_1, t_2) = R_X(t_1 - t_2, 0) \quad \text{for all } t_1, t_2.$$

Remarks
• A strictly stationary random process is also wide-sense stationary if the
first and second order moments exist.
• A wide-sense stationary random process need not be strictly stationary.

Example
Is the following random process wide-sense stationary?

X (t) = A cos (2πfc t + Θ)

where A and fc are constants and Θ is uniformly distributed on [−π, π].
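The slide poses this as an exercise; as a sketch of the standard calculation (added here for completeness), averaging over Θ ∼ U[−π, π] gives

$$\mu_X(t) = \frac{A}{2\pi} \int_{-\pi}^{\pi} \cos(2\pi f_c t + \theta)\, d\theta = 0$$

$$R_X(t_1, t_2) = \frac{A^2}{2\pi} \int_{-\pi}^{\pi} \cos(2\pi f_c t_1 + \theta) \cos(2\pi f_c t_2 + \theta)\, d\theta = \frac{A^2}{2} \cos\bigl(2\pi f_c (t_1 - t_2)\bigr)$$

so the mean is constant and the autocorrelation depends only on t1 − t2: the process is wide-sense stationary.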


Properties of the Autocorrelation Function
• Consider the autocorrelation function of a wide-sense stationary
random process X (t)

RX (τ ) = E [X (t + τ )X (t)]

• RX (τ ) is an even function of τ
RX (τ ) = RX (−τ )

• RX (τ ) has maximum magnitude at τ = 0 (a short proof follows this list)

$$|R_X(\tau)| \le R_X(0)$$

• The autocorrelation function measures the interdependence of the two random variables obtained by observing X (t) at times τ apart
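The peak at τ = 0 is a consequence of the Cauchy–Schwarz inequality (a standard one-line argument, added here for completeness):

$$|R_X(\tau)| = \bigl| E[X(t + \tau) X(t)] \bigr| \le \sqrt{E[X^2(t + \tau)]\, E[X^2(t)]} = R_X(0)$$

where the last step uses E[X²(t)] = RX (0) for a wide-sense stationary process.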

Ergodic Processes
• Let X (t) be a wide-sense stationary random process with mean µX and
autocorrelation function RX (τ ) (also called the ensemble averages)
• Let x(t) be a realization of X (t)
• For an observation interval [−T , T ], the time average of x(t) is given by
$$\mu_x(T) = \frac{1}{2T} \int_{-T}^{T} x(t)\, dt$$

• The process X (t) is said to be ergodic in the mean if µx (T ) converges to µX in the mean-square sense as T → ∞ (see the sketch after these definitions)
• For an observation interval [−T , T ], the time-averaged autocorrelation
function is given by
$$R_x(\tau, T) = \frac{1}{2T} \int_{-T}^{T} x(t + \tau)\, x(t)\, dt$$

• The process X (t) is said to be ergodic in the autocorrelation function if Rx (τ, T ) converges to RX (τ ) in the mean-square sense as T → ∞
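As an illustrative sketch of ergodicity in the mean (assumed frequency f = 1; the random-phase sinusoid happens to be ergodic in the mean), the time average of a single realization of X (t) = cos(2πft + Θ) approaches the ensemble mean µX = 0 as T grows:

```python
import numpy as np

rng = np.random.default_rng(3)

# One realization: fix a single sample point omega, i.e. one phase Theta
f = 1.0
theta = rng.uniform(-np.pi, np.pi)

T, dt = 500.0, 0.01
t = np.arange(-T, T, dt)
x = np.cos(2 * np.pi * f * t + theta)

# Riemann approximation of (1 / 2T) * integral of x(t) over [-T, T]
mu_x_T = x.mean()
print(mu_x_T)  # ~ 0 = mu_X, and shrinks further as T grows
```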

Passing a Random Process through an LTI System

X (t) → LTI System → Y (t)

• Consider a linear time-invariant (LTI) system with impulse response h(t) whose input and output are the random processes X (t) and Y (t)

$$Y(t) = \int_{-\infty}^{\infty} h(\tau)\, X(t - \tau)\, d\tau$$

• In general, it is difficult to characterize Y (t) in terms of X (t)


• If X (t) is a wide-sense stationary random process, then Y (t) is also wide-sense stationary, with

$$\mu_Y(t) = \mu_X \int_{-\infty}^{\infty} h(\tau)\, d\tau$$

$$R_Y(\tau) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\tau_1)\, h(\tau_2)\, R_X(\tau - \tau_1 + \tau_2)\, d\tau_1\, d\tau_2$$
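A discrete-time sketch of the mean relation (illustrative choices, not from the slides: an i.i.d. sequence stands in for a WSS input, and a 5-tap moving average stands in for h): the output mean should equal µX times the sum of the filter taps, the discrete analogue of µY = µX ∫ h(τ) dτ.

```python
import numpy as np

rng = np.random.default_rng(4)

# WSS input: i.i.d. samples with mean mu_X = 2.0 (i.i.d. is a special case of WSS)
mu_X = 2.0
x = mu_X + rng.normal(0.0, 1.0, size=100_000)

# LTI system: 5-tap moving-average filter (discrete impulse response h)
h = np.ones(5) / 5.0
y = np.convolve(x, h, mode="valid")

print(y.mean(), mu_X * h.sum())  # both ~ 2.0
```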

Reference
• Chapter 1, Communication Systems, Simon Haykin,
Fourth Edition, Wiley-India, 2001.
