Unit-4 and 5
Classification of random processes: Random processes are classified into four types based on whether the random variable X and the time t are continuous or discrete, as follows.
1. Continuous Random Process: Both the random variable X and the time t are continuous, as in a noise voltage observed for all time.
2. Discrete Random Process: The random variable X takes only discrete values while the time t is continuous. The figure below shows a discrete random process. A digitally encoded signal has only two discrete values, a positive level and a negative level, but time is continuous, so it is a discrete random process.
3. Continuous Random Sequence: The random variable X is continuous while the time t takes only discrete values, as obtained by sampling a continuous random process at discrete time instants.
4. Discrete Random Sequence: Both the random variable X and the time t are discrete. It can be obtained by sampling and quantizing a random signal. This is called a discrete random sequence and is mostly used in digital signal processing applications. The amplitude of the sequence can be quantized into two levels or into multiple levels, as shown in figures (d) and (e) below.
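As a concrete illustration of the last class, the following sketch (an illustrative assumption, using a hypothetical noisy sinusoid rather than any signal from the text) samples and quantizes a signal so that both time and amplitude become discrete:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical continuous-time sample function observed on a fine grid
t = np.linspace(0.0, 1.0, 1000)
x = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.standard_normal(t.size)

# Sampling: keep every 50th point, so time becomes discrete
x_sampled = x[::50]

# Quantizing to two levels (sign), so amplitude becomes discrete as well
x_quantized = np.where(x_sampled >= 0.0, 1.0, -1.0)

print(sorted(set(x_quantized)))  # only the two quantization levels survive
```

Sampling alone would give a continuous random sequence; the quantization step is what makes the result a discrete random sequence.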
Joint distribution functions of a random process: Consider a random process X(t). For a single random variable X1 = X(t1) at time t1, the cumulative distribution function is defined as
FX(x1; t1) = P{X(t1) ≤ x1}, where x1 is any real number. The function FX(x1; t1) is known as the first order distribution function of X(t). For two random variables X(t1) = X1 and X(t2) = X2 at time instants t1 and t2, the second order joint distribution function of the random process X(t) is given by
FX(x1, x2; t1, t2) = P{X(t1) ≤ x1, X(t2) ≤ x2}. In general, for N random variables X(ti) = Xi, i = 1, 2, …, N at N time instants, the Nth order joint distribution function of X(t) is defined as
FX(x1, x2, …, xN; t1, t2, …, tN) = P{X(t1) ≤ x1, X(t2) ≤ x2, …, X(tN) ≤ xN}.
Joint density functions of a random process: The joint density functions of a random process can be obtained from the derivatives of the corresponding distribution functions.
1. First order density function: fX(x1; t1) = ∂FX(x1; t1)/∂x1
2. Second order density function: fX(x1, x2; t1, t2) = ∂²FX(x1, x2; t1, t2)/(∂x1 ∂x2)
3. Nth order density function: fX(x1, x2, …, xN; t1, t2, …, tN) = ∂ᴺFX(x1, x2, …, xN; t1, t2, …, tN)/(∂x1 ∂x2 … ∂xN)
Independent random processes: Consider a random process X(t). Let X(ti) = xi, i = 1, 2, …, N be N random variables defined at time instants t1, t2, …, tN with density functions fX(x1; t1), fX(x2; t2), …, fX(xN; tN). If the random process X(t) is statistically independent, then the Nth order joint density function is equal to the product of the individual density functions, i.e.
fX(x1, x2, …, xN; t1, t2, …, tN) = fX(x1; t1) fX(x2; t2) … fX(xN; tN). Similarly, the joint distribution function is the product of the individual distribution functions.
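The factorization above can be checked numerically. The sketch below is illustrative only: it uses two independent binary random variables as stand-ins for samples X(t1), X(t2) of an independent process, and compares the empirical joint probability with the product of the empirical marginals:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Two independent discrete random variables standing in for X(t1), X(t2)
x1 = rng.integers(0, 2, n)  # values in {0, 1}
x2 = rng.integers(0, 2, n)

# Empirical joint probability P(X1=1, X2=1) vs product of marginals
p_joint = np.mean((x1 == 1) & (x2 == 1))
p_prod = np.mean(x1 == 1) * np.mean(x2 == 1)

print(abs(p_joint - p_prod) < 0.01)  # joint ≈ product of marginals
```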
Statistical properties of random processes: The following are the statistical properties of random processes.
1. Mean: The mean value of a random process X(t) is E[X(t)] = ∫_{−∞}^{∞} x fX(x; t) dx.
2. Autocorrelation: For random variables X(t1) and X(t2) defined at time instants t1 and t2, the autocorrelation of X(t) is
RXX(t1, t2) = E[X(t1) X(t2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 fX(x1, x2; t1, t2) dx1 dx2
3. Cross correlation: Consider two random processes X(t) and Y(t) with random variables X and Y defined at time instants t1 and t2 respectively, and joint density function fXY(x, y; t1, t2). Then the correlation of X and Y,
RXY(t1, t2) = E[XY] = E[X(t1) Y(t2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x y fXY(x, y; t1, t2) dx dy,
is called the cross correlation function of the random processes X(t) and Y(t).
Stationary Processes: A random process is said to be stationary if all its statistical properties, such as mean, moments and variance, do not change with time. Stationarity, which depends on the density functions, has different levels or orders.
K.RAVEENDRA, Associate Professor, In-charge Examinations Branch, SVEW, TPT. Page 3
Probability Theory and Stochastic Processes (13A04304)
Strict sense stationary (SSS) processes: A random process X(t) is said to be strict sense stationary if its Nth order joint density function does not change with a shift in the time origin, i.e.
fX(x1, x2, …, xN; t1, t2, …, tN) = fX(x1, x2, …, xN; t1+∆t, t2+∆t, …, tN+∆t) for all t1, t2, …, tN and ∆t. A process that is stationary to all orders N = 1, 2, … is called a strict sense stationary process. A process is wide sense stationary (WSS) if its mean is constant and its autocorrelation depends only on the time difference τ = t2 − t1. Note that an SSS process is also a WSS process, but the reverse is not true.
Time Average Function: Consider a random process X(t). Let x(t) be a sample function that exists for all time at a fixed point of the given sample space S. The average value of x(t) taken over all time is called the time average (or time mean value) of x(t):
x̄ = A[x(t)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) dt
Time autocorrelation function: Consider a random process X(t). The time average of the product x(t) x(t+τ) is called the time autocorrelation function of x(t):
Rxx(τ) = A[x(t) x(t+τ)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) x(t+τ) dt
Time mean square function: If τ = 0, the time average of x²(t) is called the time mean square value of x(t):
A[x²(t)] = Rxx(0) = lim_{T→∞} (1/2T) ∫_{−T}^{T} x²(t) dt
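These time averages can be estimated on a discretized record. The sketch below is illustrative and assumes a hypothetical sample function (a DC level of 2 plus zero-mean unit-variance noise), approximating the limits above by finite sums:

```python
import numpy as np

rng = np.random.default_rng(2)

# One sample function x(t) on a discrete grid: DC level 2 plus zero-mean noise
x = 2.0 + rng.standard_normal(100_000)

# Time average A[x(t)]: the mean value of the sample function
x_bar = x.mean()

# Time mean square value: the time autocorrelation at lag tau = 0
mean_square = np.mean(x * x)

print(round(x_bar, 1), round(mean_square, 1))
```

For this construction the time average should be close to 2 and the mean square value close to 2² + 1 = 5.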
Time cross correlation function: Let X(t) and Y(t) be two random processes with sample functions x(t) and y(t) respectively. The time average of the product x(t) y(t+τ) is called the time cross correlation function of x(t) and y(t):
Rxy(τ) = A[x(t) y(t+τ)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) y(t+τ) dt
Ergodic Theorem and Ergodic Process: The Ergodic theorem states that for any random process X(t), all time averages of sample functions x(t) are equal to the corresponding statistical (ensemble) averages of X(t), i.e. x̄ = X̄ and Rxx(τ) = RXX(τ). Random processes that satisfy the Ergodic theorem are called Ergodic processes.
Joint Ergodic Process: Let X(t) and Y(t) be two random processes with sample functions x(t) and y(t) respectively. The two random processes are said to be jointly Ergodic if they are individually Ergodic and their time cross correlation function equals the statistical cross correlation function.
Mean Ergodic Random Process: A random process X(t) is said to be mean Ergodic if the time average of any sample function x(t) is equal to its statistical average X̄, which is constant, and the probability of all other sample functions is equal to one, i.e. E[X(t)] = X̄ = A[x(t)] = x̄ with probability one for all x(t).
Autocorrelation Ergodic Process: A stationary random process X(t) is said to be
Autocorrelation Ergodic if and only if the time autocorrelation function of any sample
function x(t) is equal to the statistical autocorrelation function of X(t). i.e. A[x(t) x(t+τ)] =
E[X(t) X(t+τ)] or Rxx(τ) = RXX(τ).
Cross Correlation Ergodic Process: Two stationary random processes X(t) and Y(t) are
said to be cross correlation Ergodic if and only if the time cross correlation function of the sample
functions x(t) and y(t) is equal to the statistical cross correlation function of X(t) and Y(t). i.e.
A[x(t) y(t+τ)] = E[X(t) Y(t+τ)] or Rxy(τ) = RXY(τ).
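Ergodicity can be illustrated numerically. The sketch below assumes a random-phase sinusoid, a standard textbook example of a mean-Ergodic process (not a process defined in this text), and compares the time average of one sample function with the ensemble average at a fixed instant:

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 2 * np.pi * 500, 200_000)  # long observation window

# Random-phase sinusoid X(t) = cos(t + Theta), Theta uniform on (0, 2*pi)
theta = rng.uniform(0.0, 2 * np.pi)
x = np.cos(t + theta)        # one sample function

time_avg = x.mean()          # A[x(t)], approximated over the window

# Ensemble average E[X(t0)] at a fixed time, over many realizations of Theta
thetas = rng.uniform(0.0, 2 * np.pi, 100_000)
ensemble_avg = np.cos(t[0] + thetas).mean()

print(abs(time_avg) < 0.01 and abs(ensemble_avg) < 0.02)
```

Both averages are close to zero, the common mean, as the Ergodic theorem requires for this process.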
Properties of Autocorrelation Function: Consider a WSS random process X(t) with autocorrelation function RXX(τ) = E[X(t) X(t+τ)]. The following are its properties.
1. The mean square value of the process is RXX(0) = E[X²(t)].
2. The autocorrelation function is bounded by its value at the origin: |RXX(τ)| ≤ RXX(0).
Proof: Consider E[(X(t) ± X(t+τ))²] ≥ 0. Expanding, E[X²(t)] + E[X²(t+τ)] ± 2E[X(t) X(t+τ)] ≥ 0, i.e. 2RXX(0) ± 2RXX(τ) ≥ 0, so RXX(0) ≥ |RXX(τ)|, hence proved.
3. RXX(τ) is an even function of τ, i.e. RXX(−τ) = RXX(τ).
Proof: We know that RXX(τ) = E[X(t) X(t+τ)].
Replacing τ by −τ, RXX(−τ) = E[X(t) X(t−τ)].
Let u = t − τ, so t = u + τ.
Therefore RXX(−τ) = E[X(u+τ) X(u)] = E[X(u) X(u+τ)] = RXX(τ), hence proved.
4. If X(t) has no periodic components, then lim_{|τ|→∞} RXX(τ) = X̄².
Proof: RXX(τ) = E[X(t) X(t+τ)] = E[X(t1) X(t2)]. Since the process has no periodic components, as |τ| → ∞ the random variables X(t1) and X(t2) become independent, i.e.
lim_{|τ|→∞} RXX(τ) = E[X(t1) X(t2)] = E[X(t1)] E[X(t2)].
Since X(t) is Ergodic, E[X(t1)] = E[X(t2)] = X̄.
Therefore lim_{|τ|→∞} RXX(τ) = X̄², hence proved.
5. If X(t) is periodic then its autocorrelation function is also periodic.
Proof: Consider a random process X(t) which is periodic with period T0. Then X(t) = X(t ± T0) and X(t+τ) = X(t+τ ± T0). Now RXX(τ) = E[X(t) X(t+τ)], so
RXX(τ ± T0) = E[X(t) X(t+τ ± T0)].
Given X(t) is WSS, RXX(τ ± T0) = E[X(t) X(t+τ)] = RXX(τ).
Therefore RXX(τ) is periodic, hence proved.
6. If X(t) is Ergodic, has zero mean, and has no periodic components, then lim_{|τ|→∞} RXX(τ) = 0.
Proof: From property 4, lim_{|τ|→∞} RXX(τ) = X̄², which is the square of the mean value of X(t), and the mean is given as zero.
Therefore lim_{|τ|→∞} RXX(τ) = 0, hence proved.
7. The autocorrelation function of a random process, RXX(τ), cannot have an arbitrary shape.
Proof: The autocorrelation function RXX(τ) is an even function of τ with its maximum value at the origin. Hence the autocorrelation function cannot have an arbitrary shape, hence proved.
8. If a random process X(t) with zero mean has a DC component A, i.e. Y(t) = A + X(t), then RYY(τ) = A² + RXX(τ).
Proof: Given Y(t) = A + X(t),
RYY(τ) = E[Y(t) Y(t+τ)] = E[(A + X(t))(A + X(t+τ))]
= E[A² + A X(t) + A X(t+τ) + X(t) X(t+τ)]
= A² + A E[X(t)] + A E[X(t+τ)] + E[X(t) X(t+τ)]
= A² + 0 + 0 + RXX(τ).
Therefore RYY(τ) = A² + RXX(τ), hence proved.
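The DC-component property can be verified with time-average estimates. The sketch below is illustrative: it assumes X(t) is zero-mean white noise (a choice for simplicity, not from the text) and checks RYY(τ) ≈ A² + RXX(τ) at a few lags:

```python
import numpy as np

rng = np.random.default_rng(4)
n, a = 500_000, 3.0

x = rng.standard_normal(n)   # zero-mean process X(t) (white, for simplicity)
y = a + x                    # Y(t) = A + X(t)

def r_hat(z, w, lag):
    """Estimate E[Z(t) W(t+lag)] by a time average over one long record."""
    return np.mean(z[:n - lag] * w[lag:])

# Check R_YY(tau) = A^2 + R_XX(tau) at several lags
checks = [abs(r_hat(y, y, lag) - (a**2 + r_hat(x, x, lag))) < 0.05
          for lag in (0, 1, 5)]
print(checks)
```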
9. If a random process Z(t) is the sum of two random processes X(t) and Y(t), i.e. Z(t) = X(t) + Y(t), then RZZ(τ) = RXX(τ) + RXY(τ) + RYX(τ) + RYY(τ).
Proof: Given Z(t) = X(t) + Y(t),
RZZ(τ) = E[Z(t) Z(t+τ)]
= E[(X(t) + Y(t))(X(t+τ) + Y(t+τ))]
= E[X(t) X(t+τ) + X(t) Y(t+τ) + Y(t) X(t+τ) + Y(t) Y(t+τ)]
= E[X(t) X(t+τ)] + E[X(t) Y(t+τ)] + E[Y(t) X(t+τ)] + E[Y(t) Y(t+τ)].
Therefore RZZ(τ) = RXX(τ) + RXY(τ) + RYX(τ) + RYY(τ), hence proved.
Properties of Cross Correlation Function: Consider two random processes X(t) and Y(t)
are at least jointly WSS. And the cross correlation function is a function of the time
difference τ = t2-t1. Then the following are the properties of cross correlation function.
1. RXY(τ) = RYX(−τ) (symmetry property).
Proof: We know that RXY(τ) = E[X(t) Y(t+τ)] and RYX(τ) = E[Y(t) X(t+τ)].
Replacing τ by −τ, RYX(−τ) = E[Y(t) X(t−τ)]. Let u = t − τ, so t = u + τ; then
RYX(−τ) = E[Y(u+τ) X(u)] = E[X(u) Y(u+τ)] = RXY(τ), hence proved.
2. |RXY(τ)| ≤ √(RXX(0) RYY(0)).
Proof: Consider two random processes X(t) and Y(t) with autocorrelation functions RXX(τ) and RYY(τ), and consider the inequality
E[( X(t)/√RXX(0) ± Y(t+τ)/√RYY(0) )²] ≥ 0.
Expanding,
E[X²(t)]/RXX(0) + E[Y²(t+τ)]/RYY(0) ± 2 E[X(t) Y(t+τ)]/√(RXX(0) RYY(0)) ≥ 0.
We know that E[X²(t)] = RXX(0), E[Y²(t+τ)] = RYY(0) and E[X(t) Y(t+τ)] = RXY(τ).
Therefore 1 + 1 ± 2 RXY(τ)/√(RXX(0) RYY(0)) ≥ 0,
i.e. 1 ± RXY(τ)/√(RXX(0) RYY(0)) ≥ 0,
or |RXY(τ)| ≤ √(RXX(0) RYY(0)), hence proved.
Hence the absolute value of the cross correlation function is always less than or equal to the geometric mean of the two autocorrelation functions at the origin.
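This bound can be checked with time-average estimates. The sketch below is illustrative: it assumes a hypothetical pair of correlated white-noise records and verifies |RXY(τ)| ≤ √(RXX(0) RYY(0)) at several lags:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

x = rng.standard_normal(n)
y = 0.7 * x + 0.3 * rng.standard_normal(n)   # Y correlated with X

def time_corr(a, b, lag):
    """Time-average estimate of E[A(t) B(t+lag)]."""
    return np.mean(a[:n - lag] * b[lag:])

rxx0 = time_corr(x, x, 0)
ryy0 = time_corr(y, y, 0)
bound = np.sqrt(rxx0 * ryy0)

# |R_XY(tau)| <= sqrt(R_XX(0) R_YY(0)) for every lag tried
ok = all(abs(time_corr(x, y, lag)) <= bound for lag in range(10))
print(ok)
```

At τ = 0 this is the Cauchy–Schwarz inequality applied to the sample record, so it holds exactly; at other lags it holds with a wide margin here.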
3. If X(t) and Y(t) are statistically independent, then RXY(τ) = X̄ Ȳ.
Proof: Let the two random processes X(t) and Y(t) be jointly WSS. Then RXY(τ) = E[X(t) Y(t+τ)]. Since X(t) and Y(t) are independent,
RXY(τ) = E[X(t)] E[Y(t+τ)] = X̄ Ȳ, hence proved.
Auto Covariance Function: The auto covariance function of a random process X(t) is defined as
CXX(t, t+τ) = E[(X(t) − E[X(t)])(X(t+τ) − E[X(t+τ)])] or
CXX(t, t+τ) = RXX(t, t+τ) − E[X(t)] E[X(t+τ)].
Therefore at τ = 0 the auto covariance function becomes the variance of the random process:
CXX(t, t) = E[X²(t)] − (E[X(t)])² = σX².
The autocorrelation coefficient of the random process X(t) is defined as
ρXX(t, t+τ) = CXX(t, t+τ)/√(CXX(t, t) CXX(t+τ, t+τ)); if τ = 0, ρXX(0) = 1.
Cross Covariance Function: If two random processes X(t) and Y(t) have random variables X(t) and Y(t+τ), then the cross covariance function is defined as
CXY(t, t+τ) = E[(X(t) − E[X(t)])(Y(t+τ) − E[Y(t+τ)])] or
CXY(t, t+τ) = RXY(t, t+τ) − E[X(t)] E[Y(t+τ)].
If X(t) and Y(t) are jointly WSS, then CXY(τ) = RXY(τ) − X̄ Ȳ.
The cross correlation coefficient of the random processes X(t) and Y(t) is defined as
ρXY(t, t+τ) = CXY(t, t+τ)/√(CXX(t, t) CYY(t+τ, t+τ)); if τ = 0,
ρXY(0) = CXY(0)/(σX σY).
Gaussian Random Process: Consider a continuous random process X(t). Let N random variables X1 = X(t1), X2 = X(t2), …, XN = X(tN) be defined at time instants t1, t2, …, tN respectively. If these random variables are jointly Gaussian for every N = 1, 2, … and for any choice of time instants t1, t2, …, tN, then the random process X(t) is called a Gaussian random process. The joint Gaussian density function is given as
fX(x1, …, xN; t1, …, tN) = 1/√((2π)ᴺ |CXX|) exp(−½ (x − X̄)ᵀ CXX⁻¹ (x − X̄)),
where CXX is the covariance matrix of the N random variables and X̄ is the vector of their mean values.
Poisson Random Process: A Poisson process counts the number of occurrences of an event in the interval (0, t). Its probability distribution is
P[X(t) = k] = e^(−λt) (λt)ᵏ/k!, k = 0, 1, 2, …
and its density function can be written with impulses as
fX(x) = Σ_{k=0}^{∞} P[X(t) = k] δ(x − k).
UNIT-4: RANDOM PROCESSES: SPECTRAL CHARACTERISTICS
Consider a random process X(t). Since its amplitude varies randomly with time, a sample function generally does not satisfy Dirichlet's conditions, so the Fourier transform cannot be applied directly to the random process for frequency domain analysis. Instead, the autocorrelation function of a WSS random process is used to study spectral characteristics such as the power density spectrum, or power spectral density (psd).
Power Density Spectrum: The power spectrum of a WSS random process X(t) is defined as the Fourier transform of the autocorrelation function RXX(τ) of X(t):
SXX(ω) = ∫_{−∞}^{∞} RXX(τ) e^(−jωτ) dτ
and, by the inverse transform,
RXX(τ) = (1/2π) ∫_{−∞}^{∞} SXX(ω) e^(jωτ) dω.
Therefore the power density spectrum SXX(ω) and the autocorrelation function RXX(τ) are a Fourier transform pair.
Average power of the random process: The average power PXX of a WSS random process X(t) is defined as the time average of its second order moment, i.e. of the autocorrelation function at τ = 0. Mathematically,
PXX = A{E[X²(t)]} = RXX(0),
and from the inverse Fourier transform relation, at τ = 0,
PXX = RXX(0) = (1/2π) ∫_{−∞}^{∞} SXX(ω) dω.
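In discrete time, the equality between the time-domain power and the area under the spectrum can be checked with a periodogram, a common discrete stand-in for SXX(ω) (the normalization below is a standard convention, not taken from this text):

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.standard_normal(4096)       # one record of a discrete-time process

# Periodogram: |X(w_k)|^2 / N as a discrete power-spectrum estimate
X = np.fft.fft(x)
periodogram = (np.abs(X) ** 2) / x.size

# Average power two ways: time domain vs sum over the spectrum
p_time = np.mean(x ** 2)                 # R_XX(0) estimate
p_freq = np.sum(periodogram) / x.size    # discrete analogue of the integral

print(np.isclose(p_time, p_freq))  # Parseval: both give the same power
```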
Properties of the power density spectrum: The properties of the power density spectrum SXX(ω) of a WSS random process X(t) are given below.
(1) SXX(ω) ≥ 0.
Proof: From the definition, SXX(ω) is the expected value of a non-negative quantity, and the expected value of a non-negative function is always non-negative.
(2) The power spectral density at zero frequency is equal to the area under the curve of the autocorrelation RXX(τ), i.e.
SXX(0) = ∫_{−∞}^{∞} RXX(τ) dτ.
Proof: Put ω = 0 in the definition SXX(ω) = ∫ RXX(τ) e^(−jωτ) dτ.
(3) The power density spectrum of a real process X(t) is an even function, i.e. SXX(−ω) = SXX(ω).
Proof: SXX(−ω) = ∫_{−∞}^{∞} RXX(τ) e^(jωτ) dτ. Substituting τ = −τ,
SXX(−ω) = ∫_{−∞}^{∞} RXX(−τ) e^(−jωτ) dτ.
Since X(t) is real, from the properties of autocorrelation we know that RXX(−τ) = RXX(τ).
Therefore SXX(−ω) = ∫_{−∞}^{∞} RXX(τ) e^(−jωτ) dτ = SXX(ω), hence proved.
(4) SXX(ω) is a real function.
Proof: SXX(ω) = ∫ RXX(τ)(cos ωτ − j sin ωτ) dτ. Since RXX(τ) is real and even while sin ωτ is odd in τ, the imaginary part integrates to zero; the remaining integrand RXX(τ) cos ωτ is a real function, so SXX(ω) is always real, hence proved.
(5) If SXX(ω) is the psd of the WSS random process X(t), then
(1/2π) ∫_{−∞}^{∞} SXX(ω) dω = A{E[X²(t)]} = RXX(0),
i.e. the time average of the mean square value of a WSS random process equals the area under the curve of the power spectral density (scaled by 1/2π), hence proved.
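The non-negative, even, and real properties can be observed on a periodogram estimate. The sketch below is illustrative, using a hypothetical white-noise record; the frequency-reversal index trick maps ω to −ω on the discrete FFT grid:

```python
import numpy as np

rng = np.random.default_rng(8)
x = rng.standard_normal(2048)   # real-valued record of a discrete-time process

# Periodogram as a discrete stand-in for S_XX(w)
S = (np.abs(np.fft.fft(x)) ** 2) / x.size

k_reversed = (-np.arange(x.size)) % x.size   # index of -w on the FFT grid

print(np.all(S >= 0))                        # property (1): non-negative
print(np.allclose(S, S[k_reversed]))         # property (3): even in w
print(np.all(np.isreal(S)))                  # property (4): real
```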
(6) If X(t) is a WSS random process with psd SXX(ω), then the psd of the derivative Ẋ(t) is ω² times SXX(ω), i.e. SẊẊ(ω) = ω² SXX(ω).
Proof: Differentiation corresponds to multiplication by jω in the frequency domain, i.e. ẊT(ω) = jω XT(ω), where XT(ω) is the Fourier transform of X(t) on [−T, T]. Therefore
SẊẊ(ω) = lim_{T→∞} E[|ẊT(ω)|²]/2T = lim_{T→∞} E[|jω XT(ω)|²]/2T
= ω² lim_{T→∞} E[|XT(ω)|²]/2T = ω² SXX(ω), hence proved.
Cross power density spectrum: Consider two real random processes X(t) and Y(t) which are jointly WSS. The cross power density spectrum is defined as the Fourier transform of the cross correlation function of X(t) and Y(t):
SXY(ω) = ∫_{−∞}^{∞} RXY(τ) e^(−jωτ) dτ and SYX(ω) = ∫_{−∞}^{∞} RYX(τ) e^(−jωτ) dτ.
By inverse Fourier transformation we can obtain the cross correlation functions as
RXY(τ) = (1/2π) ∫_{−∞}^{∞} SXY(ω) e^(jωτ) dω and RYX(τ) = (1/2π) ∫_{−∞}^{∞} SYX(ω) e^(jωτ) dω.
Therefore the cross psd and the cross correlation function form a Fourier transform pair.
If XT(ω) and YT(ω) are the Fourier transforms of X(t) and Y(t) respectively on the interval [−T, T], then the cross power density spectrum can also be defined as
SXY(ω) = lim_{T→∞} E[XT*(ω) YT(ω)]/2T,
and the average cross power is PXY = (1/2π) ∫_{−∞}^{∞} SXY(ω) dω.
Properties of the cross power density spectrum: The properties of the cross power density spectrum for real random processes X(t) and Y(t) are given below.
(1) SXY(ω) = SYX(−ω).
Proof: Consider the cross correlation function RXY(τ). The cross power density spectrum is
SXY(ω) = ∫_{−∞}^{∞} RXY(τ) e^(−jωτ) dτ.
Using the symmetry RXY(τ) = RYX(−τ) and substituting τ = −τ,
SXY(ω) = ∫_{−∞}^{∞} RYX(τ) e^(jωτ) dτ = SYX(−ω), hence proved.
(2) The real parts of SXY(ω) and SYX(ω) are even functions of ω, i.e. Re[SXY(ω)] and Re[SYX(ω)] are even functions.
Proof: Since e^(−jωτ) = cos ωτ − j sin ωτ,
Re[SXY(ω)] = ∫_{−∞}^{∞} RXY(τ) cos ωτ dτ.
Since cos ωτ is an even function of ω, Re[SXY(−ω)] = Re[SXY(ω)]; the same argument applies to SYX(ω), hence proved.
(3) The imaginary parts of SXY(ω) and SYX(ω) are odd functions of ω, i.e. Im[SXY(ω)] and Im[SYX(ω)] are odd functions.
Proof: Since e^(−jωτ) = cos ωτ − j sin ωτ,
Im[SXY(ω)] = −∫_{−∞}^{∞} RXY(τ) sin ωτ dτ.
Since sin ωτ is an odd function of ω, Im[SXY(−ω)] = −Im[SXY(ω)]; the same argument applies to SYX(ω), hence proved.
(5) If X(t) and Y(t) are uncorrelated and have constant mean values X̄ and Ȳ, then SXY(ω) = 2π X̄ Ȳ δ(ω).
Proof: RXY(τ) = E[X(t) Y(t+τ)]. Since the processes are uncorrelated,
RXY(τ) = E[X(t)] E[Y(t+τ)] = X̄ Ȳ.
Therefore SXY(ω) = ∫_{−∞}^{∞} X̄ Ȳ e^(−jωτ) dτ = 2π X̄ Ȳ δ(ω), hence proved.
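The symmetry SXY(ω) = SYX(−ω), which holds for real jointly WSS processes, can be observed on a discrete cross-spectrum estimate. The sketch below is illustrative, assuming a hypothetical pair of records where y is a delayed, noisy copy of x:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1024
x = rng.standard_normal(n)
y = np.roll(x, 3) + 0.1 * rng.standard_normal(n)  # y lags x by 3 samples

X, Y = np.fft.fft(x), np.fft.fft(y)
sxy = np.conj(X) * Y / n     # discrete cross power spectrum S_XY
syx = np.conj(Y) * X / n     # discrete cross power spectrum S_YX

# Property (1): S_XY(w) = S_YX(-w); frequency reversal is index -k mod n
syx_reversed = syx[(-np.arange(n)) % n]
print(np.allclose(sxy, syx_reversed))
```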