
EE4601 Communication Systems

Week 3: Random Processes, Stationarity, Means, Correlations


Random Processes

A random process, or stochastic process, $X(t)$ is an ensemble of sample functions $\{X_1(t), X_2(t), \ldots, X_N(t)\}$ together with a probability rule that assigns a probability to any meaningful event associated with the observation of these sample functions. Suppose that the sample function $X_i(t)$ corresponds to the sample point $s_i$ in the sample space $S$ and occurs with probability $P_i$.

The number of sample functions, $N$, may be finite or infinite. Sample functions may be defined at discrete or continuous time instants; this defines discrete- or continuous-time random processes. Sample function values may take on discrete or continuous values; this defines discrete- or continuous-parameter random processes.


Random Processes
[Figure: each sample point $s_1, s_2, \ldots, s_N$ in the sample space $S$ maps to a sample function $X_1(t), X_2(t), \ldots, X_N(t)$]


Random Processes vs. Random Variables

What is the difference between a random variable and a random process? For a random variable, the outcome of a random experiment is mapped onto a variable, e.g., a number. For a random process, the outcome of a random experiment is mapped onto a waveform that is a function of time. Suppose that we observe a random process $X(t)$ at some time $t_1$ to generate the observation $X(t_1)$, and that the number of possible sample functions or waveforms, $N$, is finite. If $X_i(t_1)$ is observed with probability $P_i$, then the collection of numbers $\{X_i(t_1)\}$, $i = 1, 2, \ldots, N$, forms a random variable, denoted by $X(t_1)$, having the probability distribution $P_i$, $i = 1, 2, \ldots, N$.


Random Processes
The collection of $n$ random variables, $X(t_1), \ldots, X(t_n)$, has the joint cdf
$$F_{X(t_1),\ldots,X(t_n)}(x_1, \ldots, x_n) = \Pr(X(t_1) \le x_1, \ldots, X(t_n) \le x_n).$$

A more compact notation can be obtained by defining the vectors
$$\mathbf{x} = (x_1, x_2, \ldots, x_n)^T, \qquad \mathbf{X}(t) = (X(t_1), X(t_2), \ldots, X(t_n))^T.$$

Then the joint cdf and joint pdf of $\mathbf{X}(t)$ are, respectively,
$$F_{\mathbf{X}(t)}(\mathbf{x}) = \Pr(\mathbf{X}(t) \le \mathbf{x}), \qquad p_{\mathbf{X}(t)}(\mathbf{x}) = \frac{\partial^n F_{\mathbf{X}(t)}(\mathbf{x})}{\partial x_1\, \partial x_2 \cdots \partial x_n}.$$

A random process is strictly stationary if and only if the equality
$$p_{\mathbf{X}(t)}(\mathbf{x}) = p_{\mathbf{X}(t+\tau)}(\mathbf{x})$$

holds for all sets of time instants $\{t_1, t_2, \ldots, t_n\}$ and all time shifts $\tau$.


Ensemble and Time Averages


For a random process, we define the following two operators:
$E[\,\cdot\,]$ = ensemble average, $\langle\,\cdot\,\rangle$ = time average.

The ensemble mean or ensemble average of a random process $X(t)$ at time $t$ is
$$\mu_X(t) \triangleq E[X(t)] = \int_{-\infty}^{\infty} x\, p_{X(t)}(x)\, dx.$$

The time average mean or time average of a random process $X(t)$ is
$$\langle X(t) \rangle = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} X(t)\, dt.$$

In general, the time average mean $\langle X(t) \rangle$ is also a random variable, because it depends on the particular sample function that is observed for time averaging.
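
To make the distinction concrete, here is a minimal Python sketch (not from the lecture; the toy process $X(t) = A$, with amplitude $A$ drawn uniformly from $[-1, 1]$ once per sample function, is an assumption chosen for illustration). It estimates the ensemble mean by averaging across many sample functions at a fixed $t_1$, then computes the time average of a single sample function, which comes out random:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy process (an illustrative assumption): X(t) = A for all t, where the
# amplitude A ~ Uniform(-1, 1) is drawn once per sample function.
def sample_function(rng):
    A = rng.uniform(-1.0, 1.0)
    return lambda t: A * np.ones_like(t)

# Ensemble mean at a fixed time t1: average across many sample functions.
t1 = 3.7
values_at_t1 = [sample_function(rng)(np.array([t1]))[0] for _ in range(100_000)]
print("E[X(t1)] ~", np.mean(values_at_t1))   # close to 0

# Time average of ONE sample function: here it equals that function's A,
# so <X(t)> is itself a random variable.
t = np.linspace(-50.0, 50.0, 10_001)
X = sample_function(rng)
print("<X(t)>   =", X(t).mean())             # some value in (-1, 1)
```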


Example
Consider the random process shown below, consisting of three constant sample functions:

$X_1(t) = a$, with probability $P_1 = 1/4$
$X_2(t) = 0$, with probability $P_2 = 1/2$
$X_3(t) = -a$, with probability $P_3 = 1/4$


Example
The ensemble mean is
$$E[X(t)] = X_1(t)P_1 + X_2(t)P_2 + X_3(t)P_3 = a \cdot \tfrac{1}{4} + 0 \cdot \tfrac{1}{2} + (-a) \cdot \tfrac{1}{4} = 0.$$

The time average mean is
$$\langle X(t) \rangle = \begin{cases} a & \text{with probability } 1/4 \\ 0 & \text{with probability } 1/2 \\ -a & \text{with probability } 1/4 \end{cases}$$

Note that $\langle X(t) \rangle$ is a random variable (since it depends on the sample function that is chosen for time averaging), while $E[X(t)]$ is just a number (which in the above example is not a function of time $t$, but in general may be a function of the time variable $t$).
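
A short numerical check of this example (a sketch, not part of the original slides; the amplitude $a = 2$ is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(1)
a = 2.0  # arbitrary amplitude for the check

# The three constant sample functions and their probabilities (from the slide).
levels = np.array([a, 0.0, -a])
probs = np.array([0.25, 0.50, 0.25])

# Ensemble mean at any time t: sum_i X_i(t) * P_i.
print("E[X(t)] =", np.dot(levels, probs))  # exactly 0

# Time-averaging a constant sample function returns its level, so <X(t)>
# is a discrete random variable with the same distribution as the levels.
draws = rng.choice(levels, size=100_000, p=probs)
for lvl in levels:
    print(f"Pr(<X(t)> = {lvl:+.1f}) ~ {np.mean(draws == lvl):.3f}")
```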


Moments and Correlations


$E[\,\cdot\,]$ denotes the ensemble average operator.

[Ensemble] Mean: $\mu_X(t_1) = E[X(t_1)] = \int_{-\infty}^{\infty} x\, f_{X(t_1)}(x)\, dx$

[Ensemble] Variance: $\sigma_X^2(t_1) = E[(X(t_1) - \mu_X(t_1))^2] = \int_{-\infty}^{\infty} (x - \mu_X)^2 f_{X(t_1)}(x)\, dx$

[Ensemble] Autocorrelation: $\phi_{XX}(t_1, t_2) = E[X(t_1)X(t_2)]$

[Ensemble] Autocovariance: $\mu_{XX}(t_1, t_2) = E[(X(t_1) - \mu_X(t_1))(X(t_2) - \mu_X(t_2))] = \phi_{XX}(t_1, t_2) - \mu_X(t_1)\mu_X(t_2)$

If $X(t)$ has zero mean, then $\mu_{XX}(t_1, t_2) = \phi_{XX}(t_1, t_2)$.
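
As a numerical illustration of these definitions, consider the following sketch. The process $X(t) = Z_0 + Z_1 t$, with $Z_0, Z_1$ independent standard Gaussian, is an assumption chosen because its moments have a simple closed form, $\mu_X(t) = 0$ and $\phi_{XX}(t_1, t_2) = 1 + t_1 t_2$, and because its autocorrelation visibly depends on both observation times:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative process (an assumption): X(t) = Z0 + Z1*t, Z0, Z1 ~ N(0,1) iid.
# Closed form: mu_X(t) = 0 and phi_XX(t1, t2) = E[X(t1)X(t2)] = 1 + t1*t2.
n = 1_000_000
Z0 = rng.standard_normal(n)
Z1 = rng.standard_normal(n)

t1, t2 = 0.5, 2.0
X1 = Z0 + Z1 * t1
X2 = Z0 + Z1 * t2

mean1 = X1.mean()
autocorr = np.mean(X1 * X2)             # estimates phi_XX(t1, t2) = 2.0
autocov = autocorr - mean1 * X2.mean()  # estimates mu_XX(t1, t2)

print("mean     ~", mean1)     # ~ 0
print("autocorr ~", autocorr)  # ~ 1 + t1*t2 = 2.0
print("autocov  ~", autocov)   # ~ autocorr, since the mean is zero
```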


Example
Consider the random process
$$X(t) = A\cos(2\pi f_c t + \Theta),$$

where $A$ and $f_c$ are constants, and the phase $\Theta$ is assumed to be a uniformly distributed random variable with pdf
$$f_\Theta(\theta) = \begin{cases} 1/(2\pi) & 0 \le \theta \le 2\pi \\ 0 & \text{elsewhere.} \end{cases}$$

The ensemble mean of $X(t_1)$ is obtained by averaging over the pdf of $\Theta$:
$$\mu_X(t_1) = E[A\cos(2\pi f_c t_1 + \Theta)] = \frac{A}{2\pi}\int_0^{2\pi} \cos(2\pi f_c t_1 + \theta)\, d\theta = \frac{A}{2\pi}\Big[\sin(2\pi f_c t_1 + \theta)\Big]_0^{2\pi} = 0.$$
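
A quick Monte Carlo confirmation of this zero-mean result (a sketch; the constants $A$, $f_c$, and $t_1$ are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
A, fc, t1 = 1.0, 5.0, 0.123  # arbitrary constants for the check

# Average A*cos(2*pi*fc*t1 + Theta) over many draws of Theta ~ U(0, 2*pi).
theta = rng.uniform(0.0, 2.0 * np.pi, size=1_000_000)
print(np.mean(A * np.cos(2.0 * np.pi * fc * t1 + theta)))  # ~ 0
```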


Example (cont'd)

The autocorrelation of $X(t) = A\cos(2\pi f_c t + \Theta)$ is
$$\phi_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = E[A^2\cos(2\pi f_c t_1 + \Theta)\cos(2\pi f_c t_2 + \Theta)] = \frac{A^2}{2}E[\cos(2\pi f_c t_1 + 2\pi f_c t_2 + 2\Theta)] + \frac{A^2}{2}E[\cos(2\pi f_c(t_1 - t_2))].$$

But
$$E[\cos(2\pi f_c t_1 + 2\pi f_c t_2 + 2\Theta)] = \frac{1}{2\pi}\int_0^{2\pi}\cos(2\pi f_c t_1 + 2\pi f_c t_2 + 2\theta)\, d\theta = \frac{1}{4\pi}\Big[\sin(2\pi f_c t_1 + 2\pi f_c t_2 + 2\theta)\Big]_0^{2\pi} = 0.$$


Example (cont'd)
Also,
$$E[\cos(2\pi f_c(t_1 - t_2))] = \cos(2\pi f_c(t_1 - t_2)).$$

Hence,
$$\phi_{XX}(t_1, t_2) = \frac{A^2}{2}\cos(2\pi f_c(t_1 - t_2)) = \frac{A^2}{2}\cos(2\pi f_c \tau), \qquad \tau = t_1 - t_2.$$

The autocovariance of $X(t)$ is
$$\mu_{XX}(t_1, t_2) = \phi_{XX}(t_1, t_2) - \mu_X(t_1)\mu_X(t_2) = \phi_{XX}(\tau),$$
since $\mu_X(t) = 0$.
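
The same Monte Carlo approach confirms the autocorrelation result (again a sketch with arbitrary constants); note that $X(t_1)$ and $X(t_2)$ must be computed from the same sample function, i.e., they share the same draw of $\Theta$:

```python
import numpy as np

rng = np.random.default_rng(4)
A, fc = 1.0, 5.0
t1, t2 = 0.40, 0.33
tau = t1 - t2

# One draw of Theta per realization; X(t1) and X(t2) share the same Theta.
theta = rng.uniform(0.0, 2.0 * np.pi, size=1_000_000)
X1 = A * np.cos(2.0 * np.pi * fc * t1 + theta)
X2 = A * np.cos(2.0 * np.pi * fc * t2 + theta)

print("Monte Carlo:", np.mean(X1 * X2))
print("Analytic:   ", (A**2 / 2.0) * np.cos(2.0 * np.pi * fc * tau))
```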


Wide Sense Stationary


A wide sense stationary random process $X(t)$ has the properties
$$\mu_X(t) = \mu_X, \ \text{a constant}, \qquad \phi_{XX}(t_1, t_2) = \phi_{XX}(\tau), \quad \tau = t_2 - t_1.$$

That is, the autocorrelation function depends only on the time difference $\tau$.

If a random process is strictly stationary, then it is wide sense stationary; the converse is not true:
$$\text{strictly stationary} \;\Longrightarrow\; \text{wide sense stationary}.$$

For a Gaussian random process only:
$$\text{strictly stationary} \;\Longleftrightarrow\; \text{wide sense stationary}.$$

The previous example is a wide sense stationary random process.


Some Properties of $\phi_{XX}(\tau)$

The autocorrelation function $\phi_{XX}(\tau)$ of a wide sense stationary random process $X(t)$ satisfies the following properties.

1. $\phi_{XX}(0) = E[X^2(t)]$: the total power (ac + dc).
2. $\phi_{XX}(\tau) = \phi_{XX}(-\tau)$: an even function.
3. $|\phi_{XX}(\tau)| \le \phi_{XX}(0)$: a variant of the Cauchy-Schwarz inequality (proof on the next slide).
4. $\phi_{XX}(\infty) = E^2[X(t)] = \mu_X^2$: the dc power, provided that $X(t)$ has no periodic components.
5. If $p_{X(t)}(x) = p_{X(t+T)}(x)$, i.e., the pdf of $X(t)$ is periodic in $t$ with period $T$, then $\phi_{XX}(\tau) = \phi_{XX}(\tau + T)$; that is, $\phi_{XX}(\tau)$ is periodic in $\tau$ with period $T$. Such a random process is said to be periodic wide sense stationary, or cyclostationary. Digitally modulated waveforms are cyclostationary random processes.
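
Properties 1-3 are easy to check numerically for the random-phase cosine example, whose autocorrelation is $\phi_{XX}(\tau) = (A^2/2)\cos(2\pi f_c \tau)$; the sketch below skips properties 4 and 5, since that process has a periodic component:

```python
import numpy as np

A, fc = 1.0, 5.0
tau = np.linspace(-1.0, 1.0, 2001)  # symmetric grid with tau = 0 at the center
R = (A**2 / 2.0) * np.cos(2.0 * np.pi * fc * tau)
R0 = R[len(R) // 2]                 # phi_XX(0)

print("property 1:", np.isclose(R0, A**2 / 2.0))      # R(0) = E[X^2(t)] = A^2/2
print("property 2:", np.allclose(R, R[::-1]))          # even: R(tau) = R(-tau)
print("property 3:", np.all(np.abs(R) <= R0 + 1e-12))  # |R(tau)| <= R(0)
```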


Some Properties of $\phi_{XX}(\tau)$

The inequality $|\phi_{XX}(\tau)| \le \phi_{XX}(0)$ can be established through the following steps:
$$0 \le E[(X(t+\tau) - X(t))^2] = E[X^2(t) + X^2(t+\tau) - 2X(t+\tau)X(t)] = E[X^2(t)] + E[X^2(t+\tau)] - 2E[X(t+\tau)X(t)] = 2E[X^2(t)] - 2E[X(t+\tau)X(t)] = 2\phi_{XX}(0) - 2\phi_{XX}(\tau).$$

Therefore $\phi_{XX}(\tau) \le \phi_{XX}(0)$. Repeating the argument with $E[(X(t+\tau) + X(t))^2] \ge 0$ gives $-\phi_{XX}(\tau) \le \phi_{XX}(0)$, and hence
$$|\phi_{XX}(\tau)| \le \phi_{XX}(0).$$

© 2011-2013, Georgia Institute of Technology
