EE4601 Communication Systems: Week 3 Random Processes, Stationarity, Means, Correlations
Random Processes
A random process or stochastic process, X(t), is an ensemble of sample functions {X_1(t), X_2(t), ..., X_M(t)} together with a probability rule that assigns a probability to any meaningful event associated with the observation of these sample functions. Suppose the sample function X_i(t) corresponds to the sample point s_i in the sample space S and occurs with probability P_i.
M may be finite or infinite. Sample functions may be defined at discrete or continuous time instants; this defines discrete- or continuous-time random processes. Sample function values may take on discrete or continuous values.
Random Processes
[Figure: sample space S with sample points s_1, s_2, ..., s_M, each mapped to a sample function X_1(t), X_2(t), ..., X_M(t)]
What is the difference between a random variable and a random process? For a random variable, the outcome of a random experiment is mapped onto a variable, e.g., a number. For a random process, the outcome of a random experiment is mapped onto a waveform that is a function of time. Suppose that we observe a random process X(t) at some time t_1 to generate the observation X(t_1), and that the number of possible sample functions or waveforms, M, is finite. If X_i(t_1) is observed with probability P_i, then the collection of numbers {X_i(t_1)}, i = 1, 2, ..., M, forms a random variable, denoted by X(t_1), having the probability distribution P_i, i = 1, 2, ..., M.
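As a quick numerical sketch (not part of the notes), the following Python/NumPy snippet draws one of a small ensemble of hypothetical sample functions according to assumed probabilities P_i and evaluates it at a fixed time t_1; the empirical mean of the resulting random variable X(t_1) approaches Σ_i X_i(t_1) P_i. The sample functions and probabilities here are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical finite ensemble: three sample functions with probabilities P_i.
# Observing the process at a fixed time t_1 yields a random variable X(t_1).
sample_functions = [lambda t: np.sin(t), lambda t: np.cos(t), lambda t: 0.0 * t]
probs = [0.25, 0.5, 0.25]

t1 = 1.0
# Draw which sample function occurs on each trial, then evaluate it at t_1.
draws = rng.choice(3, size=100_000, p=probs)
observations = np.array([sample_functions[i](t1) for i in draws])

# The empirical mean approximates E[X(t_1)] = sum_i X_i(t_1) * P_i.
expected = sum(p * f(t1) for f, p in zip(sample_functions, probs))
print(abs(observations.mean() - expected) < 0.01)  # True
```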
Random Processes
The collection of n random variables, X(t_1), ..., X(t_n), has the joint cdf

F_{X(t_1),...,X(t_n)}(x_1, ..., x_n) = Pr(X(t_1) ≤ x_1, ..., X(t_n) ≤ x_n).

A more compact notation can be obtained by defining the vectors

x = (x_1, x_2, ..., x_n)^T
X(t) = (X(t_1), X(t_2), ..., X(t_n))^T

Then the joint cdf and joint pdf of X(t) are, respectively,

F_X(t)(x) = Pr(X(t) ≤ x)
p_X(t)(x) = ∂ⁿ F_X(t)(x) / (∂x_1 ∂x_2 ··· ∂x_n)

A random process is strictly stationary if and only if the equality

p_X(t)(x) = p_X(t+τ)(x)

holds for all sets of time instants {t_1, t_2, ..., t_n} and all time shifts τ.
The ensemble mean or ensemble average of a random process X(t) at time t is

μ_X(t) = E[X(t)] = ∫_{−∞}^{∞} x p_X(t)(x) dx

The time average mean or time average of a random process X(t) is

⟨X(t)⟩ = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} X(t) dt
In general, the time average mean ⟨X(t)⟩ is also a random variable, because it depends on the particular sample function that is observed for time averaging.
Example
X_1(t) = a,    P_1 = 1/4
X_2(t) = 0,    P_2 = 1/2
X_3(t) = −a,   P_3 = 1/4
Example
E[X(t)] = X_1(t)P_1 + X_2(t)P_2 + X_3(t)P_3 = a · 1/4 + 0 · 1/2 + (−a) · 1/4 = 0
Note that ⟨X(t)⟩ is a random variable (since it depends on the sample function that is chosen for time averaging), while E[X(t)] is just a number (which in the above example is not a function of time t, but in general may be a function of t).
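A minimal numerical sketch of this example (amplitude value assumed for illustration): the ensemble mean is a single number, while the time average of each constant sample function is the constant itself, so ⟨X(t)⟩ takes the values a, 0, −a with probabilities P_i.

```python
import numpy as np

# The three constant sample functions and their probabilities from the example.
a = 2.0  # arbitrary amplitude chosen for illustration
X = np.array([a, 0.0, -a])
P = np.array([0.25, 0.5, 0.25])

# Ensemble mean: a single number (here independent of t).
ensemble_mean = np.sum(X * P)
print(ensemble_mean)  # 0.0

# The time average of a constant sample function X_i(t) = c is c itself,
# so <X(t)> is a random variable taking values a, 0, -a with probs P_i.
time_averages = X
print(time_averages)  # [ 2.  0. -2.]
```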
[Ensemble] Autocorrelation:

φ_XX(t_1, t_2) = E[X(t_1)X(t_2)]

[Ensemble] Autocovariance:

μ_XX(t_1, t_2) = E[(X(t_1) − μ_X(t_1))(X(t_2) − μ_X(t_2))]
             = φ_XX(t_1, t_2) − μ_X(t_1)μ_X(t_2)

If X(t) has zero mean, then μ_XX(t_1, t_2) = φ_XX(t_1, t_2).
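The identity μ_XX(t_1, t_2) = φ_XX(t_1, t_2) − μ_X(t_1)μ_X(t_2) can be checked empirically. The sketch below uses a hypothetical nonzero-mean process X(t) = W + tV (W, V random, chosen only for illustration) and verifies that the sample autocovariance equals the sample autocorrelation minus the product of the sample means.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical process X(t) = W + t*V with random W, V (illustrative choice;
# the nonzero mean makes autocorrelation and autocovariance differ).
n = 1_000_000
W = rng.normal(1.0, 1.0, n)
V = rng.normal(0.5, 2.0, n)

def X(t):
    return W + t * V

t1, t2 = 0.7, 1.3
phi = np.mean(X(t1) * X(t2))                    # autocorrelation estimate
mu1, mu2 = np.mean(X(t1)), np.mean(X(t2))
cov = np.mean((X(t1) - mu1) * (X(t2) - mu2))    # autocovariance estimate

# Sample-moment identity: cov = phi - mu1*mu2 (exact up to floating point).
print(abs(cov - (phi - mu1 * mu2)) < 1e-8)  # True
```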
Example
X(t) = A cos(2πf_c t + θ)

where A and f_c are constants. The phase θ is assumed to be a uniformly distributed random variable with pdf

f_θ(θ) = 1/(2π),  0 ≤ θ ≤ 2π
       = 0,       elsewhere
The ensemble mean of X(t_1) is obtained by averaging over the pdf of θ:

μ_X(t_1) = E_θ[A cos(2πf_c t_1 + θ)]
         = (A/(2π)) ∫_0^{2π} cos(2πf_c t_1 + θ) dθ
         = (A/(2π)) [sin(2πf_c t_1 + θ)]_0^{2π}
         = 0
Example (cont'd)
The autocorrelation of X(t) = A cos(2πf_c t + θ) is

φ_XX(t_1, t_2) = E[X(t_1)X(t_2)]
             = E[A² cos(2πf_c t_1 + θ) cos(2πf_c t_2 + θ)]
             = (A²/2) E[cos(2πf_c t_1 + 2πf_c t_2 + 2θ)] + (A²/2) E[cos(2πf_c(t_1 − t_2))]

But

E[cos(2πf_c t_1 + 2πf_c t_2 + 2θ)] = (1/(2π)) ∫_0^{2π} cos(2πf_c t_1 + 2πf_c t_2 + 2θ) dθ
                                   = (1/(4π)) [sin(2πf_c t_1 + 2πf_c t_2 + 2θ)]_0^{2π}
                                   = 0
Example (cont'd)
Also,

E[cos(2πf_c(t_1 − t_2))] = cos(2πf_c(t_1 − t_2))

Hence,

φ_XX(t_1, t_2) = (A²/2) cos(2πf_c(t_1 − t_2))
             = (A²/2) cos(2πf_c τ),   τ = t_1 − t_2

The autocovariance of X(t) is

μ_XX(t_1, t_2) = φ_XX(t_1, t_2) − μ_X(t_1)μ_X(t_2) = φ_XX(τ)

since μ_X(t) = 0.
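The closed form φ_XX(t_1, t_2) = (A²/2) cos(2πf_c τ) can be verified by Monte Carlo: average X(t_1)X(t_2) over many random phases θ and compare with the formula. Parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo check: for X(t) = A*cos(2*pi*fc*t + theta), theta ~ U[0, 2*pi),
# the autocorrelation should equal (A**2 / 2) * cos(2*pi*fc*tau).
A, fc = 1.5, 10.0          # illustrative amplitude and frequency
t1, t2 = 0.3, 0.1
tau = t1 - t2

theta = rng.uniform(0.0, 2.0 * np.pi, size=1_000_000)
x1 = A * np.cos(2 * np.pi * fc * t1 + theta)
x2 = A * np.cos(2 * np.pi * fc * t2 + theta)

phi_est = np.mean(x1 * x2)                        # ensemble average over theta
phi_theory = (A**2 / 2) * np.cos(2 * np.pi * fc * tau)
print(abs(phi_est - phi_theory) < 1e-2)  # True
```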
The autocorrelation function depends only on the time difference τ. A random process whose ensemble mean is constant and whose autocorrelation depends only on τ = t_1 − t_2 is wide sense stationary. If a random process is strictly stationary, then it is wide sense stationary; the converse is not true:

strictly stationary ⇒ wide sense stationary

For a Gaussian random process only,

strictly stationary ⇔ wide sense stationary

The previous example is a wide sense stationary random process.
Some Properties of φ_XX(τ)
The autocorrelation function, φ_XX(τ), of a wide sense stationary random process X(t) satisfies the following properties.

1. φ_XX(0) = E[X²(t)]: total power (ac + dc).
2. φ_XX(τ) = φ_XX(−τ): even function.
3. |φ_XX(τ)| ≤ φ_XX(0): a variant of the Cauchy–Schwarz inequality. Proof on next slide.
4. φ_XX(∞) = E²[X(t)] = μ_X²: dc power, if X(t) has no periodic components.
5. If p_X(t)(x) = p_X(t+T)(x), i.e., the pdf of X(t) is periodic in t with period T, then φ_XX(τ) = φ_XX(τ + T). In other words, if p_X(t)(x) is periodic in t with period T, then φ_XX(τ) is periodic in τ with period T. Such a random process is said to be periodic wide sense stationary or cyclostationary. Digitally modulated waveforms are cyclostationary random processes.
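Properties 1–3 can be checked numerically for the random-phase sinusoid of the earlier example, whose autocorrelation has the closed form φ_XX(τ) = (A²/2) cos(2πf_c τ). The parameter values below are illustrative.

```python
import numpy as np

# Check properties 1-3 for the autocorrelation of the random-phase sinusoid,
# phi(tau) = (A**2 / 2) * cos(2*pi*fc*tau). Values are illustrative.
A, fc = 2.0, 5.0

def phi(tau):
    return (A**2 / 2) * np.cos(2 * np.pi * fc * tau)

taus = np.linspace(-1.0, 1.0, 2001)

# 1. phi(0) = E[X^2(t)] = A^2 / 2: total power of the sinusoid.
assert np.isclose(phi(0.0), A**2 / 2)
# 2. Even function: phi(tau) = phi(-tau).
assert np.allclose(phi(taus), phi(-taus))
# 3. |phi(tau)| <= phi(0).
assert np.all(np.abs(phi(taus)) <= phi(0.0) + 1e-12)
print("all properties hold")
```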
Some Properties of φ_XX(τ)
The inequality |φ_XX(τ)| ≤ φ_XX(0) can be established through the following steps:

0 ≤ E[(X(t + τ) ± X(t))²]
  = E[X²(t + τ) + X²(t) ± 2X(t + τ)X(t)]
  = E[X²(t + τ)] + E[X²(t)] ± 2E[X(t + τ)X(t)]
  = 2φ_XX(0) ± 2φ_XX(τ)

Therefore ∓φ_XX(τ) ≤ φ_XX(0), and combining the two sign choices gives

|φ_XX(τ)| ≤ φ_XX(0).