Chapter 3 - Section3 - 1
Consider a random binary signal, X(t), where on the time interval (n-1)Tb < t < nTb, the
amplitude of the signal is A if the nth bit (bn) is 1 and –A if bn is 0. On the nth signaling interval,
P(bn = 1) = P(bn = 0) = 0.5. Different realizations of the signal, X(t, ζi), where ζi denotes the
ith realization of the signal, are shown in Figure 3.1.
Figure 3.1 Five realizations of statistically identical binary waveforms with the average of the 5
realizations.
The signal Xavg(t) in Figure 3.1 is the average of the five realizations of the random binary
signal. As we generate more realizations and average across them, the
signal Xavg(t) would tend towards 0. The statistical average of X(t) is 0 for all t, since at any time
t, it has a 50% chance of being A and a 50% chance of being –A.
The waveforms in Figure 3.1 were generated using Matlab's random number generator, which
was used to produce ten random bits; those bits were then used to produce a signal. Each
waveform produced, X(t, ζi), is called a sample function, where ζi is a member of a sample space S
and each element of S has a sample function associated with it. The totality of all sample functions is called the
ensemble. In Figure 3.1, we show five sample functions, but there are 2^10 = 1024 possible signals
in the ensemble (unless we introduce a random delay into the mix, in which case there are an infinite
number of possible sample functions).
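The averaging behaviour described above is easy to reproduce numerically. The sketch below uses Python with NumPy rather than the Matlab used for Figure 3.1; the amplitude, bit duration and sample counts are assumed illustrative values, not taken from the text. It generates many realizations of the binary waveform and averages them across the ensemble:

```python
import numpy as np

rng = np.random.default_rng(0)

A = 1.0                   # assumed amplitude
n_bits, fs = 10, 100      # 10 bits per realization, 100 samples per bit
n_real = 1000             # number of realizations to average

# Each row is one realization X(t, zeta_i): bits are i.i.d. equiprobable,
# mapped 1 -> +A and 0 -> -A, and held constant over each bit interval.
bits = rng.integers(0, 2, size=(n_real, n_bits))
X = np.repeat(np.where(bits == 1, A, -A), fs, axis=1)

X_avg = X.mean(axis=0)    # ensemble average at each time instant
print(np.abs(X_avg).max())
```

With only five realizations the average still shows visible fluctuations, as in Figure 3.1; increasing `n_real` drives Xavg(t) towards the statistical mean of 0.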
3.1.2 Descriptions of Random Processes in Terms of Joint PDFs
Consider the random signal X(t, ζ) sampled at time t1. Let us denote this random variable as X1.
We can describe this sample with the pdf fX1(x1; t1) as

fX1(x1; t1) dx1 = P(x1 < X(t1) ≤ x1 + dx1).    (3.1)

Similarly, if X2 = X(t2, ζ), then we can describe X(t, ζ) by the joint pdf fX1X2(x1, x2; t1, t2), which
can be interpreted as:

fX1X2(x1, x2; t1, t2) dx1 dx2 = P(x1 < X(t1) ≤ x1 + dx1, x2 < X(t2) ≤ x2 + dx2).    (3.2)
Describing a signal by a number of samples using joint pdfs to describe the random signal can
become mathematically intractable if the number of samples is high or the duration of the signal
is long.
3.1.3 Stationarity
In some cases, the joint statistics of samples at times t1, t2, t3, etc., depend only on the time
differences t3 − t1, t3 − t2, t2 − t1, etc., so that the choice of time origin of the process is immaterial
(the statistics of the process at time t are independent of t). Such random processes are called
statistically stationary in the strict sense, or simply stationary.
For stationary processes, means and variances are independent of time and the covariance
between the process sampled at time t1 and the sample at time t2 depends only on t2 − t1 (or t1 − t2).
Although for a stationary process all moments E[Xⁿ(t)] are independent of time, we are particularly
interested in processes that are wide sense stationary (WSS). A WSS process has the following
properties:
1) E[X(t)] = μX = constant,
2) E[X²(t)] − (E[X(t)])² = σX² = constant,
3) E[X(t1)X(t2)] = RX(t2 − t1), a function of t2 − t1 only.
A process that is stationary in the strict sense is wide sense stationary, while wide sense
stationarity does not necessarily imply stationarity in the strict sense.
The mean of a signal can be computed using a time average. If X(t) is a signal, its time-averaged
mean is

⟨X(t)⟩ = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} X(t) dt,    (3.3)

whereas its statistical mean is taken across the ensemble and is given by E[X(t)]. A random
process is mean-ergodic if the time-averaged mean and the statistical mean are equal, i.e.
⟨X(t)⟩ = E[X(t)] = μX. The same applies to all moments and to the time-averaged autocorrelation
function. In other words, a random process is ergodic in autocorrelation if

⟨X(t)X(t+τ)⟩ = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} X(t)X(t+τ) dt = E[X(t)X(t+τ)] = RX(τ).    (3.4)

For a process that is ergodic, E[X²(t)] would be equal to the time average of X²(t), which is given
by:

⟨X²(t)⟩ = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} X²(t) dt.    (3.5)
In the case of ergodic random signals, these statistical averages can be measured in the sense that
they can be replaced by the corresponding time averages, and a finite-time approximation to these
time averages can be measured in the laboratory.
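As a numerical illustration of mean-ergodicity (all parameter values below are assumed, not from the text), the following Python sketch checks that the time average of one long realization of the random binary signal of Figure 3.1 approaches the ensemble mean E[X(t)] = 0:

```python
import numpy as np

rng = np.random.default_rng(1)

A, n_bits, fs = 1.0, 20000, 10   # assumed amplitude, bit count, samples per bit
bits = rng.integers(0, 2, size=n_bits)
x = np.repeat(np.where(bits == 1, A, -A), fs)   # one long sample function

time_avg = x.mean()   # finite-time approximation to the time-averaged mean
print(time_avg)       # should be close to the statistical mean E[X(t)] = 0
```

The longer the realization, the closer this finite-time average gets to the statistical mean, which is exactly how such averages are measured in the laboratory.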
The statistical autocorrelation function of a random process X(t) is defined as

RX(t1, t2) = E[X(t1)X(t2)].    (3.6)
We saw in subsection 3.1.3 that a stationary or a wide sense stationary process has an
autocorrelation function that depends only on the time difference between t1 and t2. Therefore
for WSS processes, we can express (3.6) as
RX(τ) = E[X(t1)X(t2)] = E[X(t)X(t+τ)],    (3.7)

where τ = t2 − t1 and t1 and t2 can take on any value. If (3.6) depends on t1 and t2 in ways
other than through t2 − t1, then the process is not stationary and therefore cannot be ergodic, since this
would imply that the statistics of X(t) are time-varying while time averages are independent of time.
Example 3.1
Let X(t) = A cos(2πf0t + Θ), where Θ is a random variable uniformly distributed on (0, 2π). Find
(a) the time-averaged mean of X(t), (b) its statistical mean, (c) its time-averaged mean square
value, (d) its statistical mean square value and (e) its autocorrelation function.
Solution
(a) The time-averaged mean is
⟨X(t)⟩ = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} A cos(2πf0t + Θ) dt = 0,
since the integral of a sinusoid over each full period is zero.
(b) The statistical mean is
E[X(t)] = ∫_0^{2π} A cos(2πf0t + θ) (1/(2π)) dθ = (A/(2π)) [sin(2πf0t + θ)]_0^{2π} = 0.
(c) The time-averaged mean square value is
⟨X²(t)⟩ = lim_{T→∞} (1/(2T)) ∫_{−T}^{T} A² cos²(2πf0t + Θ) dt
= lim_{T→∞} (A²/(4T)) ∫_{−T}^{T} [1 + cos(4πf0t + 2Θ)] dt = A²/2.
(d) By the same calculation, averaging over Θ instead of time, E[X²(t)] = A²/2.
(e) The autocorrelation function is
RX(τ) = E[X(t)X(t+τ)] = E[A² cos(2πf0t + Θ) cos(2πf0(t+τ) + Θ)]
= (A²/2)E[cos(2πf0τ)] + (A²/2)E[cos(2πf0(2t+τ) + 2Θ)]
= (A²/2) cos(2πf0τ),
since the second expectation, taken over Θ uniform on (0, 2π), is zero. The time averages equal
the corresponding statistical averages, so X(t) is ergodic in the mean and in the mean square, and
RX(τ) depends only on τ, consistent with a WSS process.
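Assuming, as the surviving cosine terms in the solution suggest, that Example 3.1 concerns the random-phase sinusoid X(t) = A cos(2πf0t + Θ) with Θ uniform on (0, 2π), the result RX(τ) = (A²/2)cos(2πf0τ) can be checked by Monte-Carlo averaging over the ensemble. The values of A, f0, t and τ below are arbitrary assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

A, f0 = 2.0, 5.0                      # assumed amplitude and frequency
n_real = 200000                       # ensemble size for the Monte-Carlo average
theta = rng.uniform(0, 2 * np.pi, n_real)

t, tau = 0.123, 0.04                  # arbitrary sample time and lag
x1 = A * np.cos(2 * np.pi * f0 * t + theta)
x2 = A * np.cos(2 * np.pi * f0 * (t + tau) + theta)

R_est = (x1 * x2).mean()              # estimate of E[X(t)X(t+tau)]
R_theory = (A**2 / 2) * np.cos(2 * np.pi * f0 * tau)
print(R_est, R_theory)
```

Changing `t` while keeping `tau` fixed leaves the estimate unchanged (up to Monte-Carlo noise), which is exactly the WSS property.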
Exercise 3.1
The random binary signal of Figure 3.1, with a random delay added, can be written as

X(t) = Σn An p(t − nT − tD),    (3.8)

where p(t) is the rectangular pulse of duration T (p(t) = 1 for 0 ≤ t < T and p(t) = 0 otherwise), bn is
the bit (0 or 1) to be transmitted in the nth signaling
interval, tD is a delay that is uniformly distributed on the interval 0 < tD < T, and

An = { A,   bn = 1
     { −A,  bn = 0.    (3.9)

We assume that P(bn = 1) = P(bn = 0) = 0.5 and that bn and bm are independent if n ≠ m.
We wish to find the autocorrelation function of X(t) given by (3.8). We note the following:

E[An] = (A)(0.5) + (−A)(0.5) = 0    (3.10)

and

E[AnAm] = { A²,  n = m
          { 0,   n ≠ m.    (3.11)
To determine E[X(t1)X(t2)], the reader is referred to Figure 3.3 where Figure 3.3(a) shows the
case where t1=t2, Figure 3.3(b) shows the case where |t2-t1| > T and Figure 3.3(c) shows the case
where |t2-t1|<T.
In Figure 3.3, we can see that when t1 = t2, E[X(t1)X(t2)] = E[An²] = A². When |t2 − t1| > T, then
E[X(t1)X(t2)] = E[AnAm], where n ≠ m; therefore, E[X(t1)X(t2)] = 0. Figure 3.3(c) shows us that
the value of E[X(t1)X(t2)] depends on the value of tD. To determine E[X(t1)X(t2)] when |t2 − t1| < T,
we refer the reader to Figure 3.4. In Figure 3.4(a), 0 < tD < t1, and E[X(t1)X(t2)] = E[An²] = A².
In Figure 3.4(b), t1 < tD < t2, and E[X(t1)X(t2)] = E[AnAn−1] = 0. In Figure 3.4(c), t2 < tD < T, and
E[X(t1)X(t2)] = E[An−1²] = A².
Figure 3.3 (a) t1 = t2, (b) |t2 − t1| > T, (c) |t2 − t1| < T.
Therefore for 0 < t2 − t1 < T, the probability that t1 and t2 fall on the same bit is

P(same bit) = ∫_0^{t1} (1/T) dtD + ∫_{t2}^{T} (1/T) dtD = 1 − (t2 − t1)/T.    (3.12)

For the case where t2 < t1 and 0 < t1 − t2 < T, we can show that

P(same bit) = ∫_0^{t2} (1/T) dtD + ∫_{t1}^{T} (1/T) dtD = 1 − (t1 − t2)/T.    (3.13)
There are other scenarios to consider, such as when t1 < nT and t2 > nT, yet t2-t1 < T. Following
the same procedure as above, we determine that the probability that t1 and t2 fall in the same bit
is the same.
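This same-bit probability can be sanity-checked numerically. In the sketch below (t1, t2 and T are assumed example values), a sample taken at time t falls in bit n when nT + tD ≤ t < (n+1)T + tD, i.e. n = floor((t − tD)/T):

```python
import numpy as np

rng = np.random.default_rng(4)

T = 1.0
t1, t2 = 0.3, 0.7                  # assumed sample times with 0 < t2 - t1 < T
tD = rng.uniform(0, T, 1000000)    # random delay, uniform on (0, T)

# Bit index containing each sample time, for every realization of tD.
same_bit = np.floor((t1 - tD) / T) == np.floor((t2 - tD) / T)
p_est = same_bit.mean()
p_theory = 1 - (t2 - t1) / T
print(p_est, p_theory)
```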
Since E[X(t1)X(t2)] = A² when t1 and t2 fall on the same bit and 0 otherwise, the autocorrelation
function of the random antipodal binary signal employing rectangular pulses given by (3.8) is:

RX(τ) = E[X(t)X(t+τ)] = { A²(1 − |τ|/T),  |τ| < T
                        { 0,              |τ| ≥ T,    (3.14)

where τ = t2 − t1.
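A time-average estimate of RX(τ) from a single long realization of (3.8) should reproduce this triangular shape, provided the process is ergodic in autocorrelation as assumed here. All parameter values in this sketch are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

A, T, fs = 1.0, 1.0, 20          # assumed amplitude, bit period, samples per bit
n_bits = 50000
bits = rng.integers(0, 2, size=n_bits)
x = np.repeat(np.where(bits == 1, A, -A), fs)
x = x[rng.integers(0, fs):]      # random delay tD, uniform on (0, T)

lags = [0, fs // 2, fs, 2 * fs]  # tau = 0, T/2, T, 2T in samples
n = len(x) - max(lags)
R_est = [float(np.mean(x[:n] * x[lag:n + lag])) for lag in lags]
R_theory = [A**2 * max(0.0, 1 - lag / fs) for lag in lags]
print(R_est)
print(R_theory)
```

At τ = 0 the estimate is exactly A², it falls off linearly towards zero at τ = T, and it stays near zero beyond, matching the triangle of (3.14).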
In ELG 3175, students were introduced to the time-averaged autocorrelation function. For a
WSS process, X(t), the time-averaged autocorrelation function can now be replaced by the
statistically averaged autocorrelation function of the previous section. The properties that
applied to the time-averaged autocorrelation function can now be applied to the statistically
averaged one.
Property 1
The first property of the autocorrelation function is that |RX(τ)| ≤ RX(0). Since the autocorrelation
function is a measure of similarity, it stands to reason that X(t) is more similar to itself than it is
to a delayed version of itself.
Consider the expression E[(X(t) ± X(t+τ))²] ≥ 0. Expanding the square in the expectation, we
get:

E[X²(t)] ± 2E[X(t)X(t+τ)] + E[X²(t+τ)] ≥ 0.    (3.16)

Since X(t) is WSS, the first and last terms of (3.16) are equal. Therefore (3.16) becomes

2RX(0) ± 2RX(τ) ≥ 0,    (3.17)

or

|RX(τ)| ≤ RX(0).    (3.18)
Property 2
The autocorrelation function RX(τ) = E[X(t)X(t+τ)] and RX(−τ) = E[X(t)X(t−τ)]. If we replace t − τ
by t′ in RX(−τ), we get RX(−τ) = E[X(t′+τ)X(t′)] = E[X(t′)X(t′+τ)] = RX(τ). The autocorrelation
function is therefore an even function of τ.
Property 3
If there are no periodic components in X(t), then as τ approaches infinity, the interdependence
between X(t) and X(t+τ) diminishes to the point that X(t) and X(t+τ) are independent. Therefore

lim_{|τ|→∞} RX(τ) = lim_{|τ|→∞} E[X(t)X(t+τ)] = E[X(t)]E[X(t+τ)] = μX².    (3.19)
Exercise 3.2
X(t) is a random binary signal given by (3.8) except that An is given by:
{ .
In ELG3175, students learned that the power spectral density of a signal is the Fourier transform
of its time-averaged autocorrelation function. If X(t) is ergodic in autocorrelation, then its
statistically averaged autocorrelation function is equal to its time averaged autocorrelation
function.
SX(f) = F{RX(τ)} = ∫_{−∞}^{∞} RX(τ) e^{−j2πfτ} dτ    (3.20)
Example 3.2
What is the power spectral density of the random binary signal of (3.8)?
Solution
SX(f) = F{RX(τ)} = A²T sinc²(fT),    (3.21)

where sinc(x) = sin(πx)/(πx) and RX(τ) is the triangular autocorrelation function of (3.14).
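As a quick numerical check (a sketch with assumed values A = 1 and T = 1), the Fourier transform of the triangular RX(τ) of (3.14) can be approximated by a Riemann sum and compared with A²T sinc²(fT) at a few frequencies:

```python
import numpy as np

A, T = 1.0, 1.0                      # assumed amplitude and bit period

dtau = 1e-3
tau = np.arange(-T, T, dtau)
R = A**2 * (1 - np.abs(tau) / T)     # triangular autocorrelation from (3.14)

freqs = [0.0, 0.25, 0.5, 1.0]
# Riemann-sum approximation of the Fourier integral (3.20) at each frequency.
S_num = [float(np.sum(R * np.exp(-2j * np.pi * f * tau)).real * dtau)
         for f in freqs]
S_theory = [A**2 * T * np.sinc(f * T)**2 for f in freqs]  # np.sinc(x) = sin(pi x)/(pi x)
print(S_num)
print(S_theory)
```

Note that the nulls of the spectrum fall at multiples of the bit rate 1/T, as expected for a rectangular-pulse binary signal.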