Lecture Notes 2
Shengli Zhou
January 25, 2017
Outline
The random process
Random passband signals
Basis functions and signal space
1 Random process
Questions to ask
What does it mean to describe a signal as a stochastic process?
How do we compute the autocorrelation and PSD after the process passes through a linear system?
Two ways of understanding a random process (R.P.):
Consider the random noise process $X(t)$ (e.g., measure the noise in the classroom).
[Figure: several sample realizations of the noise process $X(t)$ plotted versus time.]
Autocorrelation function:
$$R_X(t_1, t_2) = E[X(t_1)\, X^*(t_2)]$$
WSS (wide-sense stationary):
$m_X(t) = 0$ (constant mean)
$R_X(t_1, t_2)$ depends only on the difference $t_1 - t_2$
Example: $X(t) = A\cos(2\pi f_c t + \Theta)$ with $\Theta$ uniform on $[0, 2\pi)$:
$$E[X(t)X(t+\tau)] = \int_0^{2\pi} \frac{1}{2\pi}\, A^2 \cos(2\pi f_c t + \theta)\cos\big(2\pi f_c (t+\tau) + \theta\big)\, d\theta = \frac{A^2}{2}\cos(2\pi f_c \tau)$$
Hence $X(t)$ is w.s.s., with
$$S_X(f) = \frac{A^2}{4}\,\delta(f + f_c) + \frac{A^2}{4}\,\delta(f - f_c)$$
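A quick Monte Carlo check of this result (a minimal sketch; the amplitude, carrier frequency, time instant, and lag below are arbitrary choices, not from the notes):

```python
import numpy as np

# Estimate E[X(t)X(t+tau)] for X(t) = A cos(2*pi*fc*t + Theta),
# Theta uniform on [0, 2*pi), and compare with (A^2/2) cos(2*pi*fc*tau).
rng = np.random.default_rng(0)
A, fc = 2.0, 5.0          # arbitrary amplitude and carrier frequency
t, tau = 0.37, 0.10       # arbitrary time instant and lag
theta = rng.uniform(0.0, 2.0 * np.pi, size=200_000)

x_t = A * np.cos(2 * np.pi * fc * t + theta)
x_t_tau = A * np.cos(2 * np.pi * fc * (t + tau) + theta)

print(np.mean(x_t * x_t_tau))                    # empirical autocorrelation
print(A**2 / 2 * np.cos(2 * np.pi * fc * tau))   # (A^2/2) cos(2*pi*fc*tau)
```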
Power
Why do we define this quantity?
First, note that $R_X(0) = E[|X(t)|^2]$ is the average power (the variance for a zero-mean process), which is what we want to know.
What is the autocorrelation after the process passes through a linear system?
Second, consider
$$X(t) \to h(t) \to Y(t), \qquad Y(t) = h(t) \star X(t) = \int h(s)\, X(t-s)\, ds$$
The output autocorrelation is $R_Y(\tau) = R_X(\tau) \star h(\tau) \star h^*(-\tau)$, so in the frequency domain
$$S_Y(f) = |H(f)|^2\, S_X(f)$$
This $S_X(f)$ is known as the power spectral density (PSD), and is as meaningful as $R_X(\tau)$.
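A numerical sanity check of $S_Y(f) = |H(f)|^2 S_X(f)$ (a sketch using `numpy`/`scipy`; the FIR filter and the signal lengths are arbitrary choices, not from the notes):

```python
import numpy as np
from scipy import signal

# Filter white noise with an FIR filter and compare the measured PSD ratio
# S_Y(f)/S_X(f) against |H(f)|^2 in the passband.
rng = np.random.default_rng(1)
fs = 1.0                              # normalized sampling rate
x = rng.standard_normal(2**20)        # white input X(t)
h = signal.firwin(65, 0.125, fs=fs)   # arbitrary lowpass filter h(t)
y = signal.lfilter(h, 1.0, x)         # Y(t) = h(t) * X(t)

f, Sx = signal.welch(x, fs=fs, nperseg=1024)
_, Sy = signal.welch(y, fs=fs, nperseg=1024)
_, H = signal.freqz(h, worN=f, fs=fs)

band = f < 0.1                        # well inside the passband
print(np.allclose(Sy[band] / Sx[band], np.abs(H[band])**2, rtol=0.1))  # True
```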
Intuitive justification
Intuitive justification of the power spectral density: pass $X(t)$ through an ideal bandpass filter $H(f)$ with unit gain on $[f, f + \Delta f]$, producing $Y(t)$. Then
$$E[|Y(t)|^2] \approx \Delta f\; S_X(f)$$
White noise
Special situation: white noise
$$S_X(f) = \frac{N_0}{2}, \qquad R_X(\tau) = \frac{N_0}{2}\,\delta(\tau)$$
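In discrete-time simulations, white noise of PSD $N_0/2$ band-limited to the simulation bandwidth $f_s$ can be generated as i.i.d. Gaussian samples of variance $N_0 f_s / 2$ (a small sketch; the values of $N_0$ and $f_s$ are arbitrary):

```python
import numpy as np

# Samples of white noise with two-sided PSD N0/2, band-limited to [-fs/2, fs/2]:
# power = integral of N0/2 over that band = N0*fs/2.
rng = np.random.default_rng(2)
N0, fs = 4.0, 100.0                      # arbitrary noise level and sample rate
n = rng.normal(0.0, np.sqrt(N0 * fs / 2), size=100_000)
print(n.var(), N0 * fs / 2)              # both close to 200
```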
[Figure: $X(t) \to H(f) \to Y(t)$, with input PSD $S_X(f)$ and output PSD $S_Y(f)$.]

2 Random Passband Signals
What are the statistical relations between $X(t)$ and its in-phase, quadrature, and lowpass-equivalent components $X_i(t)$, $X_q(t)$, $X_l(t)$?
$$R_X(\tau) = \tfrac{1}{2}\,\mathrm{Re}\!\left\{ R_{X_l}(\tau)\, e^{j2\pi f_c \tau} \right\}$$
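As a consistency check (this connecting step is mine, not from the notes): for the earlier example $X(t) = A\cos(2\pi f_c t + \Theta)$, the lowpass equivalent is $X_l(t) = A e^{j\Theta}$, so $R_{X_l}(\tau) = E[X_l(t+\tau) X_l^*(t)] = A^2$, and the formula gives $R_X(\tau) = \frac{1}{2}\mathrm{Re}\{A^2 e^{j2\pi f_c \tau}\} = \frac{A^2}{2}\cos(2\pi f_c \tau)$, matching the result obtained directly in Section 1.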
Additional property
$$S_X(f) = \frac{1}{4}\left[ S_{X_l}(f - f_c) + S_{X_l}(-f - f_c) \right]$$
Example: bandpass noise $N(t)$ with
$$S_N(f) = \frac{N_0}{2}, \quad |f - f_c| \le B/2$$
$$S_{N_i}(f) = N_0, \quad |f| \le B/2$$
$$R_{N_i}(\tau) = N_0\,\frac{\sin(\pi B \tau)}{\pi \tau} = N_0 B\,\mathrm{sinc}(B\tau)$$
$$R_{N_i}(0) = N_0 B$$
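A small numerical check of $R_{N_i}(\tau)$ (a sketch; the values of $N_0$, $B$, and the lags are arbitrary):

```python
import numpy as np

# Invert the flat PSD S_Ni(f) = N0, |f| <= B/2, numerically and compare
# with the closed form R_Ni(tau) = N0 * B * sinc(B * tau).
N0, B = 2.0, 10.0
f = np.linspace(-B / 2, B / 2, 20001)
df = f[1] - f[0]
tau = np.array([0.0, 0.03, 0.10, 0.25])

# R(tau) = integral over the band of S(f) exp(j 2 pi f tau) df (Riemann sum)
R_num = (N0 * np.exp(1j * 2 * np.pi * np.outer(tau, f))).sum(axis=1).real * df
R_formula = N0 * B * np.sinc(B * tau)    # np.sinc(x) = sin(pi x)/(pi x)

print(R_num)        # ~ [20.0, 17.2, ...], matches the line below
print(R_formula)    # R_Ni(0) = N0 * B = 20
```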
3 Discrete Representation of Continuous Signals
Given an orthonormal set $\{\phi_k(t)\}_{k \in I}$, the best approximation of $s(t)$ is
$$\hat{s}(t) = \sum_k \langle s, \phi_k \rangle\, \phi_k(t)$$
such that the error energy $E_e = \int |s(t) - \hat{s}(t)|^2\, dt$ is minimized.
Scalar product:
$$\langle x, y \rangle = \int x(t)\, y^*(t)\, dt$$
Optimal projections:
$$s_k = \langle s, \phi_k \rangle = \int s(t)\, \phi_k^*(t)\, dt$$
$$(E_e)_{\min} = E_s - \sum_k |s_k|^2$$
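A discrete-time sketch of these formulas (the signal and the orthonormal set below are my own illustrations, not from the notes):

```python
import numpy as np

# Project s(t) onto an orthonormal set {phi_k(t)} on [0, T] and verify
# (E_e)_min = E_s - sum_k |s_k|^2.
T, N = 1.0, 999
dt = T / N
t = np.arange(N) * dt

s = t * (1.0 - t)                          # an arbitrary signal s(t)

# Orthonormal set: three non-overlapping rectangular pulses of unit energy
phis = np.zeros((3, N))
for k in range(3):
    phis[k, k * (N // 3):(k + 1) * (N // 3)] = 1.0 / np.sqrt(T / 3)

sk = phis @ s * dt                         # s_k = <s, phi_k>
s_hat = sk @ phis                          # s_hat(t) = sum_k s_k phi_k(t)

Es = np.sum(s**2) * dt                     # signal energy E_s
Ee = np.sum((s - s_hat)**2) * dt           # approximation error energy
print(Ee, Es - np.sum(sk**2))              # the two values agree
```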
Ex 1: Fourier series
$$s(t) = \sum_k c_k\, \frac{1}{\sqrt{T}}\, e^{j2\pi \frac{k}{T} t}, \qquad t \in [0, T]$$
$$c_k = \langle s, \phi_k \rangle = \int_0^T s(t)\, \frac{1}{\sqrt{T}}\, e^{-j2\pi \frac{k}{T} t}\, dt$$
Ex 2: Fourier transform
$$X(f) = \int x(t)\, e^{-j2\pi f t}\, dt$$
$$x(t) = \int X(f)\, e^{j2\pi f t}\, df$$
So $\left\{ \dfrac{\sin\!\big(2\pi W (t - \tfrac{n}{2W})\big)}{2\pi W (t - \tfrac{n}{2W})} \right\}_{n \in \mathbb{Z}}$ is an orthogonal set (orthonormal after scaling by $\sqrt{2W}$).
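A quick numerical check of this orthogonality (a sketch; $W$ and the integration grid are arbitrary):

```python
import numpy as np

# Check that the shifted sinc functions are orthogonal with norm^2 = 1/(2W):
#   integral of sinc(2W t - n) * sinc(2W t - m) dt = (1/2W) * delta_{nm}.
W = 1.0
t = np.arange(-200.0, 200.0, 0.01)   # long grid so the truncated tails are negligible
dt = t[1] - t[0]

def phi(n):
    # np.sinc(x) = sin(pi x)/(pi x), so this equals
    # sin(2 pi W (t - n/(2W))) / (2 pi W (t - n/(2W)))
    return np.sinc(2 * W * t - n)

for n, m in [(0, 0), (0, 1), (1, 3)]:
    print(n, m, np.sum(phi(n) * phi(m)) * dt)   # ~0.5 = 1/(2W) if n == m, else ~0
```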
Whiteboard pictures
Gram-Schmidt Orthogonalization
How do we find $\{\phi_k(t)\}$ in general?
(i)
$$\phi_1(t) = s_1(t) \Big/ \sqrt{\textstyle\int_0^T s_1^2(t)\, dt}$$
(ii)
$$g_2(t) = s_2(t) - s_{21}\,\phi_1(t), \qquad \phi_2(t) = g_2(t) \Big/ \sqrt{\textstyle\int_0^T g_2^2(t)\, dt}$$
(iii)
$$g_3(t) = s_3(t) - s_{31}\,\phi_1(t) - s_{32}\,\phi_2(t), \qquad \phi_3(t) = g_3(t) \Big/ \sqrt{\textstyle\int_0^T g_3^2(t)\, dt}$$
where $s_{kj} = \langle s_k, \phi_j \rangle$,
and so on.
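A minimal discrete-time sketch of this procedure (signals are stored as sampled vectors; the function name and tolerance are my own choices, not from the notes):

```python
import numpy as np

def gram_schmidt(signals, dt, tol=1e-10):
    """Gram-Schmidt orthogonalization of sampled signals s_1(t), s_2(t), ...

    Returns the orthonormal basis functions phi_k(t) as rows; a signal that
    is (numerically) a linear combination of earlier ones is skipped.
    """
    basis = []
    for s in signals:
        g = np.asarray(s, dtype=float).copy()
        for phi in basis:
            g -= (np.sum(s * phi) * dt) * phi   # subtract s_kj * phi_j(t)
        energy = np.sum(g**2) * dt              # integral of g_k^2(t) dt
        if energy > tol:
            basis.append(g / np.sqrt(energy))   # phi_k(t) = g_k(t)/sqrt(energy)
    return np.array(basis)
```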
[Figure: example signals $s_1(t), s_2(t), s_3(t), s_4(t)$ on $[0, 3]$, built from the unit pulse $b(t)$, and the resulting basis functions $\phi_1(t)$, $\phi_2(t)$, $\phi_3(t)$.]
2. Step 2: $m = 2$
$$s_{m1} = \int s_2(t)\,\phi_1(t)\, dt = 1$$
$$\hat{s}_m(t) = \phi_1(t)$$
$$e(t) = s_m(t) - \hat{s}_m(t) = b(t - 1)$$
5. Step 3: $N = 3$, $\phi_3(t) = e(t)/\|e(t)\| = b(t - 2)$
6. Step 2: $m = 4$, ..., $e(t) = 0$
7. Step 3: since $e(t) = 0$, go to step 2.
8. Step 2: $m = 5 > M$, stop. Output $N = 3$ and $\phi_1(t)$, $\phi_2(t)$, and $\phi_3(t)$.
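For example, applying the sketch above to four pulse-built signals gives three basis functions, matching the $N = 3$ found here (the specific waveforms below are hypothetical stand-ins, since the original figure is not reproduced):

```python
import numpy as np

# Hypothetical signals on [0, 3] built from unit pulses b(t - t0); these are
# NOT the exact waveforms of the original figure, only an illustration.
dt = 0.001
t = np.arange(0.0, 3.0, dt)
b = lambda t0: ((t >= t0) & (t < t0 + 1)).astype(float)   # unit pulse on [t0, t0+1)

s1 = b(0)
s2 = b(0) + b(1)
s3 = b(0) - b(2)
s4 = b(0) + b(1) - b(2)        # = s2 + s3 - s1, linearly dependent

basis = gram_schmidt([s1, s2, s3, s4], dt)
print(basis.shape[0])          # 3 -> three orthonormal functions suffice
```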