Random Variables
Textbook: 2.1-2.6, 5.1-5.3
Signal and Noise in
Communication Systems
In communication systems, the received waveform is usually categorized into the desired part containing the information and the extraneous or undesired part. The desired part is called the signal, and the undesired part is called noise.
Noise is one of the most critical and fundamental concepts affecting communication systems.
Much of the subject of communication systems concerns methods to overcome the distorting effects of noise.
To do so, an understanding of random variables and random processes is essential.
2.1 Signals
Axioms of Probability
1) $P(A) \ge 0$ for any event $A$
2) $P(\Omega) = 1$ for the sure event $\Omega$
3) Let $A$ and $B$ be two mutually exclusive events, i.e. $A \cap B = \emptyset$. Then $P(A \cup B) = P(A) + P(B)$.
Law of Total Probability
Let $A_1, A_2, \ldots, A_n$ be mutually exclusive events with $\bigcup_{i=1}^{n} A_i = \Omega$. Then, for any event $B$,
$P(B) = \sum_{i=1}^{n} P(B \mid A_i)\, P(A_i)$
A r.v. may be
Discrete-valued: range is finite (e.g. $\{0, 1\}$) or countably infinite (e.g. $\{1, 2, 3, \ldots\}$)
Continuous-valued: range is uncountably infinite (e.g. the real line $\mathbb{R}$)
Probability Density Function (PDF)
The PDF of a r.v. $X$ is defined as
$f_X(x) \triangleq \dfrac{d F_X(x)}{dx}$   or   $F_X(x) = \int_{-\infty}^{x} f_X(y)\,dy$
[Figure: the CDF $F_X(x)$ and the PDF $f_X(x)$; the area under $f_X(x)$ between $x_1$ and $x_2$ equals $P(x_1 < X \le x_2)$.]
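A minimal numerical check of this relationship (a sketch not in the slides, assuming a standard Gaussian as the example r.v. and using SciPy only for reference values):

```python
import numpy as np
from scipy.stats import norm

# Verify F_X(x) = integral of f_X(y) dy from -inf to x, for a standard Gaussian.
x = np.linspace(-5, 5, 2001)
dx = x[1] - x[0]
pdf = norm.pdf(x)

cdf_numeric = np.cumsum(pdf) * dx                   # crude numerical integration
print(np.max(np.abs(cdf_numeric - norm.cdf(x))))    # ~1e-3 (grid error)

# The area under f_X between x1 and x2 is P(x1 < X <= x2).
x1, x2 = -1.0, 1.0
print(norm.cdf(x2) - norm.cdf(x1))                  # ~0.683 (one-sigma interval)
```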
The joint PDF is
$f_{XY}(x, y) = \dfrac{\partial^2 F_{XY}(x, y)}{\partial x\, \partial y}$
Key properties of the joint distribution:
$\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} p_{XY}(x, y)\,dx\,dy = 1$
$P(x_1 < X \le x_2,\ y_1 < Y \le y_2) = \int_{y_1}^{y_2} \int_{x_1}^{x_2} p_{XY}(x, y)\,dx\,dy$
$P_Y(y) = \int_{-\infty}^{y} \int_{-\infty}^{\infty} p_{XY}(\alpha, \beta)\,d\alpha\,d\beta$
Marginal density:
$p_X(x) = \int_{-\infty}^{\infty} p_{XY}(x, \beta)\,d\beta$
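A short numerical check of these properties (a sketch not in the slides, assuming a toy joint density of two independent standard Gaussians):

```python
import numpy as np
from scipy.stats import norm

# Toy joint density p_XY(x, y) = N(0,1) * N(0,1) (independent), on a grid.
x = np.linspace(-6, 6, 601)
y = np.linspace(-6, 6, 601)
dx = x[1] - x[0]
dy = y[1] - y[0]
X, Y = np.meshgrid(x, y, indexing="ij")
p_xy = norm.pdf(X) * norm.pdf(Y)

# Total probability integrates to 1.
print(p_xy.sum() * dx * dy)                 # ~1.0

# Marginal p_X(x): integrate p_XY over beta; should match the Gaussian pdf.
p_x = p_xy.sum(axis=1) * dy
print(np.max(np.abs(p_x - norm.pdf(x))))    # ~0 (small grid error)
```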
For the special case of $g(X) = X^n$, we obtain the $n$th moment of $X$, that is
$E[X^n] = \int_{-\infty}^{\infty} x^n p_X(x)\,dx$
Let $n = 2$; we have the mean-square value of $X$ as
$E[X^2] = \int_{-\infty}^{\infty} x^2 p_X(x)\,dx$
The correlation of $X$ and $Y$ is
$R_{XY} = E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y\, f_{XY}(x, y)\,dx\,dy$
$X$ and $Y$ are said to be orthogonal if $R_{XY} = E[XY] = 0$.
Some Useful Probability Distributions
Discrete Distribution
Binary distribution
Binomial distribution
Continuous Distribution
Uniform distribution
Gaussian distribution (most important one)
Rayleigh distribution (very important in mobile and wireless communications)
Binary distribution: $P(X = 1) = p$, $P(X = 0) = 1 - p$. This is frequently used to model binary data.
Mean: $E[X] = p$
Variance: $\sigma_X^2 = p(1 - p)$
Binomial distribution: let $Y = \sum_{i=1}^{n} X_i$, where the $X_i$ are independent binary r.v.s. Then
$P(Y = k) = \binom{n}{k} p^k (1 - p)^{n-k}$
where $\binom{n}{k} = \frac{n!}{k!(n-k)!}$.
That is, the probability that $Y = k$ is the probability that $k$ of the $X_i$ are equal to 1 and $n - k$ are equal to 0.
Mean: $E[Y] = np$
Variance: $\sigma_Y^2 = np(1 - p)$
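A quick empirical check of these moments (a sketch with illustrative values $n = 20$, $p = 0.3$, not from the slides):

```python
import numpy as np

# A Binomial(n, p) r.v. is the sum of n independent binary (Bernoulli) r.v.s.
rng = np.random.default_rng(0)
n, p, trials = 20, 0.3, 100_000
y = rng.binomial(n, p, size=trials)

print(y.mean(), n * p)              # empirical vs. theoretical mean np
print(y.var(), n * p * (1 - p))     # empirical vs. theoretical variance np(1-p)
```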
Example
Suppose that we transmit a 31-bit-long sequence with error correction capability of up to 3 bit errors. If the probability of a bit error is $p = 0.001$, what is the probability that this sequence is received in error?
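A short computation of the answer, assuming "received in error" means more than 3 bit errors occur (so the errors are uncorrectable):

```python
from scipy.stats import binom

n, p, t = 31, 0.001, 3   # block length, bit error probability, correctable errors

# P(block error) = P(more than t bit errors) = 1 - F(t)
p_fail = binom.sf(t, n, p)   # survival function: P(K > 3)
print(p_fail)                # ~3e-8
```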
The pdf of the uniform distribution is given by
$p_X(x) = \begin{cases} \frac{1}{b-a}, & a \le x \le b \\ 0, & \text{otherwise} \end{cases}$
Any example?
[Figure: pdf plotted against $x$, with the mean $m_X$ marked.]
The Q-Function
The Q-function is a standard form to express error probabilities without a closed form:
$Q(x) = \int_{x}^{\infty} \frac{1}{\sqrt{2\pi}} \exp\!\left(-\frac{u^2}{2}\right) du$
The Q-function is the area under the tail of a Gaussian pdf with mean zero and variance one.
Upper bound (the standard Chernoff-type bound): $Q(x) \le \frac{1}{2} e^{-x^2/2}$ for $x \ge 0$
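In numerical work the Q-function is usually evaluated through the complementary error function; a minimal sketch, also comparing against the upper bound above:

```python
import numpy as np
from scipy.special import erfc

def qfunc(x):
    """Q(x) = P(N(0,1) > x), via the identity Q(x) = erfc(x / sqrt(2)) / 2."""
    return 0.5 * erfc(x / np.sqrt(2))

for x in [0.0, 1.0, 2.0, 3.0]:
    print(x, qfunc(x), 0.5 * np.exp(-x**2 / 2))   # Q(x) vs. the upper bound
```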
For jointly Gaussian random variables, let
$\mathbf{x}$ be a column vector,
$\mathbf{m}$ the vector of the means, and
$\mathbf{C}$ the covariance matrix.
Then
$p(\mathbf{x}) = \dfrac{1}{(2\pi)^{n/2} \det(\mathbf{C})^{1/2}} \exp\!\left( -\frac{1}{2} (\mathbf{x} - \mathbf{m})^T \mathbf{C}^{-1} (\mathbf{x} - \mathbf{m}) \right)$
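A numerical sketch of this density (with an illustrative 2-D mean and covariance, not from the slides):

```python
import numpy as np
from scipy.stats import multivariate_normal

m = np.array([0.0, 1.0])              # mean vector m (illustrative)
C = np.array([[2.0, 0.5],
              [0.5, 1.0]])            # covariance matrix C (illustrative)
x = np.array([0.5, 0.5])              # evaluation point

print(multivariate_normal(mean=m, cov=C).pdf(x))

# The same value computed directly from the formula above (n = 2).
d = x - m
val = np.exp(-0.5 * d @ np.linalg.inv(C) @ d) \
      / np.sqrt((2 * np.pi) ** 2 * np.linalg.det(C))
print(val)
```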
Consider a sum $Y$ of $n$ random variables $X_i$, and assume that the $X_i$'s are uncorrelated with the same mean and variance.
So what? Then as $n \to \infty$, the distribution of $Y$ will tend towards a Gaussian distribution (the central limit theorem).
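A small simulation of this effect (a sketch, using uniform r.v.s as the illustrative $X_i$):

```python
import numpy as np

# Sum n i.i.d. uniform(0,1) variables; the sum is approximately Gaussian.
rng = np.random.default_rng(1)
n, trials = 30, 100_000
y = rng.uniform(0.0, 1.0, size=(trials, n)).sum(axis=1)

# Standardize: uniform(0,1) has mean 1/2 and variance 1/12.
z = (y - n * 0.5) / np.sqrt(n / 12.0)
print((z > 2).mean())   # ~0.0228 = Q(2), matching a standard Gaussian tail
```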
[Figure: an ensemble of sample functions; $x_n(t)$ is the outcome of the $n$th experiment, and sampling at times $t_1$, $t_2$ gives the random variables $X(t_1)$, $X(t_2)$.]
Statistics of Random Processes
By sampling the random process at any time, we get a random variable.
From this viewpoint, we can think of a random process as an infinite collection of random variables specified at time $t$: $\{X(t_1), X(t_2), \ldots, X(t_n)\}$.
Thus, a random process can be completely defined statistically as a collection of random variables indexed by time, with properties defined by a joint PDF.
First Order Statistics of Random Processes
The first order statistics is simply the PDF of the random variable at one particular time:
$f(x; t)$ = first order density of $X(t)$
$F(x; t) = P(X(t) \le x)$, first order distribution of $X(t)$
Mean: $E[X(t_0)] = \int_{-\infty}^{\infty} x f_X(x; t_0)\,dx = \overline{X}(t_0)$
Variance: $E\!\left[ \left( X(t_0) - \overline{X}(t_0) \right)^2 \right] = \sigma_X^2(t_0)$
$R_X(t; \tau) = E[X(t) X(t+\tau)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x_1 x_2 f(x_1, x_2; t, t+\tau)\,dx_1\,dx_2$
Therefore, the first-order statistics is independent of $t$:
Mean: $E\{X(t)\} = \int_{-\infty}^{\infty} x f_X(x)\,dx = m_X$   (2)
$R_X(t_1, t_2) \triangleq E[X(t_1) X(t_2)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x_1 x_2\, p(x_1, x_2; t_1, t_2)\,dx_1\,dx_2$
Time averaging:
$\langle X(t) \rangle \triangleq \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\,dt$
$\langle X(t) X(t-\tau) \rangle \triangleq \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\, x(t-\tau)\,dt$
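These time averages can be estimated from one long sample path; a minimal sketch, with synthetic white noise standing in for $x(t)$:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(200_000)     # one sample path (synthetic white noise)

time_avg_mean = x.mean()                        # <X(t)>
lag = 5                                         # tau, in samples
time_avg_corr = np.mean(x[lag:] * x[:-lag])     # <X(t) X(t - tau)>
print(time_avg_mean, time_avg_corr)  # both ~0 for zero-mean white noise, lag != 0
```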
$|X_T(f)|^2$: energy spectral density
Power spectral density (PSD): $S_X(f) = \lim_{T \to \infty} \frac{1}{2T} E\!\left[ |X_T(f)|^2 \right]$   (4)   [Watts/Hz]
Output of an LTI system with impulse response $h(t)$:
$Y(t) = h(t) * X(t) = \int_{-\infty}^{\infty} h(\tau) X(t-\tau)\,d\tau$
Mean of the output: if $X(t)$ is WSS,
$E[Y(t)] = \overline{X} \int_{-\infty}^{\infty} h(\tau)\,d\tau = \overline{X} \cdot H(0)$
$E[Y(t) Y(u)] = \int_{-\infty}^{\infty} h(\tau_1)\,d\tau_1 \int_{-\infty}^{\infty} h(\tau_2)\, E[X(t-\tau_1) X(u-\tau_2)]\,d\tau_2$
Autocorrelation of $Y(t)$: if $X(t)$ is WSS,
$R_Y(\tau) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\tau_1) h(\tau_2) R_X(\tau - \tau_1 + \tau_2)\,d\tau_1\,d\tau_2$
$= \int_{-\infty}^{\infty} h(\tau_2)\,d\tau_2 \int_{-\infty}^{\infty} h(\tau_1) R_X(\tau + \tau_2 - \tau_1)\,d\tau_1$
$= h(-\tau) * h(\tau) * R_X(\tau)$
PSD of $Y(t)$: $S_Y(f) = |H(f)|^2 S_X(f)$
Key result:
[Block diagram: $X(t)$ with PSD $S_X(f)$ passes through the filter $h(t)$, giving $Y(t)$ with PSD $S_Y(f) = |H(f)|^2 S_X(f)$.]
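A numerical check of this key result (a sketch: white noise through an illustrative Butterworth lowpass filter, with PSDs estimated by Welch's method):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
fs = 1000.0
x = rng.standard_normal(1_000_000)    # white noise: flat S_X(f)

b, a = signal.butter(4, 100, fs=fs)   # illustrative 4th-order lowpass, 100 Hz
y = signal.lfilter(b, a, x)

f, Sxx = signal.welch(x, fs=fs, nperseg=4096)
_, Syy = signal.welch(y, fs=fs, nperseg=4096)
_, H = signal.freqz(b, a, worN=f, fs=fs)

# The ratio of output PSD to input PSD should track |H(f)|^2 (skip the DC bin,
# which Welch's default detrending makes unreliable).
print(np.mean(np.abs(Syy[1:] / Sxx[1:] - np.abs(H[1:]) ** 2)))  # small
```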
Bandlimited Noise
[Block diagram: white noise passes through a filter of bandwidth $B$ Hz, giving bandlimited white noise $n(t)$.]
In most applications,
$N_0 = kT = 4.14 \times 10^{-21}$ W/Hz $= -174$ dBm/Hz
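A one-line check of this figure (assuming $k = 1.38 \times 10^{-23}$ J/K and $T = 300$ K):

```python
import math

k = 1.38e-23            # Boltzmann constant, J/K
T = 300                 # assumed temperature, K
N0 = k * T              # thermal noise PSD, W/Hz
print(N0)                               # ~4.14e-21 W/Hz
print(10 * math.log10(N0 / 1e-3))       # ~ -173.8, i.e. about -174 dBm/Hz
```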
[Figure: PSD of the bandpass noise, centered at $\pm f_c$.]
Find the statistics of $n_c(t)$ and $n_s(t)$.
Result 1: $E\{n(t)\} = E\{n_c(t)\} = E\{n_s(t)\} = 0$
Proof: $E[n(t)] = E[n_c(t)] \cos\omega_0 t - E[n_s(t)] \sin\omega_0 t$
Proof: [Block diagram: $n(t)$ is multiplied by $2\cos\omega_0 t$ and passed through the lowpass filter $H_L(f)$ (unit gain for $-B/2 \le f \le B/2$) to give $Z_1(t)$ and hence $n_c(t)$; multiplying $n(t)$ by $-2\sin\omega_0 t$ and lowpass filtering likewise gives $Z_2(t)$ and $n_s(t)$.]
Result 4: $E\{n^2(t)\} = E\{n_c^2(t)\} = E\{n_s^2(t)\} = \sigma^2$
$R(t) = \sqrt{n_c^2(t) + n_s^2(t)}$: envelope
$\phi(t) = \tan^{-1}\!\left( \dfrac{n_s(t)}{n_c(t)} \right)$, $0 \le \phi(t) \le 2\pi$: phase
[Figure: a sample of the bandlimited noise $n(t)$ and its envelope $R(t)$; the envelope varies on a time scale of about $1/B$, while the underlying oscillation has period about $1/f_0$.]
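When $n_c(t)$ and $n_s(t)$ are modeled as independent zero-mean Gaussians of equal variance, the envelope is Rayleigh-distributed and the phase is uniform; a small simulation sketch:

```python
import numpy as np

rng = np.random.default_rng(4)
sigma = 1.0
nc = rng.normal(0.0, sigma, 100_000)       # in-phase component samples
ns = rng.normal(0.0, sigma, 100_000)       # quadrature component samples

R = np.hypot(nc, ns)                       # envelope sqrt(nc^2 + ns^2)
phi = np.arctan2(ns, nc) % (2 * np.pi)     # phase mapped to [0, 2*pi)

print(R.mean(), sigma * np.sqrt(np.pi / 2))   # Rayleigh mean = sigma*sqrt(pi/2)
print(phi.mean(), np.pi)                      # uniform phase has mean pi
```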