Lecture 3
Random Vectors
• Let $X_1, X_2, \ldots, X_n$ be random variables defined on the same probability space. We define a random vector (RV) as
$$\mathbf{X} = \begin{bmatrix} X_1 \\ X_2 \\ \vdots \\ X_n \end{bmatrix}$$
• The marginals can be obtained from the joint in the usual way. For the previous example,
$$F_{X_1}(x_1) = \lim_{x_2, x_3 \to \infty} F_{\mathbf{X}}(x_1, x_2, x_3)$$
$$f_{X_1, X_2}(x_1, x_2) = \int_{-\infty}^{\infty} f_{X_1, X_2, X_3}(x_1, x_2, x_3)\, dx_3$$
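As a concrete discrete analogue, here is a minimal numpy sketch that obtains marginals from a joint pmf by summing out variables; the 2×2×2 table is made up purely for illustration:

```python
import numpy as np

# Hypothetical joint pmf p(x1, x2, x3) over binary X1, X2, X3,
# indexed as p_joint[x1, x2, x3]; values are illustrative and sum to 1
p_joint = np.array([[[0.10, 0.05],
                     [0.15, 0.20]],
                    [[0.05, 0.10],
                     [0.20, 0.15]]])

# Sum out X3 -- the discrete analogue of integrating over x3
p_x1x2 = p_joint.sum(axis=2)

# Marginal of X1 alone: sum over both x2 and x3
p_x1 = p_joint.sum(axis=(1, 2))

print(p_x1x2)   # 2x2 table, f_{X1,X2}
print(p_x1)     # vector, f_{X1}
```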
• Independence is defined in the usual way; e.g., $X_1, X_2, \ldots, X_n$ are independent if
$$f_{\mathbf{X}}(\mathbf{x}) = \prod_{i=1}^{n} f_{X_i}(x_i) \quad \text{for all } (x_1, \ldots, x_n)$$
• Important special case, i.i.d. r.v.s: $X_1, X_2, \ldots, X_n$ are said to be independent and identically distributed (i.i.d.) if they are independent and have the same marginals
Example: if we flip a coin $n$ times independently, we generate i.i.d. Bern($p$) r.v.s $X_1, X_2, \ldots, X_n$ (simulated in the sketch below)
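A quick numpy simulation of this coin-flip example; the values $p = 0.3$ and $n = 10{,}000$ are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 0.3, 10_000          # illustrative choices, not from the lecture

# n independent coin flips => i.i.d. Bern(p) random variables
x = rng.binomial(1, p, size=n)

# Sample mean should be close to p by the law of large numbers
print(x[:10], x.mean())
```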
• R.v.s $X_1$ and $X_3$ are said to be conditionally independent given $X_2$ if
$$f_{X_1, X_3 \mid X_2}(x_1, x_3 \mid x_2) = f_{X_1 \mid X_2}(x_1 \mid x_2)\, f_{X_3 \mid X_2}(x_3 \mid x_2) \quad \text{for all } (x_1, x_2, x_3)$$
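The canonical example is a Markov chain $X_1 \to X_2 \to X_3$, where $X_1$ and $X_3$ are conditionally independent given $X_2$. A small numpy check of the factorization, with made-up transition probabilities:

```python
import numpy as np

# Hypothetical binary Markov chain X1 -> X2 -> X3 (numbers are illustrative)
p1 = np.array([0.6, 0.4])                 # p(x1)
P12 = np.array([[0.7, 0.3], [0.2, 0.8]])  # p(x2 | x1), rows indexed by x1
P23 = np.array([[0.9, 0.1], [0.4, 0.6]])  # p(x3 | x2), rows indexed by x2

# Joint pmf p(x1, x2, x3) = p(x1) p(x2|x1) p(x3|x2), indexed [x1, x2, x3]
joint = p1[:, None, None] * P12[:, :, None] * P23[None, :, :]
p2 = joint.sum(axis=(0, 2))               # marginal p(x2)

# Left side: p(x1, x3 | x2)
lhs = joint / p2[None, :, None]

# Right side: p(x1 | x2) p(x3 | x2)
p_x1_given_x2 = joint.sum(axis=2) / p2[None, :]
p_x3_given_x2 = joint.sum(axis=0) / p2[:, None]
rhs = p_x1_given_x2[:, :, None] * p_x3_given_x2[None, :, :]

print(np.allclose(lhs, rhs))              # True for a Markov chain
```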
• Exercise: which of the following matrices can be covariance matrices?
$$1.\; \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \qquad 2.\; \begin{bmatrix} 1 & 2 & 1 \\ 2 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix} \qquad 3.\; \begin{bmatrix} 1 & 0 & 1 \\ 1 & 2 & 1 \\ 0 & 1 & 3 \end{bmatrix}$$
$$4.\; \begin{bmatrix} -1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix} \qquad 5.\; \begin{bmatrix} 1 & 1 & 1 \\ 1 & 2 & 1 \\ 1 & 1 & 3 \end{bmatrix} \qquad 6.\; \begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \\ 3 & 6 & 9 \end{bmatrix}$$
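Recall that a valid covariance matrix must be symmetric and positive semidefinite. A minimal numpy verification of the six candidates (a checking sketch, not part of the lecture):

```python
import numpy as np

candidates = {
    1: [[1, 0, 0], [0, 1, 0], [0, 0, 1]],
    2: [[1, 2, 1], [2, 1, 1], [1, 1, 1]],
    3: [[1, 0, 1], [1, 2, 1], [0, 1, 3]],
    4: [[-1, 1, 1], [1, 1, 1], [1, 1, 1]],
    5: [[1, 1, 1], [1, 2, 1], [1, 1, 3]],
    6: [[1, 2, 3], [2, 4, 6], [3, 6, 9]],
}

for k, m in candidates.items():
    A = np.array(m, dtype=float)
    symmetric = np.allclose(A, A.T)
    # eigvalsh assumes a symmetric input, so only test PSD when symmetric;
    # the small tolerance guards against floating-point round-off
    psd = symmetric and np.linalg.eigvalsh(A).min() >= -1e-12
    print(k, "valid covariance matrix" if psd else "not a valid covariance matrix")
```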
• Example: Let
$$\Sigma = \begin{bmatrix} 2 & 1 \\ 1 & 3 \end{bmatrix}$$
To find the eigenvalues of $\Sigma$, we find the roots of the polynomial equation
$$\det(\Sigma - \lambda I) = \lambda^2 - 5\lambda + 5 = 0,$$
which gives $\lambda_1 = 3.62$, $\lambda_2 = 1.38$
To find the eigenvectors, consider
$$\begin{bmatrix} 2 & 1 \\ 1 & 3 \end{bmatrix} \begin{bmatrix} u_{11} \\ u_{12} \end{bmatrix} = 3.62 \begin{bmatrix} u_{11} \\ u_{12} \end{bmatrix},$$
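These eigenvalues and eigenvectors can be verified numerically; a minimal numpy sketch:

```python
import numpy as np

Sigma = np.array([[2.0, 1.0],
                  [1.0, 3.0]])

# eigh is for symmetric matrices: eigenvalues in ascending order,
# orthonormal eigenvectors as the columns of U
eigvals, U = np.linalg.eigh(Sigma)
print(eigvals)                             # approximately [1.382, 3.618]

# Verify the eigendecomposition Sigma = U diag(lambda) U^T
Lam = np.diag(eigvals)
print(np.allclose(Sigma, U @ Lam @ U.T))   # True
```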
◦ Coloring: a white RV $X$ with $\Sigma_X = I$ is transformed into $Y$ with $\Sigma_Y = \Sigma$:
$$X \;\xrightarrow{\;Z = \Lambda^{1/2} X\;}\; Z \;\xrightarrow{\;Y = UZ\;}\; Y, \qquad \text{with inverses } X = \Lambda^{-1/2} Z, \;\; Z = U^T Y$$
Overall, $X \xrightarrow{\;\Sigma^{1/2}\;} Y$ maps $\Sigma_X = I$ to $\Sigma_Y = \Sigma$
◦ Whitening: conversely, $Y \xrightarrow{\;\Sigma^{-1/2}\;} X$ maps $\Sigma_Y = \Sigma$ to $\Sigma_X = I$
◦ The lower triangular square root and its inverse can be efficiently computed using the Cholesky decomposition (see the sketch below)
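A minimal numpy sketch of coloring and whitening via the Cholesky factor, reusing the $\Sigma$ from the example above:

```python
import numpy as np

rng = np.random.default_rng(0)
Sigma = np.array([[2.0, 1.0],
                  [1.0, 3.0]])

# Lower triangular square root: Sigma = L @ L.T
L = np.linalg.cholesky(Sigma)

# Coloring: white samples X (cov I) -> Y = L X with cov Sigma
X = rng.standard_normal((2, 100_000))
Y = L @ X
print(np.cov(Y))                  # approximately Sigma

# Whitening: solving L X_rec = Y applies L^{-1}, recovering cov I
X_rec = np.linalg.solve(L, Y)
print(np.cov(X_rec))              # approximately the identity
```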
• Example: Let
$$X \sim \mathcal{N}\!\left(\mathbf{0}, \begin{bmatrix} 2 & 1 \\ 1 & 3 \end{bmatrix}\right)$$
Find the joint pdf of
$$Y = \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix} X$$
Of course this is not sufficient to show that $Y$ is a GRV; we must also show that the joint pdf has the right form. We do so using the characteristic function for a random vector:
$$\Phi_Y(\omega) = \Phi_X(A^T \omega) = e^{-\frac{1}{2}(A^T\omega)^T \Sigma (A^T\omega) + i\omega^T A\mu} = e^{-\frac{1}{2}\omega^T (A\Sigma A^T)\omega + i\omega^T A\mu},$$
which is the characteristic function of a GRV with mean $A\mu$ and covariance matrix $A\Sigma A^T$
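For this example the mean is $A\mathbf{0} = \mathbf{0}$ and the covariance works out to $A\Sigma A^T = \begin{bmatrix} 7 & 3 \\ 3 & 2 \end{bmatrix}$. A quick numpy sanity check (the Monte Carlo sample size is an arbitrary choice):

```python
import numpy as np

Sigma = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
A = np.array([[1.0, 1.0],
              [1.0, 0.0]])

# Covariance of Y = A X when X ~ N(0, Sigma)
Sigma_Y = A @ Sigma @ A.T
print(Sigma_Y)                    # [[7. 3.] [3. 2.]]

# Monte Carlo check: sample X, transform, compare sample covariance
rng = np.random.default_rng(0)
X = rng.multivariate_normal(np.zeros(2), Sigma, size=200_000).T
print(np.cov(A @ X))              # approximately [[7, 3], [3, 2]]
```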
• The proof of Property 4 follows from Properties 1 and 2 and the orthogonality principle (HW exercise)