Modern Digital Communication: Statistical Averages

This document discusses statistical averages of random variables and stochastic processes. It defines key terms like mean, variance, covariance, and correlation for random variables. It also discusses how the characteristic function can be used to determine the probability density function of independent random variables and the moments of a random variable. Finally, it provides an overview of stochastic processes and defines stationary stochastic processes.

MODERN DIGITAL COMMUNICATION

Statistical Averages

- Dr. P. Susheelkumar S,
- Faculty – Dept. of Electronics Engineering
- Datta Meghe College of Engineering
- Airoli, Navi Mumbai
Statistical Averages of Random Variables:

We know that the variance of X is:

\sigma_x^2 = \int_{-\infty}^{\infty} (x - m_x)^2 \, p(x)\, dx

Expanding the square:

\sigma_x^2 = \int_{-\infty}^{\infty} \left( x^2 - 2x m_x + m_x^2 \right) p(x)\, dx

= \int_{-\infty}^{\infty} x^2\, p(x)\, dx \;-\; 2m_x \int_{-\infty}^{\infty} x\, p(x)\, dx \;+\; m_x^2 \int_{-\infty}^{\infty} p(x)\, dx

= E(X^2) - 2m_x \cdot m_x + m_x^2 \cdot E(1)

= E(X^2) - 2m_x^2 + m_x^2

\Rightarrow \sigma_x^2 = E(X^2) - m_x^2 = E(X^2) - [E(X)]^2
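As a numeric illustration of the identity above, the sketch below checks \sigma_x^2 = E(X^2) - [E(X)]^2 on samples; the exponential distribution and sample size are assumptions chosen only for illustration, not from the text:

```python
import numpy as np

# Assumed example distribution: exponential with scale 2 (true variance 4).
rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=200_000)

mean = x.mean()                        # m_x = E(X)
second_moment = (x ** 2).mean()        # E(X^2)
var_identity = second_moment - mean ** 2
var_direct = ((x - mean) ** 2).mean()  # sigma_x^2 from the definition

# Both routes give the same number: they are algebraically identical
# when applied to sample averages, and both are close to 4.
print(abs(var_identity - var_direct) < 1e-9)
```

Either form can be used in practice; the `E(X^2) - m^2` form needs only one pass over the data.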
Statistical Averages of Random Variables:

The joint moment and joint central moment corresponding to (k, n) = (1, 1) are given as:

1) The correlation between variables X_i, X_j is expressed as:

E(X_i X_j) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x_i\, x_j\, p(x_i, x_j)\, dx_i\, dx_j

2) The covariance of variables X_i, X_j is expressed as:

\mu_{ij} = E[(X_i - m_i)(X_j - m_j)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x_i - m_i)(x_j - m_j)\, p(x_i, x_j)\, dx_i\, dx_j
Statistical Averages of Random Variables:

Expanding the product inside the covariance integral:

\mu_{ij} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \left( x_i x_j - m_i x_j - m_j x_i + m_i m_j \right) p(x_i, x_j)\, dx_i\, dx_j

= \int\!\!\int x_i x_j\, p(x_i, x_j)\, dx_i dx_j - \int\!\!\int m_i x_j\, p(x_i, x_j)\, dx_i dx_j - \int\!\!\int m_j x_i\, p(x_i, x_j)\, dx_i dx_j + \int\!\!\int m_i m_j\, p(x_i, x_j)\, dx_i dx_j

= E(X_i X_j) - m_i \int_{-\infty}^{\infty} x_j\, p(x_j)\, dx_j - m_j \int_{-\infty}^{\infty} x_i\, p(x_i)\, dx_i + m_i m_j \cdot [1]

= E(X_i X_j) - m_i m_j - m_j m_i + m_i m_j

\Rightarrow \mu_{ij} = E(X_i X_j) - m_i m_j
Statistical Averages of Random Variables:

Covariance:

\mu_{ij} = E[(X_i - m_i)(X_j - m_j)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x_i - m_i)(x_j - m_j)\, p(x_i, x_j)\, dx_i\, dx_j = E(X_i X_j) - m_i m_j

The n x n matrix with elements \mu_{ij} is called the covariance matrix of the random variables X_i, i = 1, 2, 3, ..., n.

Two random variables X_i, X_j are said to be uncorrelated if:

E(X_i X_j) = E(X_i)\, E(X_j) = m_i m_j

in which case their covariance \mu_{ij} = 0.
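A small sketch estimating the covariance \mu_{ij} = E(X_i X_j) - m_i m_j from samples; the jointly Gaussian example and its parameters are assumptions for illustration:

```python
import numpy as np

# Assumed example: two jointly Gaussian variables with true covariance 0.8.
rng = np.random.default_rng(1)
cov_true = [[2.0, 0.8],
            [0.8, 1.0]]
s = rng.multivariate_normal(mean=[1.0, -2.0], cov=cov_true, size=300_000)
xi, xj = s[:, 0], s[:, 1]

# mu_ij = E(Xi Xj) - m_i * m_j
mu_ij = (xi * xj).mean() - xi.mean() * xj.mean()
print(round(mu_ij, 2))  # close to the true covariance 0.8
```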
Statistical Averages of Random Variables:

Relation between the characteristic function and the moments of a random variable (first moment, or mean). The characteristic function is defined as:

\psi(jv) = E(e^{jvX}) = \int_{-\infty}^{\infty} e^{jvx}\, p(x)\, dx

Its first derivative with respect to v is:

\frac{d\psi(jv)}{dv} = j \int_{-\infty}^{\infty} x\, e^{jvx}\, p(x)\, dx

Evaluating at v = 0:

\left. \frac{d\psi(jv)}{dv} \right|_{v=0} = j \int_{-\infty}^{\infty} x\, p(x)\, dx = j\, E(X)

\Rightarrow E(X) = -j \left. \frac{d\psi(jv)}{dv} \right|_{v=0}

More generally, the nth moment follows from the nth derivative:

\left. \frac{d^n \psi(jv)}{dv^n} \right|_{v=0} = j^n\, E(X^n)
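The moment-from-derivative relation can be checked numerically: the sketch below recovers E(X) = -j\,\psi'(jv)|_{v=0} from the empirical characteristic function using a central finite difference. The Poisson example and the step size h are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.poisson(lam=3.0, size=400_000)   # assumed example, true E(X) = 3

def psi(v):
    """Empirical characteristic function psi(jv) = E[exp(jvX)]."""
    return np.exp(1j * v * x).mean()

h = 1e-4                                 # finite-difference step (assumed)
dpsi = (psi(h) - psi(-h)) / (2 * h)      # ~ d psi / dv at v = 0
mean_est = (-1j * dpsi).real             # E(X) = -j * psi'(0)
print(round(mean_est, 1))                # close to 3.0
```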
The characteristic function expanded in a Taylor series about the point v = 0 is:

\psi(jv) = \sum_{n=0}^{\infty} E(X^n)\, \frac{(jv)^n}{n!}

The characteristic function can also be used for determining the pdf of a sum of statistically independent random variables. Let:

Y = \sum_{i=1}^{n} X_i
Statistical Averages of Random Variables:

Since the random variables are statistically independent, the joint pdf factors:

p(x_1, x_2, \ldots, x_n) = p(x_1)\, p(x_2) \cdots p(x_n)

Hence:

\psi_Y(jv) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} \left( \prod_{i=1}^{n} e^{jvx_i} \right) p(x_1)\, p(x_2) \cdots p(x_n)\, dx_1\, dx_2 \cdots dx_n

\psi_Y(jv) = \int_{-\infty}^{\infty} e^{jvx_1} p(x_1)\, dx_1 \int_{-\infty}^{\infty} e^{jvx_2} p(x_2)\, dx_2 \cdots \int_{-\infty}^{\infty} e^{jvx_n} p(x_n)\, dx_n

\psi_Y(jv) = \psi_{X_1}(jv)\, \psi_{X_2}(jv) \cdots \psi_{X_n}(jv) = \prod_{i=1}^{n} \psi_{X_i}(jv)

And if all the random variables are identically distributed, then we get:

\psi_Y(jv) = [\psi_X(jv)]^n
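The product rule \psi_Y(jv) = \prod_i \psi_{X_i}(jv) can be verified empirically. Below, Y is the sum of three i.i.d. uniform(0,1) variables and the empirical \psi_Y is compared with the cube of the exact uniform characteristic function; the distribution and the test point v are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n, size = 3, 200_000
x = rng.uniform(0.0, 1.0, size=(size, n))
y = x.sum(axis=1)                        # Y = X1 + X2 + X3

v = 1.5                                  # arbitrary test point (assumed)
psi_y = np.exp(1j * v * y).mean()        # empirical psi_Y(jv)
psi_x = (np.exp(1j * v) - 1) / (1j * v)  # exact psi_X(jv) for uniform(0,1)
print(abs(psi_y - psi_x ** n) < 0.02)    # product rule, up to sampling noise
```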
Statistical Averages of Random Variables:

If X_i are random variables with joint pdf p(x_1, x_2, x_3, ..., x_n), then the n-dimensional characteristic function is defined as:

\psi(jv_1, jv_2, jv_3, \ldots, jv_n) = E\!\left( e^{j \sum_{i=1}^{n} v_i X_i} \right) = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} e^{j \sum_{i=1}^{n} v_i x_i}\, p(x_1, x_2, \ldots, x_n)\, dx_1\, dx_2 \cdots dx_n

For the two-dimensional case it is expressed as:

\psi(jv_1, jv_2) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} e^{j(v_1 x_1 + v_2 x_2)}\, p(x_1, x_2)\, dx_1\, dx_2

Statistical Averages of Random Variables:

Also, the partial derivatives of \psi(jv_1, jv_2) with respect to v_1 and v_2 are used to generate the joint moments, e.g.:

E(X_1 X_2) = - \left. \frac{\partial^2 \psi(jv_1, jv_2)}{\partial v_1\, \partial v_2} \right|_{v_1 = v_2 = 0}
Stochastic Processes:
A stochastic process is a random process that may be viewed as a random variable indexed by a parameter t, and is represented by X(t).

In general the parameter t is continuous, but X may be either continuous or discrete depending on the characteristics of the source that generates the stochastic process.
Stochastic Processes:
E.g.: the noise voltage generated by a single resistor or a single information source represents a single realization of a stochastic process, which may be called a sample function of the stochastic process.

The set of all possible sample functions, i.e. the set of all noise voltage waveforms generated by resistors, constitutes an ensemble of sample functions, also called equivalently a stochastic process X(t).

We may consider the values of the process at time instants t_i: t_1 > t_2 > t_3 > ... > t_n, where n is any positive integer.

In general, the random variables X_{t_i} = X(t_i), i = 1, 2, 3, ..., n are characterized statistically by their joint pdf p(x_{t_1}, x_{t_2}, ..., x_{t_n}).
Stationary Stochastic Process:
For any stochastic process X(t), the random variables X_{t_i}, i = 1, 2, ..., n at time instants t_1 > t_2 > ... > t_n are characterized statistically by their joint pdf p(x_{t_1}, x_{t_2}, ..., x_{t_n}).

Now consider another set of n random variables X_{t_i + t} = X(t_i + t), i = 1, 2, ..., n, where t is an arbitrary time shift. These random variables are characterized by their joint pdf p(x_{t_1 + t}, x_{t_2 + t}, ..., x_{t_n + t}).

These two pdfs, p(x_{t_1}, ..., x_{t_n}) and p(x_{t_1 + t}, ..., x_{t_n + t}), may or may not be equal.

When they are identical, i.e. when p(x_{t_1}, ..., x_{t_n}) = p(x_{t_1 + t}, ..., x_{t_n + t}) for all t and all n, the stochastic process is said to be stationary in the strict sense.
Statistical Averages for Stochastic Process:

For the two random variables X_{t_i} = X(t_i), i = 1, 2, the correlation between X_{t_1} and X_{t_2} is given by the joint moment:

\phi(t_1, t_2) = E(X_{t_1} X_{t_2}) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x_{t_1}\, x_{t_2}\, p(x_{t_1}, x_{t_2})\, dx_{t_1}\, dx_{t_2}

\phi(t_1, t_2) is the autocorrelation function of the stochastic process.
Statistical Averages for Stochastic Process:
But when the process X(t) is stationary, the joint pdf of the pair (X_{t_1}, X_{t_2}) is identical to that of the pair (X_{t_1 + t}, X_{t_2 + t}) for any arbitrary shift t.
Thus for a stationary stochastic process the autocorrelation does not depend on the specific time instants t_1 and t_2 but only on the time difference t_1 - t_2, and the joint moment is expressed as:

E(X_{t_1} X_{t_2}) = \phi(t_1, t_2) = \phi(t_1 - t_2) = \phi(\tau)

Thus \phi(\tau) is an even function, and \phi(0) = E(X_t^2) denotes the average power in the process X(t).
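A sketch of these properties on a concrete stationary process; unit-variance white Gaussian noise is an assumed example, not from the text:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(0.0, 1.0, size=500_000)   # assumed stationary process

def phi(tau):
    """Time-average estimate of phi(tau) at integer lag tau."""
    tau = abs(tau)                        # even symmetry: phi(-tau) = phi(tau)
    if tau == 0:
        return (x * x).mean()
    return (x[:-tau] * x[tau:]).mean()

print(round(phi(0), 1))                   # average power E(X_t^2), close to 1.0
print(phi(7) == phi(-7))                  # even symmetry (by construction here)
```

Note that the estimator uses a time average in place of the ensemble average; for this example the two coincide in the limit of long records.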
Statistical Averages for Stochastic Process:
Non-stationary processes with the property that the mean value of the process is independent of time (i.e. a constant), and whose autocorrelation function satisfies the condition \phi(t_1, t_2) = \phi(t_1 - t_2), are called wide-sense stationary processes.

The auto-covariance function of a stochastic process is defined as:

\mu(t_1, t_2) = E\{ [X_{t_1} - m(t_1)]\, [X_{t_2} - m(t_2)] \} = \phi(t_1, t_2) - m(t_1)\, m(t_2)

where m(t_1) and m(t_2) are the means of X_{t_1} and X_{t_2} respectively.

And when the process is stationary, the auto-covariance function simplifies to:

\mu(t_1, t_2) = \mu(\tau) = \phi(\tau) - m^2, \quad \tau = t_1 - t_2
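The stationary auto-covariance \mu(\tau) = \phi(\tau) - m^2 can be sketched numerically; the nonzero-mean white noise below is an assumed example, for which \mu(\tau) is approximately zero at any nonzero lag:

```python
import numpy as np

rng = np.random.default_rng(6)
m = 2.0
x = m + rng.normal(size=500_000)       # assumed stationary process, mean m = 2

tau = 3
phi_tau = (x[tau:] * x[:-tau]).mean()  # phi(tau) = E(X_t X_{t-tau}), ~ m^2 here
mu_tau = phi_tau - x.mean() ** 2       # mu(tau) = phi(tau) - m^2
print(abs(mu_tau) < 0.05)              # ~0: samples at different lags independent
```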
Averages for Joint Stochastic Process:
Let X(t) and Y(t) denote two stochastic processes, and let X_{t_i} = X(t_i), i = 1, 2, ..., n and Y_{t'_j} = Y(t'_j), j = 1, 2, ..., m represent the random variables at times t_1 > t_2 > ... > t_n and t'_1 > t'_2 > ... > t'_m respectively.

The two processes are characterized statistically by their joint pdf:

p(x_{t_1}, x_{t_2}, \ldots, x_{t_n}, y_{t'_1}, y_{t'_2}, \ldots, y_{t'_m})
Averages for Joint Stochastic Process:
The cross-correlation function of X(t) and Y(t) is the joint moment \phi_{xy}(t_1, t_2) = E(X_{t_1} Y_{t_2}). When the processes are jointly and individually stationary, we have:

\phi_{xy}(t_1, t_2) = \phi_{xy}(t_1 - t_2) = \phi_{xy}(\tau)

And in this case we note that:

\phi_{xy}(\tau) = E(X_{t_1} Y_{t_1 - \tau}) = E(X_{t_1 + \tau} Y_{t_1}) = \phi_{yx}(-\tau)

The stochastic processes X(t) and Y(t) are said to be statistically independent if and only if:

p(x_{t_1}, x_{t_2}, \ldots, x_{t_n}, y_{t'_1}, y_{t'_2}, \ldots, y_{t'_m}) = p(x_{t_1}, x_{t_2}, \ldots, x_{t_n})\, p(y_{t'_1}, y_{t'_2}, \ldots, y_{t'_m})

for all choices of t_i and t'_j and for all positive integers n and m.

The processes are uncorrelated if:

E(X_{t_1} Y_{t'_1}) = E(X_{t_1})\, E(Y_{t'_1})
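The symmetry \phi_{xy}(\tau) = \phi_{yx}(-\tau) can be illustrated with time averages; here Y is a noisy delayed copy of X, and the delay and noise level are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
n, delay = 200_000, 4
x = rng.normal(size=n)
y = np.roll(x, delay) + 0.1 * rng.normal(size=n)  # Y lags X by `delay`

def phi(a, b, tau):
    """Time-average estimate of E(A_t B_{t - tau}) for integer lag tau."""
    if tau > 0:
        return (a[tau:] * b[:-tau]).mean()
    if tau < 0:
        return (a[:tau] * b[-tau:]).mean()
    return (a * b).mean()

tau = -delay
# phi_xy(tau) and phi_yx(-tau) reduce to term-by-term the same time average.
print(abs(phi(x, y, tau) - phi(y, x, -tau)) < 1e-12)
print(round(phi(x, y, tau), 1))  # peak near 1.0 at the true delay
```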
Averages for Joint Stochastic Process:
Let us consider a complex-valued stochastic process Z(t) defined as:

Z(t) = X(t) + jY(t)

where X(t) and Y(t) are real stochastic processes. Therefore, the autocorrelation function for the above process is given as:

\phi_{zz}(t_1, t_2) = \tfrac{1}{2}\, E(Z_{t_1} Z^{*}_{t_2})

where * denotes the complex conjugate, and 1/2 is used as a convenient normalization factor.
