
Probability and Random Processes

Notation

• X, Y – random events
• P(…) – probability of an event
• $p_X(x)$ – probability density (or mass) function of X
• $p_{X,Y}(x,y)$ – joint distribution function of X and Y
• $F_X(x)$ – cumulative distribution function of X, $P(X \le x)$
• E(X) – expectation of X
• $\mu_X$ – mean of X
• VAR(X), $\sigma_X^2$ – variance of X
• COV(X, Y) – covariance of X and Y
Probability Theory Review
• P(event) = number of favorable outcomes / number of all possible outcomes
• A probability measure P(A) is assigned to an event A.
• It is a value between 0 and 1 that indicates how likely the event is to occur.
• If P(A) is close to 0, it is very unlikely that the event A occurs.
• If P(A) is close to 1, A is very likely to occur.
Probability
• The sample space is denoted by S.
• An event is an outcome or a collection of outcomes.
• For a roll of a die, the sample space is S = {1, 2, 3, 4, 5, 6}.
• The event might be that the number on top is even; call this event A:
• A = {2, 4, 6}.
• The event is a subset of S. A direct computation is sketched below.
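As a minimal sketch, the ratio definition of probability can be checked directly in Python for this die example (S and A as above):

```python
# P(A) = favorable outcomes / all possible outcomes for the die example.
S = {1, 2, 3, 4, 5, 6}   # sample space
A = {2, 4, 6}            # event: an even number on top

p_A = len(A) / len(S)    # each outcome is equally likely
print(p_A)               # 0.5
```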
Probability
• The probability of an event A, denoted P(A), specifies the chance or likelihood that A will occur.
• The numerical value of the probability indicates how likely or unlikely that event is.
• For any event A, 0 ≤ P(A) ≤ 1.
• The probabilities of all possible outcomes add up to 1: P(S) = 1.
• A probability model describes all possible outcomes and assigns probabilities to them.
Compound events
A compound event is an event with two or more favorable outcomes.
There are two types of compound events: mutually exclusive compound events and mutually inclusive compound events.
A mutually exclusive compound event is one in which the two events cannot happen at the same time.
Disjoint or mutually exclusive events
• Disjoint events are events that never occur at the same time.
• Such events can be represented by a Venn diagram in which the regions for event A and event B do not overlap.
• Because these two events never occur together, they are mutually exclusive events.
Independent events
• Independent events are unrelated events; the outcome of one event does not affect the outcome of the other event.
• Independent events can occur together.
• For example, the marks you score do not depend on what your classmate scored.
Conditional probability
• Conditional probability is a measure of the probability of an event occurring, given that another event has already occurred.
• The conditional probability of A given B is written as P(A|B) and is given by

P(A|B) = P(A and B) / P(B)
Rules of probability
Addition rule
P(A or B) = P(A) + P(B) − P(A and B)
If the events are mutually exclusive, P(A and B) = 0, and thus
P(A or B) = P(A) + P(B)
Rules of probability
• Multiplication rule
• P(A|B) = P(A and B) / P(B)
• P(A and B) = P(A|B) × P(B)
• P(A and B) = P(B|A) × P(A)
• For independent events, P(A|B) = P(A) and P(B|A) = P(B),
• therefore P(A and B) = P(A) × P(B)
Example
• Mary is to choose the units she will take in the coming semester.
• The probability that she enrolls for Analogue Electronics is 0.4, and the probability that she enrolls for Biomechanics is 0.7.
• The probability that she enrolls for Analogue Electronics given that she has enrolled for Biomechanics is 0.3.
Example
i. Calculate the probability that she will enroll for both Analogue Electronics and Biomechanics.
ii. Calculate the probability that she will enroll for either Analogue Electronics or Biomechanics.
iii. Are the two events mutually exclusive?
iv. Are they independent?
Solution
i. P(A and B) = P(A|B) × P(B)
   – P(A) = 0.4
   – P(B) = 0.7
   – P(A|B) = 0.3
   P(A and B) = 0.3 × 0.7 = 0.21
ii. P(A or B) = P(A) + P(B) − P(A and B)
   P(A or B) = 0.4 + 0.7 − 0.21 = 0.89
iii. No. For mutually exclusive events P(A and B) = 0, but here P(A and B) = 0.21.
iv. No. For independent events P(A and B) = P(A) × P(B) = 0.28, but here P(A and B) = 0.21.
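A quick numerical check of this solution (a minimal Python sketch using the probabilities given in the example):

```python
# Verifying the worked example with the rules of probability.
p_A = 0.4          # P(enrolls for Analogue Electronics)
p_B = 0.7          # P(enrolls for Biomechanics)
p_A_given_B = 0.3  # P(A | B)

p_A_and_B = p_A_given_B * p_B      # multiplication rule -> 0.21
p_A_or_B = p_A + p_B - p_A_and_B   # addition rule -> 0.89

print(p_A_and_B, p_A_or_B)
print("mutually exclusive:", p_A_and_B == 0)               # False
print("independent:", abs(p_A_and_B - p_A * p_B) < 1e-12)  # False
```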
Bayes' theorem
• P(A|B) = (P(B|A) × P(A)) / P(B)
• where A, B are the events,
• P(A|B) is the probability of A given that B has occurred,
• P(B|A) is the probability of B given that A has occurred,
• P(A), P(B) are the marginal probabilities of A and B.
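A small sketch of the theorem in Python, reusing the numbers from the enrollment example; the value of P(B|A) here is hypothetical, chosen to be consistent with P(A and B) = 0.21 above:

```python
# Bayes' theorem: recover P(A|B) from P(B|A), P(A) and P(B).
p_A = 0.4
p_B = 0.7
p_B_given_A = 0.525   # hypothetical value: P(A and B) / P(A) = 0.21 / 0.4

p_A_given_B = p_B_given_A * p_A / p_B
print(p_A_given_B)    # 0.3, consistent with the earlier example
```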
Probability Distributions
• A probability distribution is a statistical function that describes all the possible values and likelihoods that a random variable can take within a given range.
• One of the most common distributions is the normal distribution.
Random processes
• A random process is a phenomenon that varies unpredictably as time goes on.
• A random variable is a rule or function that assigns a real number to every outcome of a random experiment.
• A random process assigns a time function to every outcome of a random experiment.
• A random process is a collection or ensemble of random variables X(s, t) that are functions of a real variable, time t, where s ∈ S (the sample space) and t ∈ T (the index set). A sketch of such an ensemble is given below.
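The sketch below builds an ensemble of sample functions in Python with NumPy; the random-phase cosine model and all parameters are illustrative assumptions, not part of the notes:

```python
import numpy as np

# Each outcome s picks a random phase; X(s, t) = cos(2*pi*t + phase(s))
# is the sample function assigned to that outcome.
rng = np.random.default_rng(0)

t = np.linspace(0.0, 2.0, 500)          # index set T, sampled
phases = rng.uniform(0, 2 * np.pi, 10)  # one outcome s per sample function

ensemble = np.array([np.cos(2 * np.pi * t + ph) for ph in phases])
print(ensemble.shape)  # (10, 500): 10 sample functions over 500 time points
```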

Classification of random processes
• Random processes are classified depending on the continuous or discrete nature of the state space S and the parameter set T.
• If both are discrete, the random process is called a discrete random sequence.
• Example: the outcome of the nth toss of a fair die.
• {Xn, n ≥ 1} is a discrete random sequence with
• T = {1, 2, 3, …}
• S = {1, 2, 3, 4, 5, 6}

Classification of random processes
• If T is discrete and S is continuous, the random process is called a continuous random sequence. E.g. the temperature at the end of every nth hour of the day: {Xn, 1 ≤ n ≤ 24}. The temperature can take any value.
• If T is continuous and S is discrete, the random process is called a discrete random process. E.g. if X(t) represents the number of telephone calls received in an interval (0, t), then X(t) is a discrete random process with S = {1, 2, 3, …}.
• If both S and T are continuous, the random process is called a continuous random process. E.g. if X(t) represents the maximum temperature in an interval (0, t), then X(t) is a continuous random process.

Random Variable
• A random variable X(A) represents the functional relationship between a random event A and a real number.
• The random variable may be discrete or continuous.
Distribution Function of a random variable
• The distribution function $F_X(x)$ of a random variable X is given by
• $F_X(x) = P(X \le x)$
• where $P(X \le x)$ is the probability that the random variable X is less than or equal to the real number x.
• The distribution function $F_X(x)$ has the following properties:
• $0 \le F_X(x) \le 1$
• $F_X(x_1) \le F_X(x_2)$ if $x_1 \le x_2$
• $F_X(-\infty) = 0$
• $F_X(+\infty) = 1$
Probability density function (pdf)
• The probability density function (pdf), denoted $p_X(x)$, is given by

$p_X(x) = \frac{dF_X(x)}{dx}$

• $P(x_1 \le X \le x_2) = P(X \le x_2) - P(X \le x_1) = F_X(x_2) - F_X(x_1) = \int_{x_1}^{x_2} p_X(x)\,dx$

• $p_X(x)$ has the following properties:
• $p_X(x) \ge 0$
• $\int_{-\infty}^{\infty} p_X(x)\,dx = F_X(+\infty) - F_X(-\infty) = 1$
• Sometimes $p(x)$ is used instead of $p_X(x)$. A numerical check of these relations is sketched below.
Probability mass function (PMF)
• P(X = xi) denotes the probability that the random variable X takes the value xi, where X can take on discrete values only.
• For discrete random variables, the PMF is also called the probability distribution.
• A PMF can be represented by a table or by a graph.
Probability mass function (PMF)
For example, let X be the sum of rolling two fair six-sided dice. The distribution of X is shown in the table below, and an enumeration script follows it.

X    | 2    | 3    | 4    | 5    | 6    | 7    | 8    | 9    | 10   | 11   | 12
P(X) | 1/36 | 2/36 | 3/36 | 4/36 | 5/36 | 6/36 | 5/36 | 4/36 | 3/36 | 2/36 | 1/36
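The table can be reproduced by enumerating the 36 equally likely outcomes, as in this short Python sketch:

```python
from collections import Counter
from fractions import Fraction

# Count how many of the 36 ordered rolls give each sum.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
pmf = {s: Fraction(c, 36) for s, c in sorted(counts.items())}

for s, p in pmf.items():
    print(s, p)           # e.g. 7 -> 1/6 (i.e. 6/36)
print(sum(pmf.values()))  # 1, as required of a PMF
```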
Joint probability
• The joint CDF of X and Y is
• $F_{X,Y}(x,y) = P[X \le x,\, Y \le y]$
• The covariance of X and Y is defined as:

$\mathrm{COV}(X,Y) = E[(X-\mu_X)(Y-\mu_Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x-\mu_X)(y-\mu_Y)\, p_{X,Y}(x,y)\, dx\, dy$

• Two random variables X and Y are independent if
$p_{X,Y}(x,y) = p_X(x)\, p_Y(y) \quad \forall x, y$
Conditional Distribution
• For events A and B the conditional probability is defined as

$P[A|B] = \frac{P[A,B]}{P[B]}$

• The conditional distribution of a discrete random variable X given Y, denoted $p(x|Y=y)$, is the distribution function of X given that we know Y has taken on the value y, and is defined as

$p(x \mid Y=y) = \frac{p_{X,Y}(x,y)}{p_Y(y)}$
Random Processes
• A random process can be seen as a function of two variables: the event A and time t.
• For a specific event Aj we have a single time function X(Aj, t) = Xj(t), i.e. a sample function.
• The totality of all sample functions is called an ensemble.
• For a specific event A = Aj and a specific time tk, X(Aj, tk) is a number.
• For convenience, the random process is designated X(t).
Random Processes
Statistical Averages
• The value of a random process at a given time is unknown, but its pdf is known.
• The mean of a random process X(t) is

$E\{X(t_k)\} = \int_{-\infty}^{\infty} x\, p_{X_k}(x)\, dx = \mu_X(t_k)$

• $p_{X_k}(x)$ is the pdf over the ensemble of events at time $t_k$.
• The autocorrelation function of the random process X(t), a function of the two variables t₁ and t₂, is defined as
• $R_X(t_1, t_2) = E\{X(t_1)\, X(t_2)\}$
• where X(t₁) and X(t₂) are random variables obtained by observing X(t) at times t₁ and t₂.
• The autocorrelation function is a measure of the degree to which two samples of the same random process are related. Both averages are estimated over an ensemble in the sketch below.
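The sketch below estimates the ensemble mean and the autocorrelation for the random-phase cosine process used earlier (again an illustrative model, not from the notes):

```python
import numpy as np

# Ensemble of 5000 sample functions of a random-phase cosine process.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
ensemble = np.array([np.cos(2 * np.pi * 5 * t + ph)
                     for ph in rng.uniform(0, 2 * np.pi, 5000)])

mean_t = ensemble.mean(axis=0)  # mu_X(t_k), ~0 at every t_k for this model

t1, t2 = 20, 60                 # two sample indices
R_t1_t2 = np.mean(ensemble[:, t1] * ensemble[:, t2])  # R_X(t1, t2)
print(mean_t[:3], R_t1_t2)      # here R_X depends only on t1 - t2
```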
Moments of a random variable
E{·} is called the expected value operator.
The nth moment of a probability distribution of a random variable X is defined by

$E\{X^n\} = \int_{-\infty}^{\infty} x^n\, p_X(x)\, dx$

The most important moments are the first two, n = 1 and n = 2.
n = 1 gives the mean value. The mean value $m_X$, or expected value, of the random variable X is defined by

$m_X = E\{X\} = \int_{-\infty}^{\infty} x\, p_X(x)\, dx$

n = 2 gives the mean square value:

$E\{X^2\} = \int_{-\infty}^{\infty} x^2\, p_X(x)\, dx$
Central moments
Central moments are moments of the difference between X and $\mu_X$.
The second central moment is called the variance:

$\mathrm{var}(X) = E\{(X-\mu_X)^2\} = \int_{-\infty}^{\infty} (x-\mu_X)^2\, p_X(x)\, dx$

var(X) is denoted by $\sigma_X^2$.
$\sigma_X$ is called the standard deviation of X.
Variance is a measure of the randomness of the random variable X.
Relation between variance and mean
• $\sigma_X^2 = E\{X^2 - 2\mu_X X + \mu_X^2\}$
• $\quad\;\; = E\{X^2\} - 2\mu_X E\{X\} + \mu_X^2$
• $\quad\;\; = E\{X^2\} - \mu_X^2$
• The variance is equal to the difference between the mean square value and the square of the mean.
Expectation and Variance of a Discrete Random Variable
• $E(X) = \sum_{x} x\, p_X(x)$
• $\mathrm{VAR}(X) = E[(X-\mu)^2] = \sum_{x} (x-\mu)^2\, p_X(x) = E(X^2) - \mu_X^2$
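A short sketch computing these sums for the two-dice PMF tabulated earlier:

```python
import numpy as np

# Mean and variance of the sum of two fair dice from its PMF.
x = np.arange(2, 13)
p = np.array([1, 2, 3, 4, 5, 6, 5, 4, 3, 2, 1]) / 36

mean = np.sum(x * p)        # E(X) = 7
mean_sq = np.sum(x**2 * p)  # E(X^2)
var = mean_sq - mean**2     # VAR(X) = E(X^2) - mu^2 = 35/6
print(mean, var)
```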
Stationarity
A random process X(t) is said to be stationary in the strict sense if none of its statistics are affected by a shift in the time origin.
A random process is said to be wide-sense stationary (WSS) if two of its statistics, the mean and the autocorrelation function, do not vary with time.
A random process is WSS if
$E\{X(t)\} = m_X$, a constant,
and $R_X(t_1, t_2) = R_X(t_1 - t_2)$
Autocorrelation of a WSS process
• For a wide-sense stationary process the autocorrelation function is only a function of the time difference $\tau = t_1 - t_2$:
• $R_X(\tau) = E\{X(t)\, X(t+\tau)\}$ for $-\infty < \tau < \infty$
• For a zero-mean WSS process, $R_X(\tau)$ indicates the extent to which the random values of the process separated by $\tau$ seconds are statistically correlated.
WSS Process
Properties of a real-valued WSS process:
$R_X(\tau) = R_X(-\tau)$: symmetrical in $\tau$ about 0
$|R_X(\tau)| \le R_X(0)$ for all $\tau$: the maximum value occurs at the origin
$R_X(\tau) \leftrightarrow G_X(f)$: the autocorrelation and the power spectral density form a Fourier transform pair
$R_X(0) = E\{X^2(t)\}$: the value at the origin is equal to the average power of the signal
Time Averaging and Ergodicity
A process is ergodic if its time average is equal to the ensemble average, so that its statistical properties can be determined by time averaging over a single sample function of the process.
A random process is ergodic in the mean if:

$m_X = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} X(t)\, dt$

and ergodic in the autocorrelation function if:

$R_X(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} X(t)\, X(t+\tau)\, dt$
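A sketch of the corresponding time averages computed from a single long realization; white Gaussian noise is used as an illustrative (ergodic) process:

```python
import numpy as np

# One long realization of a zero-mean, unit-variance white noise process.
rng = np.random.default_rng(2)
n = 200_000
x = rng.standard_normal(n)

m_x = x.mean()                  # time-average estimate of the mean, ~0

def R_hat(lag):                 # time-average autocorrelation at integer lag
    return np.mean(x[:n - lag] * x[lag:])

print(m_x, R_hat(0), R_hat(5))  # ~0, ~1 (the variance), ~0 (white noise)
```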
Ergodicity
For ergodic random processes:
• $\mu_X = E\{X(t)\}$ is equal to the DC level of the signal.
• $\mu_X^2$ is equal to the normalized power in the DC component.
• The second moment of X(t), $E\{X^2(t)\}$, is equal to the total average power.
• The quantity $\sqrt{E\{X^2(t)\}}$ is equal to the rms value of the voltage or current signal.
• $\sigma_X^2$ is equal to the average normalized power in the time-varying or AC component of the signal.
Ergodicity
• If the process has zero mean ($\mu_X = \mu_X^2 = 0$), then $\sigma_X^2 = E\{X^2\}$: the variance is the same as the mean square value, and it represents the total power in the normalized load.
• The standard deviation $\sigma_X$ is the rms value of the AC component of the signal.
• If $\mu_X = 0$, then $\sigma_X$ is the rms value of the signal.
Spectral Density
This is the distribution of energy or power in the frequency domain.

Energy Spectral Density (ESD)
The total energy of a real-valued signal x(t) defined over the interval $-\infty$ to $+\infty$ is given by

$E_x = \lim_{T \to \infty} \int_{-T/2}^{T/2} x^2(t)\, dt = \int_{-\infty}^{\infty} x^2(t)\, dt$

Using Parseval's theorem it can be expressed as

$E_x = \int_{-\infty}^{\infty} x^2(t)\, dt = \int_{-\infty}^{\infty} |X(f)|^2\, df$

where X(f) is the Fourier transform of the non-periodic signal x(t).
Let $\psi_x(f) = |X(f)|^2$. Then

$E_x = \int_{-\infty}^{\infty} \psi_x(f)\, df$

$\psi_x(f)$ is the energy spectral density.
The energy of a signal is the area under the $\psi_x(f)$ versus frequency curve. A numerical Parseval check is sketched below.
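A numerical Parseval check for a finite-energy pulse; the Gaussian pulse and the sample rate are illustrative assumptions:

```python
import numpy as np

fs = 1000.0                         # sample rate in Hz (assumed)
t = np.arange(-2.0, 2.0, 1 / fs)
x = np.exp(-t**2)                   # a finite-energy Gaussian pulse

E_time = np.sum(x**2) / fs          # integral of x^2(t) dt

X = np.fft.fft(x) / fs              # approximates the continuous-time X(f)
df = fs / len(x)
E_freq = np.sum(np.abs(X)**2) * df  # integral of |X(f)|^2 df

print(E_time, E_freq)               # the two energies agree
```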
Power spectral density
The average power $P_x$ of a real-valued power signal is

$P_x = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x^2(t)\, dt$

If x(t) is a periodic signal with period $T_0$, the signal is classified as a power signal.
Parseval's theorem for a periodic signal:

$P_x = \frac{1}{T_0} \int_{-T_0/2}^{T_0/2} x^2(t)\, dt = \sum_{n=-\infty}^{\infty} |c_n|^2$

where the $|c_n|^2$ terms are the squared magnitudes of the complex Fourier series coefficients of the periodic signal.
The power spectral density is

$G_x(f) = \sum_{n=-\infty}^{\infty} |c_n|^2\, \delta(f - n f_0)$

For a non-periodic signal,

$G_x(f) = \lim_{T \to \infty} \frac{1}{T} |X_T(f)|^2$
Power Spectral Density of a Random Process
• X(t) can generally be classified as a power signal having a PSD $G_X(f)$.
• Features:
• $G_X(f) \ge 0$ and is always real-valued
• $G_X(f) = G_X(-f)$ for X(t) real-valued
• $G_X(f) \leftrightarrow R_X(\tau)$: the PSD and the autocorrelation form a Fourier transform pair
• $P_x = \int_{-\infty}^{\infty} G_X(f)\, df$: the relation between normalized power and the PSD
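A sketch tying the periodogram form of the PSD to the power relation $P_x = \int G_X(f)\, df$, using sampled white noise as an assumed test signal:

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 1000.0                    # sample rate in Hz (assumed)
n = 4096
x = rng.standard_normal(n)     # unit-variance white noise (assumed model)

# Periodogram estimate of G_X(f): (1/T) |X_T(f)|^2 with T = n / fs.
X = np.fft.fft(x) / fs
G = np.abs(X)**2 / (n / fs)

df = fs / n
P_from_psd = np.sum(G) * df    # integral of G_X(f) df
P_time = np.mean(x**2)         # average power in the time domain
print(P_from_psd, P_time)      # both ~1, equal by Parseval
```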
