04-A Probability Models

Probability Models

Probability models form the foundation of information theory. They are used in communications and signal processing systems to characterise and predict signals in diverse application areas such as image and video processing.
Random Process
Signals that convey information can be classified as either deterministic or random. Deterministic signals can only act as information carriers, whereas random signals can themselves be information-bearing. In each class, a signal may be continuous or discrete in time, continuous-valued or discrete-valued, and one-dimensional or multi-dimensional.
A deterministic signal, such as a sine wave, can be described by a function of time, so its value at any instant is predictable. Random signals have unpredictable fluctuations, so there is no exact equation that predicts their future values. Examples of random signals are speech, music and noise. Random signals often exhibit a set of well-defined statistical properties such as maximum, minimum, mean, median, variance, correlation, power spectrum and higher-order statistics. Thus they can be described using these statistics and a corresponding statistical model.
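As an illustration (not part of the original notes), the short Python sketch below uses NumPy, which is assumed to be available, to compute some of these statistics for a noisy sine wave; the signal model and parameter values are arbitrary choices made only for demonstration.

import numpy as np

# Hypothetical example: a sine wave (deterministic) corrupted by Gaussian noise (random).
rng = np.random.default_rng(0)
n = np.arange(1000)
signal = np.sin(2 * np.pi * 0.01 * n) + 0.3 * rng.standard_normal(n.size)

# Well-defined statistical properties of the random signal.
print("max     :", signal.max())
print("min     :", signal.min())
print("mean    :", signal.mean())
print("median  :", np.median(signal))
print("variance:", signal.var())

# Autocorrelation at lag 1: correlation between successive samples.
lag1 = np.corrcoef(signal[:-1], signal[1:])[0, 1]
print("lag-1 autocorrelation:", lag1)

# Power spectrum estimate from the squared magnitude of the FFT.
power_spectrum = np.abs(np.fft.rfft(signal)) ** 2 / signal.size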
Stochastic and Random Process
A random process may be described as any process or function that generates random signals. A stochastic process, on the other hand, is the term used for a random process that generates sequential random signals, such as a sequence of speech samples. In signal processing systems, these processes are also the probability models of classes of random signals, such as the Gaussian process, Markov process, Poisson process, binomial process, multinomial process and other similar processes.
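As a hedged sketch (added for illustration, not taken from the source), the following Python/NumPy code draws short sample sequences from several of the processes named above; all parameter values are assumptions chosen only to make the example concrete.

import numpy as np

rng = np.random.default_rng(1)

# Gaussian process (here: independent Gaussian samples, e.g. white noise).
gaussian_seq = rng.normal(loc=0.0, scale=1.0, size=10)

# Poisson process (event counts per unit interval, assumed rate 3).
poisson_counts = rng.poisson(lam=3.0, size=10)

# Binomial process (successes in 20 trials, success probability 0.5).
binomial_counts = rng.binomial(n=20, p=0.5, size=10)

# Multinomial process (counts over 3 categories, 20 trials per draw).
multinomial_counts = rng.multinomial(n=20, pvals=[0.2, 0.3, 0.5], size=10)

# A simple two-state Markov process: the next state depends only on the current state.
transition = np.array([[0.9, 0.1],   # P(next state | current state 0)
                       [0.2, 0.8]])  # P(next state | current state 1)
state, markov_seq = 0, []
for _ in range(10):
    state = rng.choice(2, p=transition[state])
    markov_seq.append(state)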
Probability models of random signals
Probability models are used to calculate the odds of the different outcomes in a game of chance. They provide a mathematical description of the distribution of the likelihood of the different outcomes of a random process. A good probability model should reflect the fraction of times that each outcome is observed to occur.
It should be noted that people often quantify their intuitive belief in the probability of the outcome of a process as a number between 0 and 1, or as an equivalent percentage. A probability of 0 expresses the impossibility of an event occurring, while a probability of 1 means that the event is certain to happen.
Probability models enable the estimation of the likely values of a process from noisy or incomplete observations. They can describe random processes that are discrete-valued, continuous-valued or finite-state continuous-valued, and they are expressed as functions of the statistical parameters of the random process.
Probability and random variables
Random variables are outcomes of a random process, and there is a degree of uncertainty as to what the next value of the variable will be. The space of a random variable is the collection of all values, or outcomes, that the variable can assume. It can be partitioned, according to some criterion, into a number of subspaces, so a subspace can be described as a collection of values with a common attribute. Each subspace is called an event. The probability of an event A, P(A), is the ratio of the number of observed outcomes that fall in the space of A, N_A, to the total number of observations N:

P(A) = N_A / N

The sum of the probabilities of all outcomes in an experiment is one.
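To make the relative-frequency interpretation concrete, here is a minimal Python sketch (an illustrative assumption, not from the source) that estimates P(A) as N_A / N for the event A = "a die roll is even", using simulated rolls of a fair six-sided die.

import numpy as np

rng = np.random.default_rng(2)

# N observations of a fair six-sided die (simulated here for illustration).
N = 10_000
rolls = rng.integers(1, 7, size=N)

# Event A: the outcome is even. N_A is the number of observed outcomes in A.
N_A = np.count_nonzero(rolls % 2 == 0)
print("P(A) estimate:", N_A / N)   # should be close to 0.5

# The relative frequencies of the six outcomes sum to one.
frequencies = np.array([np.mean(rolls == k) for k in range(1, 7)])
print("sum over all outcomes:", frequencies.sum())  # 1.0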
Discrete Random Variables
A discrete random variable X takes values from a finite set of N numbers or symbols {x_1, x_2, ..., x_N}. Each outcome x_i can then be assigned a probability of occurrence. An example is tossing a coin, where the outcome X can be either H or T, and the probability of each outcome, P(X = H) or P(X = T), is 0.5.
The probability that a discrete-valued random variable X takes on a value x_i, P(X = x_i), is called the probability mass function, P_X(x_i). For two random variables X and Y, the probability of an outcome in which X takes on a value x_i and Y takes on a value y_j is called the joint probability mass function, P_{X,Y}(x_i, y_j). The joint probability mass function can be described in terms of the conditional and marginal probability mass functions as

P_{X,Y}(x_i, y_j) = P_{Y|X}(y_j | x_i) P_X(x_i)

where P_{Y|X}(y_j | x_i) is the conditional probability of the random variable Y taking on the value y_j conditioned on the variable X having taken the value x_i. The marginal probability mass function of X can be obtained as

P_X(x_i) = Σ_{j=1}^{M} P_{X,Y}(x_i, y_j)

where M is the number of values, or outcomes, in the space of the discrete random variable Y.
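These relations can be checked numerically. The Python sketch below (an illustration with an assumed joint PMF table, not taken from the source) builds P_{X,Y} for a discrete X with two values and a discrete Y with M = 3 values, then recovers the marginal PMF of X and the conditional PMF of Y given X.

import numpy as np

# Assumed joint PMF: P_XY[i, j] = P(X = x_i, Y = y_j) for 2 values of X and 3 of Y.
P_XY = np.array([[0.10, 0.20, 0.10],
                 [0.15, 0.25, 0.20]])
assert np.isclose(P_XY.sum(), 1.0)  # all probabilities sum to one

# Marginal PMF of X: P_X(x_i) = sum over the M outcomes y_j of P_XY(x_i, y_j).
P_X = P_XY.sum(axis=1)

# Conditional PMF of Y given X: P_Y|X(y_j | x_i) = P_XY(x_i, y_j) / P_X(x_i).
P_Y_given_X = P_XY / P_X[:, np.newaxis]

# Check the factorisation P_XY(x_i, y_j) = P_Y|X(y_j | x_i) * P_X(x_i).
assert np.allclose(P_XY, P_Y_given_X * P_X[:, np.newaxis])
print("P_X =", P_X)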
