CH 1

The document discusses stochastic modeling and various discrete probability distributions. It introduces stochastic processes and defines random variables. It then covers events, probabilities, and conditional probabilities. The major discrete distributions covered are the Bernoulli, binomial, geometric, negative binomial, and Poisson distributions.

Uploaded by Win Myo

Chapter I

Introduction
Stochastic Modeling
The word "stochastic" means "random" or "chance".
 A stochastic model predicts a set of possible outcomes
weighted by their likelihoods, or probabilities.
1.1. Stochastic Processes
A stochastic process is a family of random variables X_t, where t is a
parameter running over a suitable index set T. (We write X_t instead of
X(t).) In a common situation, the index t corresponds to discrete units
of time, and the index set is T = {0, 1, 2, ...}. In this case, X_t might
represent the outcomes at successive tosses of a coin, repeated responses
of a subject in a learning experiment, or successive observations of some
characteristics of a certain population.
Stochastic processes are distinguished by their state space, or the
range of possible values for the random variables X_t, by their index
set T, and by the dependence relations among the random variables X_t.
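The coin-tossing case above can be made concrete with a short simulation. This is a minimal sketch; the function name and the choice of 0/1 coding are illustrative, not from the original text.

```python
import random

def coin_toss_process(n_steps, p=0.5, seed=0):
    """Simulate one realization X_0, X_1, ..., X_{n_steps-1} of a discrete-time
    process: independent coin tosses, X_t = 1 (heads) with probability p, else 0."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n_steps)]

path = coin_toss_process(10)
print(path)  # one sample path of the process: a list of ten 0/1 outcomes
```

Here the state space is {0, 1} and the index set is T = {0, 1, ..., 9}; a different seed yields a different realization of the same process.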
2. Probability Review
2.1. Events and Probabilities
Let A and B be events. The event that at least one of A or B occurs
is called the union of A and B and is written A ∪ B; the event that both
occur is called the intersection of A and B and is written A ∩ B, or
simply AB. This notation extends to finite and countable sequences
of events. Given events A_1, A_2, ...,
the event that at least one occurs is written A_1 ∪ A_2 ∪ ...,
the event that all occur is written A_1 ∩ A_2 ∩ ....
  
The probability of an event A is written Pr{A}. The certain event,
denoted by Ω, always occurs, and Pr{Ω} = 1. The impossible event,
denoted by ∅, never occurs, and Pr{∅} = 0. It is always the case that
0 ≤ Pr{A} ≤ 1 for any event A. Events A and B are said to be disjoint
if A ∩ B = ∅; that is, if A and B cannot both occur. For disjoint events
A and B,

Pr{A ∪ B} = Pr{A} + Pr{B}.

The addition law
Let A_1, A_2, ... be events with A_i and A_j disjoint whenever i ≠ j. Then

Pr{A_1 ∪ A_2 ∪ ...} = Pr{A_1} + Pr{A_2} + ....
The law of total probability
Let B_1, ..., B_n be disjoint events for which B_1 ∪ B_2 ∪ ... ∪ B_n = Ω.
Equivalently, exactly one of the events B_1, ..., B_n will occur. The law
of total probability asserts that

Pr{A} = Pr{A ∩ B_1} + ... + Pr{A ∩ B_n}  for any event A.

Events A and B are said to be independent if

Pr{A ∩ B} = Pr{A} Pr{B}.

Events A_1, A_2, ... are independent if

Pr{A_{i_1} ∩ A_{i_2} ∩ ... ∩ A_{i_k}} = Pr{A_{i_1}} Pr{A_{i_2}} ... Pr{A_{i_k}}

for every finite set of distinct indices i_1, i_2, ..., i_k.


2.2. Random Variables
A random variable is a variable that takes on its values by chance.
We use uppercase letters such as X to denote random variables, and
lowercase letters such as x for real numbers.
The expression {X ≤ x} is the event that the random variable X assumes
a value that is less than or equal to the real number x. This event may
or may not occur, depending on the outcome of the experiment or
phenomenon that determines the value for the random variable X.
The probability that the event occurs is written Pr{X ≤ x}. Allowing x to
vary, this probability defines a function

F(x) = Pr{X ≤ x},  −∞ < x < ∞,

called the distribution function of the random variable X.
2.7. Conditional Probability
For any events A and B, the conditional probability of A given B is
written Pr{A|B} and defined by

Pr{A|B} = Pr{A ∩ B} / Pr{B}  if Pr{B} > 0,  (2.16)

and is left undefined if Pr{B} = 0.
The multiplicative form

Pr{A ∩ B} = Pr{A|B} Pr{B}  (2.17)

is used to compute other probabilities.

Substituting (2.17) into the law of total probability

Pr{A} = Pr{A ∩ B_1} + ... + Pr{A ∩ B_n},

where B_1, ..., B_n are disjoint and B_1 ∪ ... ∪ B_n = Ω, and where we
set Pr{A|B_i} Pr{B_i} = 0 if Pr{B_i} = 0, yields

Pr{A} = Pr{A|B_1} Pr{B_1} + ... + Pr{A|B_n} Pr{B_n}.
Example. Gold and silver coins are allocated among three urns labeled
I, II, III according to the following table:

Urn   Number of Gold Coins   Number of Silver Coins
I              4                        8
II             3                        9
III            6                        6

An urn is selected at random, all urns being equally likely, and then a
coin is selected at random from that urn. Using the notation I, II, III for
the events of selecting urns I, II, and III, respectively, and G for the
event of selecting a gold coin, the problem description provides the
following probabilities and conditional probabilities as data:

Pr{I} = Pr{II} = Pr{III} = 1/3,
Pr{G|I} = 4/12,  Pr{G|II} = 3/12,  Pr{G|III} = 6/12.

By the law of total probability, the probability of selecting a gold coin is

Pr{G} = Pr{G|I} Pr{I} + Pr{G|II} Pr{II} + Pr{G|III} Pr{III}
      = (4/12)(1/3) + (3/12)(1/3) + (6/12)(1/3) = 13/36.

3. The Major Discrete Distributions
3.1. Bernoulli Distribution
A random variable X following the Bernoulli distribution
with parameter p has only two possible values, 0 and 1, and
the probability mass function is

Pr{X = 1} = p  and  Pr{X = 0} = q,  where q = 1 − p and 0 < p < 1.

The mean and variance are

E[X] = p  and  Var[X] = p(1 − p) = pq.
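These moment formulas can be verified empirically by sampling. A minimal sketch (the function name and the choice p = 0.3 are illustrative):

```python
import random

def bernoulli_sample(p, n, seed=42):
    """Draw n independent Bernoulli(p) values: 1 with probability p, else 0."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n)]

p = 0.3
xs = bernoulli_sample(p, 100_000)
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
print(round(mean, 2), round(var, 2))  # close to p = 0.3 and pq = 0.21
```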
  
3.2. Binomial Distribution
Consider n independent events A_1, ..., A_n all having the same
probability p of occurrence. Let X count the total number of
events among A_1, ..., A_n that occur. Then X has a binomial distribution
with parameters n and p. The probability mass function is

Pr{X = k} = (n choose k) p^k (1 − p)^(n−k)  for k = 0, 1, ..., n.

The mean and variance are

E[X] = np  and  Var[X] = np(1 − p).
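The pmf can be computed directly and checked against these facts: the probabilities sum to one, and the mean is np. A minimal sketch assuming illustrative values n = 10, p = 0.25:

```python
from math import comb

def binomial_pmf(k, n, p):
    """Pr{X = k} = C(n, k) p^k (1 - p)^(n - k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.25
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]
total = sum(pmf)                                  # should be 1
mean = sum(k * pk for k, pk in enumerate(pmf))    # should be np = 2.5
print(round(total, 6), round(mean, 6))  # 1.0 2.5
```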
3.3. Geometric and Negative Binomial Distributions
Let A_1, A_2, ... be independent events having a common probability p of
occurrence. Say that trial i is a success (S) or failure (F) according as
A_i occurs or not, and let Z count the number of failures prior to the first
success. Then Z has a geometric distribution with parameter p. The
probability mass function is

Pr{Z = k} = p(1 − p)^k  for k = 0, 1, 2, ...,

and the first two moments are

E[Z] = (1 − p)/p  and  Var[Z] = (1 − p)/p^2.

Sometimes the term "geometric distribution" is used in referring to the
probability mass function

Pr{Z = k} = p(1 − p)^(k−1)  for k = 1, 2, ...,

the distribution of the number of trials up to and including the first success.
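The failures-before-first-success description translates directly into a simulation, which can be checked against the mean formula E[Z] = (1 − p)/p. A minimal sketch with an illustrative p = 0.4:

```python
import random

def geometric_failures(rng, p):
    """Count failures before the first success in repeated Bernoulli(p) trials."""
    k = 0
    while rng.random() >= p:  # each miss before the first success is a failure
        k += 1
    return k

rng = random.Random(7)
p = 0.4
samples = [geometric_failures(rng, p) for _ in range(50_000)]
mean = sum(samples) / len(samples)
print(round(mean, 2))  # close to (1 - p)/p = 1.5
```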
3.4. The Poisson Distribution
The Poisson distribution with parameter λ > 0 has the probability mass
function

Pr{X = k} = λ^k e^(−λ) / k!  for k = 0, 1, 2, ....

Using the series expansion

e^λ = 1 + λ + λ^2/2! + λ^3/3! + ...,

the probabilities are seen to sum to one.

The mean and variance are the same:

E[X] = λ  and  Var[X] = λ.

The binomial distribution with parameters n and p converges to the
Poisson with parameter λ if n → ∞ and p → 0 in such a way that λ = np
remains constant.
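The convergence statement above can be illustrated numerically: holding np fixed while n grows, the largest gap between the binomial and Poisson pmfs shrinks. A minimal sketch (the choice λ = 2 and the range k = 0..9 are illustrative):

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    return lam**k * exp(-lam) / factorial(k)

lam = 2.0
gaps = {}
for n in (10, 100, 1000):
    p = lam / n  # keep np = 2 constant while n grows and p shrinks
    gaps[n] = max(abs(binom_pmf(k, n, p) - poisson_pmf(k, lam)) for k in range(10))
    print(n, round(gaps[n], 5))  # the gap decreases as n increases
```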