Applied Probability and Stochastic Processes
Second Edition

This book addresses applications in science, engineering, finance, computer science, and
operations research. It covers the theoretical foundations for modeling
time-dependent random phenomena in these areas and illustrates
applications through the analysis of numerous practical examples.

New to the Second Edition
• Completely rewritten part on probability theory—now more than
double in size
• New sections on time series analysis, random walks, branching
processes, and spectral analysis of stationary stochastic
processes
• Comprehensive numerical discussions of examples, which
replace the more theoretically challenging sections
• Additional examples, exercises, and figures
Presenting the material in a reader-friendly, application-oriented
manner, the author draws on his 50 years of experience in the field to
give readers a better understanding of probability theory and stochastic
processes and enable them to use stochastic modeling in their work.
Many exercises allow readers to assess their understanding of the
topics. In addition, the book occasionally describes connections
between probabilistic concepts and corresponding statistical
approaches to facilitate comprehension. Some important proofs
and challenging examples and exercises are also included for more
theoretically interested readers.
Frank Beichelt
University of the Witwatersrand
Johannesburg, South Africa
CRC Press
Taylor & Francis Group
6000 Broken Sound Parkway NW, Suite 300
Boca Raton, FL 33487-2742
© 2016 by Taylor & Francis Group, LLC
CRC Press is an imprint of Taylor & Francis Group, an Informa business
This book contains information obtained from authentic and highly regarded sources. Reasonable
efforts have been made to publish reliable data and information, but the author and publisher cannot
assume responsibility for the validity of all materials or the consequences of their use. The authors and
publishers have attempted to trace the copyright holders of all material reproduced in this publication
and apologize to copyright holders if permission to publish in this form has not been obtained. If any
copyright material has not been acknowledged please write and let us know so we may rectify in any
future reprint.
Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced,
transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or
hereafter invented, including photocopying, microfilming, and recording, or in any information
storage or retrieval system, without written permission from the publishers.
For permission to photocopy or use material electronically from this work, please access
www.copyright.com (http://www.copyright.com/) or contact the Copyright Clearance Center,
Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit
organization that provides licenses and registration for a variety of users. For organizations
that have been granted a photocopy license by the CCC, a separate system of payment has been arranged.
Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are
used only for identification and explanation without intent to infringe.
Visit the Taylor & Francis Web site at
http://www.taylorandfrancis.com
and the CRC Press Web site at
http://www.crcpress.com
CONTENTS
PREFACE
SYMBOLS AND ABBREVIATIONS
INTRODUCTION
10 MARTINGALES
10.1 DISCRETE-TIME MARTINGALES 475
10.1.1 Definition and Examples 475
10.1.2 Doob-Type Martingales 479
10.1.3 Martingale Stopping Theorem and Applications 486
10.2 CONTINUOUS-TIME MARTINGALES 489
10.3 EXERCISES 492
11 BROWNIAN MOTION
11.1 INTRODUCTION 495
11.2 PROPERTIES OF THE BROWNIAN MOTION 497
11.3 MULTIDIMENSIONAL AND CONDITIONAL DISTRIBUTIONS 501
11.4 FIRST PASSAGE TIMES 504
11.5 TRANSFORMATIONS OF THE BROWNIAN MOTION 508
11.5.1 Identical Transformations 508
11.5.2 Reflected Brownian Motion 509
11.5.3 Geometric Brownian Motion 510
11.5.4 Ornstein-Uhlenbeck Process 511
11.5.5 Brownian Motion with Drift 512
11.5.5.1 Definitions and First Passage Times 512
11.5.5.2 Application to Option Pricing 516
11.5.5.3 Application to Maintenance 522
11.5.6 Integrated Brownian Motion 524
11.6 EXERCISES 526
REFERENCES 549
INDEX 553
PREFACE TO THE SECOND EDITION
Probability Theory
X, Y, Z                    random variables
E(X), Var(X)               mean (expected) value of X, variance of X
f_X(x), F_X(x)             probability density function, (cumulative probability) distribution function of X
F_Y(y|x), f_Y(y|x)         conditional distribution function, density of Y given X = x
X_t, F_t(x)                residual lifetime of a system of age t, distribution function of X_t
E(Y|x)                     conditional mean value of Y given X = x
λ(x), Λ(x)                 failure rate, integrated failure rate (hazard function)
N(μ, σ²)                   normally distributed random variable (normal distribution) with mean value μ and variance σ²
ϕ(x), Φ(x)                 probability density function, distribution function of a standard normal random variable N(0, 1)
f_X(x_1, x_2, ..., x_n)    joint probability density function of X = (X_1, X_2, ..., X_n)
F_X(x_1, x_2, ..., x_n)    joint distribution function of X = (X_1, X_2, ..., X_n)
Cov(X, Y), ρ(X, Y)         covariance, correlation coefficient between X and Y
M(z)                       z-transform (moment generating function) of a discrete random variable or of its probability distribution, respectively
Stochastic Processes
{X(t), t ∈ T}, {X_t, t ∈ T}   continuous-time, discrete-time stochastic process with parameter space T
Z                          state space of a stochastic process
f_t(x), F_t(x)             probability density, distribution function of X(t)
f_{t_1,...,t_n}(x_1, ..., x_n), F_{t_1,...,t_n}(x_1, ..., x_n)
                           joint density, distribution function of (X(t_1), X(t_2), ..., X(t_n))
m(t)                       trend function of a stochastic process
C(s, t)                    covariance function of a stochastic process
C(τ)                       covariance function of a stationary stochastic process
C(t), {C(t), t ≥ 0}        compound random variable, compound stochastic process
ρ(s, t)                    correlation function of a stochastic process
{T_1, T_2, ...}            random point process
{Y_1, Y_2, ...}            sequence of interarrival times, renewal process
N                          integer-valued random variable, discrete stopping time
{N(t), t ≥ 0}              (random) counting process
N(s, t)                    increment of a counting process in (s, t]
H(t), H_1(t)               renewal function of an ordinary, delayed renewal process
A(t)                       forward recurrence time, point availability
B(t)                       backward recurrence time
R(t), {R(t), t ≥ 0}        risk reserve, risk reserve process
A, A(t)                    stationary (long-run) availability, point availability
p_ij, p_ij^(n)             one-step, n-step transition probabilities of a homogeneous, discrete-time Markov chain
p_ij(t); q_ij, q_i         transition probabilities; conditional, unconditional transition rates of a homogeneous, continuous-time Markov chain
{π_i; i ∈ Z}               stationary state distribution of a homogeneous Markov chain
π_0                        extinction probability, vacant probability (sections 8.5, 9.7)
λ_j, μ_j                   birth, death rates
λ, μ, ρ                    arrival rate, service rate, traffic intensity λ/μ (in queueing models)
μ_i                        mean sojourn time of a semi-Markov process in state i
μ                          drift parameter of a Brownian motion process with drift
W                          waiting time in a queueing system
L                          lifetime, cycle length, queue length, continuous stopping time
L(x)                       first-passage time with regard to level x
L(a, b)                    first-passage time with regard to level min(a, b)
{B(t), t ≥ 0}              Brownian motion (process)
σ², σ                      variance parameter σ² = Var(B(1)), volatility
{S(t), t ≥ 0}              seasonal component of a time series (section 6.4), standardized Brownian motion (chapter 11)
{B(t), 0 ≤ t ≤ 1}          Brownian bridge
{D(t), t ≥ 0}              Brownian motion with drift
M(t)                       absolute maximum of the Brownian motion (with drift) in [0, t]
M                          absolute maximum of the Brownian motion (with drift) in [0, ∞)
{U(t), t ≥ 0}              Ornstein-Uhlenbeck process, integrated Brownian motion process
ω, w                       circular frequency, bandwidth
s(ω), S(ω)                 spectral density, spectral function (chapter 12)
Introduction
Is the world a well-ordered entirety,
or a random mixture,
which nevertheless is called world-order?
Marcus Aurelius
Random influences or phenomena occur everywhere in nature and social life. Taking
them into account is an indispensable requirement for success in the natural,
economic, social, and engineering sciences. Random influences partially or fully
contribute to the variability of parameters like wind velocity, rainfall intensity,
electromagnetic noise levels, fluctuations of share prices, failure time points of
technical units, the time points of births and deaths in biological populations, of
earthquakes, or of arrivals of customers at service centers. Random influences induce
random events.
An event is called random if, under given conditions, it may or may not occur. For
instance, the following events are random: during a thunderstorm a certain house is
struck by lightning; a child reaches adulthood; at least one shooting star appears in
a specified time interval; a production process comes to a standstill for lack of
material; a cancer patient survives five years after chemotherapy. Border cases of
random events are the deterministic events, namely the certain event and the
impossible event. Under given conditions, the certain (impossible) event will always
(never) occur. For instance, it is absolutely sure that lead, when heated to a
temperature of over 327.5 °C, will become liquid, but it is impossible that lead will
turn to gold during the heating process. The shape liquid lead assumes when poured
onto a flat steel plate is random, and so is the occurrence of the events which
fortune-tellers predict from the form of these castings. Even if the reader is not a
lottery, card, or dice player, she/he will be confronted in her/his daily routine
with random influences
and must take into account their implications: When your old coffee machine fails
after an unpredictable number of days, you go to the supermarket and pick a new one
from the machines of your favorite brand. At home, when trying to make your first
cup of coffee, you realize that you belong to the few unlucky ones who picked by
chance a faulty machine. A car driver, when estimating the duration of the trip to his
destination, has to take into account that his vehicle may start only with delay, that
a traffic jam could slow down his progress, and that scarce parking opportunities may
cause further delay. Also, at the end of a year the overwhelming majority of car
drivers realize that having taken out a policy has only enriched the insurance
company. Nevertheless, they will renew their policy, because people tend to prefer
moderate regular costs, even if these accrue over the long term, to the risk of large
unscheduled costs.
Hence it is not surprising that insurance companies belonged to the first institutions
that had a direct practical interest in making use of methods for the quantitative
evaluation of random influences, and they in turn gave important impulses for the
development of such methods. It is probability theory which provides the necessary
mathematical tools for their work.
Probability theory deals with the investigation of regularities random events are
subjected to.
The existence of such statistical or stochastic regularities may come as a surprise to
philosophically less educated readers, since at first glance it seems paradoxical
to combine regularity and randomness. But even without philosophy and without
probability theory, some simple regularities can already be illustrated at this stage:
1) When a fair die is thrown once, one of the integers from 1 to 6 will appear,
and no regularity can be observed. But if the die is thrown repeatedly, then the
fraction of throws with outcome 1, say, will stabilize, and with increasing number of
throws this fraction will converge to the value 1/6. (A die is called fair if each
integer has the same chance of appearing.)
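This regularity is easy to check by simulation. The sketch below throws a virtual fair die and prints the relative frequency of the outcome 1; the seed and the sample sizes are arbitrary illustrative choices, not taken from the text:

```python
import random

random.seed(1)  # arbitrary seed, chosen only for reproducibility

# Throw a fair virtual die n times and record the relative frequency of outcome 1.
for n in (60, 6_000, 600_000):
    ones = sum(1 for _ in range(n) if random.randint(1, 6) == 1)
    print(f"n = {n:>7}: relative frequency of '1' = {ones / n:.4f}")
```

With increasing n, the printed frequencies settle ever closer to 1/6 ≈ 0.1667.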
2) If a specific atom of a radioactive substance is observed, then the time from the
beginning of its observation to its disintegration cannot be predicted with certainty,
i.e., this time is random. On the other hand, one knows the half-life period of a
radioactive substance, i.e., one can predict with absolute certainty after which time,
from say originally 10 grams (trillions of atoms) of the substance, exactly 5 grams
are left.
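Both halves of this observation can be illustrated by a small simulation: each individual lifetime is unpredictable, yet the fraction of atoms surviving one half-life is almost exactly 1/2. The half-life of 5 time units, the atom count, and the seed below are arbitrary illustrative choices:

```python
import math
import random

random.seed(2)                  # arbitrary seed for reproducibility
half_life = 5.0                 # illustrative half-life (time units)
rate = math.log(2) / half_life  # disintegration rate of a single atom
n_atoms = 100_000               # illustrative number of observed atoms

# The lifetime of each single atom is exponentially distributed, hence random ...
lifetimes = [random.expovariate(rate) for _ in range(n_atoms)]

# ... yet the fraction still intact after one half-life is almost exactly 1/2.
surviving = sum(1 for t in lifetimes if t > half_life) / n_atoms
print(f"fraction surviving one half-life: {surviving:.3f}")
```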
3) Random influences can also take effect by superimposing themselves on purely
deterministic processes. A simple example is the measurement of a physical parameter,
e.g., the temperature. There is nothing random about this parameter when it refers to
a specific location at a specific time. However, when this parameter has to be
measured with sufficiently high accuracy, then, even under identical measurement
conditions, different measurements will usually yield different values. This is due,
for example, to the degree of inaccuracy inherent in every measuring method, and to
subjective factors. A statistical regularity in this situation is that, with an
increasing number of measurements carried out independently and not biased by
systematic errors, the arithmetic mean of these measurements converges towards the
true temperature.
4) Consider the movement of a tiny particle in a container filled with a liquid. It
moves along zig-zag paths in an apparently chaotic motion. This motion is generated
by the huge number of impacts the particle is exposed to from the surrounding
molecules of the fluid. Under average conditions, there are about 10²¹ collisions per
second between the particle and the molecules. Hence, a deterministic approach to
modeling the motion of particles in a fluid is impossible; this movement has to be
dealt with as a random phenomenon. But the pressure within the container, generated
by the vast number of impacts of fluid molecules on the sidewalls of the container,
is constant.
Examples 1 to 4 show the nature of a large class of statistical regularities:
The superposition of a large number of random influences leads under certain
conditions to deterministic phenomena.
The following classical data concern the experiment of flipping a coin n times and
counting the number m of occurrences of the event 'head':

Scientist    n        m        m/n
Buffon       4040     2048     0.5080
Pearson      12000    6019     0.5016
Pearson      24000    12012    0.5005
Thus, the more frequently a coin is flipped, the closer the ratio m/n approaches the
value 1/2 (compare with example 1 above). In view of the large number of flippings,
this observation is surely not a random result, and it can be confirmed by all those
readers who take pleasure in repeating these experiments. Nowadays, however, the
experiment of 'flipping a coin' many thousands of times is done by a computer with a
'virtual coin' in a few seconds. The ratio m/n is called the relative frequency of
the occurrence of the random event 'head appears.'
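Such a computer experiment takes only a few lines. The sketch below flips a 'virtual coin' with the sample sizes used by Buffon and Pearson plus a larger one; the seed is an arbitrary choice, so the counts m will differ from the historical ones:

```python
import random

random.seed(20)  # arbitrary seed; other seeds give slightly different counts

# Flip a virtual fair coin n times, count heads (m), and print the relative frequency m/n.
for n in (4040, 12000, 24000, 1_000_000):
    m = sum(random.randint(0, 1) for _ in range(n))
    print(f"n = {n:>9}, m = {m:>7}, m/n = {m / n:.4f}")
```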
The expositions made so far may already have convinced many readers that random
phenomena are not figments of human imagination, but that their existence is
objective reality. There have been attempts to deny the existence of random phenomena
by arguing that if all factors and circumstances which influence the occurrence of an
event are known, then an absolutely sure prediction of its occurrence is possible. In
other words, the protagonists of this thesis consider the concept of randomness only
a sign of 'human imperfection.' The young Pierre Simon Laplace (1749 − 1827) believed
that the world is governed down to the last detail by deterministic laws. Two of his
famous statements concerning this are: 'The curve described by a simple molecule of
air in any gas is regulated in a manner as certain as the planetary orbits. The only
difference between them lies in our ignorance.' And: 'Give me all the necessary data,
and I will tell you the exact position of a ball on a billiard table' (after it has
been pushed). However, this view has proved futile both from the philosophical and
the practical point of view. Consider, for instance, a
biologist who is interested in the movement of animals in the wilderness. How on
earth is he supposed to be in a position to collect all that information, which would
allow him to predict the movements of only one animal in a given time interval with
absolute accuracy? Or imagine the amount of information you would need, and the
corresponding software, to determine the exact path of a particle which travels in a
fluid, when there are 10²¹ collisions with surrounding molecules per second. It is an
unrealistic and impossible task to deal with problems like that in a deterministic way.
The physicist Marian von Smoluchowski (1872 − 1917) wrote in a paper published in
1918 that 'all theories are inadequate, which consider randomness as an unknown
partial cause of an event. The chance of the occurrence of an event can only depend
on the conditions, which have influence on the event, but not on the degree of our
knowledge.'
Already at a very early stage of dealing with random phenomena, the need arose to
quantify the chance, the degree of certainty, or the likelihood of the occurrence of
random events. This has been done by defining the probability of random events and
by developing methods for its calculation. For now, the following explanation is
given: The probability of a random event is a number between 0 and 1. The impossible
event has probability 0, and the certain event has probability 1. The more frequently
a random event occurs, the closer its probability is to 1. Thus, if in a long series
of experiments a random event A occurs more frequently than a random event B, then A
has a larger probability than B. In this way, assigning probabilities to random
events allows comparisons with regard to the frequency of their occurrence under
identical conditions. There are other approaches to the definition of probability
than the classical (frequency) approach, to which this explanation refers. For
beginners, the frequency approach is likely the most comprehensible one.
Gamblers, in particular dice gamblers, were likely the first people who needed
methods for comparing the chances of the occurrence of random events, i.e., the
chances of winning or losing. Already in the medieval poem De Vetula of Richard de
Fournival (ca. 1200 − 1250) one can find a detailed discussion of the total number
of possibilities to achieve a certain number when throwing 3 dice. Gerolamo
Cardano (1501 − 1576) determined in his book Liber de Ludo Aleae the number of
possibilities to achieve the total outcomes 2, 3, ..., 12 when two dice are thrown.
For instance, there are two possibilities to achieve the outcome 3, namely (1,2) and
(2,1), whereas a 2 is achieved only when (1,1) occurs. (The notation (i, j) means
that one die shows an i and the other one a j.) Galileo Galilei (1564 − 1642) proved
by analogous reasoning that, when throwing 3 dice, the probability of getting the
(total) outcome 10 is larger than the probability of getting a 9. The gamblers knew
this from their experience, and they had asked Galilei to find a mathematical proof. The
Chevalier de Méré formulated three problems related to games of chance and asked
the French mathematician Blaise Pascal (1623 − 1662) for solutions:
1) What is more likely: to obtain at least one 6 when throwing a die four times, or
to obtain the outcome (6,6) at least once in a series of 24 throws of two dice?
2) How many times does one have to throw two dice at least, so that the probability
of achieving the outcome (6,6) is larger than 1/2?
3) In a game of chance, two equivalent gamblers each need a certain number of points
to become winners. How is the stake to be fairly divided between the gamblers when,
for some reason or other, the game has to be prematurely broken off? (This problem of
the fair division had already been formulated before de Méré, e.g., in the De Vetula.)
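The counting arguments of Cardano and Galilei, as well as de Méré's first two problems, reduce to elementary enumeration and complement arguments. The following script (a sketch, not from the book) verifies them:

```python
from itertools import product

# Cardano: number of combinations of two dice giving each total.
two_dice = [i + j for i, j in product(range(1, 7), repeat=2)]
print("ways to get 3:", two_dice.count(3))      # (1,2) and (2,1): 2
print("ways to get 2:", two_dice.count(2))      # only (1,1): 1

# Galilei: with three dice, a total of 10 is more likely than 9.
three_dice = [sum(t) for t in product(range(1, 7), repeat=3)]
print("ways to get 10:", three_dice.count(10))  # 27 of 216
print("ways to get 9: ", three_dice.count(9))   # 25 of 216

# De Mere, problem 1: at least one 6 in 4 throws of one die versus
# at least one (6,6) in 24 throws of two dice.
p_one_six = 1 - (5 / 6) ** 4        # about 0.5177
p_double_six = 1 - (35 / 36) ** 24  # about 0.4914
print(f"{p_one_six:.4f} > {p_double_six:.4f}")

# De Mere, problem 2: smallest number of throws of two dice for which
# the probability of at least one (6,6) exceeds 1/2.
n = 1
while 1 - (35 / 36) ** n <= 0.5:
    n += 1
print("smallest n:", n)             # 25
```

So throwing one die four times gives the better chance, and 25 throws of two dice are needed before a double six becomes more likely than not; the third problem, the problem of points, requires the division arguments developed by Pascal and Fermat.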
Pascal sent these problems to Pierre Fermat (1601 − 1665), and both found their
solutions, although by applying different methods. It is generally accepted that this
work of Pascal and Fermat marked the beginning of the development of probability
theory as a mathematical discipline. Their work was continued by famous scientists
such as Christiaan Huygens (1629 − 1695), Jakob Bernoulli (1654 − 1705), Abraham de
Moivre (1667 − 1754), Carl Friedrich Gauss (1777 − 1855), and last but not least
Siméon Denis Poisson (1781 − 1840). However, probability theory only grew out of its
infancy in the 1930s, when the Russian mathematician Andrei Nikolaevich Kolmogorov
(1903 − 1987) solved one of the famous Hilbert problems, namely to put probability
theory, like any other mathematical discipline, on an axiomatic foundation.
Nowadays, probability theory, together with its applications in science, medicine,
engineering, economics, and other fields, is integrated into the field of stochastics.
The linguistic origin of this term can be found in the Greek word stochastikon.
(Originally, this term denoted the ability of seers to be correct with their
forecasts.) Apart from probability theory, mathematical statistics is the most
important part of stochastics. A key subject of mathematical statistics is to infer,
by probabilistic methods, from a sample taken from a set of interesting objects,
called among other names the population or universe, to parameters or properties of
that population (inferential statistics). Let us assume we have a lot of 10 000
electronic units. To obtain information on what percentage of these units is faulty,
we take a sample of 100 units from this lot. In the sample, 4 units are faulty. Of
course, this figure does not imply that there are exactly 400 faulty units in the
lot. But inferential statistics will enable us to construct lower and upper bounds
for the percentage of faulty units in the lot, which bracket the 'true percentage'
with a given high probability. Problems like this led to the development of an
important part of mathematical statistics, statistical quality control.

Phenomena which depend on both random and deterministic influences gave rise to the
theory of stochastic processes. For instance, meteorological parameters like
temperature and air pressure are random, but obviously also depend on time and
altitude. Fluctuations of share prices are governed by chance, but are also driven by
periods of economic upturns and downturns. Electromagnetic noise caused by the sun is
random, but also depends on the periodic variation of the intensity of sunspots.
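The lot-sampling calculation described above can be sketched with a simple normal-approximation confidence interval. This is one common method among several; the 95% level and z ≈ 1.96 are illustrative choices, not taken from the text:

```python
import math

n, k = 100, 4            # sample size and number of faulty units found
p_hat = k / n            # point estimate of the faulty fraction
z = 1.96                 # approximate 97.5% quantile of N(0, 1), for a 95% interval

# Normal-approximation (Wald) bounds for the unknown percentage of faulty units.
half_width = z * math.sqrt(p_hat * (1 - p_hat) / n)
lower = max(0.0, p_hat - half_width)
upper = min(1.0, p_hat + half_width)
print(f"estimated faulty fraction: {p_hat:.2%}")
print(f"approximate 95% bounds: [{lower:.2%}, {upper:.2%}]")
```

With 4 faulty units in a sample of 100, the bounds come out near 0.2% and 7.8%, so the lot may well contain far more, or far fewer, than 400 faulty units; for such a small count, exact (Clopper-Pearson) bounds would be somewhat wider.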
Stochastic modeling in operations research comprises disciplines like queueing
theory, reliability theory, inventory theory, and decision theory. All of them play an
important role in applications, but also have given many impulses for the theoretical
enhancement of the field of stochastics. Queueing theory provides the theoretical
foundation for the quantitative evaluation and optimization of queueing systems, i.e.,
service systems like workshops, supermarkets, computer networks, filling stations,
car parks, and junctions, but also military defense systems for 'serving' the enemy.
Inventory theory helps with designing warehouses (storerooms) so that they can on
the one hand meet the demand for goods with sufficiently high probability, and on
the other hand keep the costs for storage as small as possible. The key problem with
dimensioning queueing systems and storage capacities is that flows of customers,
service times, demands, and delivery times of goods after ordering are subject to
random influences. A main problem of reliability theory is the calculation of the
reliability (survival probability, availability) of a system from the reliabilities of its
subsystems or components. Another important subject of reliability theory is modeling
the aging behavior of technical systems, which incidentally provides tools for
the survival analysis of human beings and other living beings. Chess automata got
their intelligence from game theory, which arose from the abstraction of games of
chance. But opponents within this theory can also be competing economic blocs or
military enemies. Modern communication would be impossible without information
theory. This theory provides the mathematical foundations for a reliable transmission
of information although signals may be subject to noise at the transmitter, during
transmission, and at the receiver. In order to verify stochastic regularities, nowadays
no scientist needs to manually repeat thousands of experiments. Computers do this
job much more efficiently. They are in a position to virtually replicate the operation
of even highly complex systems, which are subjected to random influences, to any
degree of accuracy. This process is called (Monte Carlo) simulation. Many more
fruitful applications of stochastic (probabilistic) methods exist in fields like
physics (kinetic gas theory, thermodynamics, quantum theory), astronomy (stellar
statistics), biology (genetics, genomics, population dynamics), artificial
intelligence (inference under uncertainty), medicine, agronomy and forestry (design
of experiments, yield prediction), as well as in economics (time series analysis) and
the social sciences. There is no doubt that probabilistic methods will open more and more
possibilities for applications, which in turn will lead to a further enhancement of the
field of stochastics.
More than 300 years ago, the famous Swiss mathematician Jakob Bernoulli proposed in
his book Ars Conjectandi the recognition of stochastics as an independent new
science, the subject of which he introduced as follows:
To conjecture about something is to measure its probability: The Art of conjecturing
or the Stochastic Art is therefore defined as the art of measuring as exactly as possi-
ble the probability of things so that in our judgement and actions we always can
choose or follow that which seems to be better, more satisfactory, safer and more
considered.
In line with Bernoulli's proposal, an independent science of stochastics would have
to be characterized by two features:
1) The subject of stochastics is uncertainty caused by randomness and/or ignorance.
2) Its methods, concepts, and language are based on mathematics.
But even now, in the twenty-first century, an independent science of stochastics is
still far away from being officially established. There is, however, powerful
support for such a move by internationally leading academics; see von Collani (2003).
PART I
Probability Theory
There is no credibility in sciences in which
no mathematical theory can be applied,
and no credibility in fields which have no
connections to mathematics.
Leonardo da Vinci
CHAPTER 1
Random Events and Their Probabilities
1.1 RANDOM EXPERIMENTS
If water is heated up to 100 °C at an air pressure of 101 325 Pa, then it will
inevitably start boiling. A motionless pendulum, when pushed, will start swinging.
If ferrous sulfide is mixed with hydrochloric acid, then a chemical reaction starts
which releases hydrogen sulfide. These are examples of experiments with deterministic
outcomes: under specified conditions they yield an outcome which is known in advance.
Somewhat more complicated is the situation with random experiments, or experiments
with random outcomes. They are characterized by two properties:
1. Repetitions of the experiment, even if carried out under identical conditions,
generally have different outcomes.
2. The possible outcomes of the experiment are known.
Thus, the outcome of a random experiment cannot be predicted with certainty. This
implies that the study of random experiments makes sense only if they can be repeated
sufficiently frequently under identical conditions. Only in this case can stochastic
or statistical regularities be found.
Let Ω be the set of possible outcomes of a random experiment. This set is called
sample space, space of elementary events, or universe. Examples of random
experiments and their respective sample spaces are:
1) Counting the number of traffic accidents a day in a specified area: Ω = {0, 1, ...}.
2) Counting the number of cars in a parking area with maximally 200 parking bays at
a fixed time point: Ω = {0, 1, ..., 200}.
3) Counting the number of shooting stars during a fixed time interval: Ω = {0, 1, ...}.
4) Recording the daily maximum wind velocity at a fixed location: Ω = [0, ∞).
5) Recording the lifetimes of technical systems or organisms: Ω = [0, ∞).
6) Determining the number of faulty parts in a set of 1000: Ω = {0, 1, ..., 1000}.
7) Recording the daily maximum fluctuation of a share price: Ω = [0, ∞).
8) Determining the total profit somebody makes with her/his financial investments in
a year. This 'profit' can be negative, i.e., any real number can be the outcome:
Ω = (−∞, +∞).
9) Predicting the outcome of a wood reserve inventory in a forest stand: Ω = [0, ∞).
10) a) Number of eggs a sea turtle will bury at the beach: Ω = {0, 1, ...}.
b) Will a baby turtle, hatched from such an egg, reach the water? Ω = {0, 1} with
meaning 0: no, 1: yes.
As the examples show, in the context of a random experiment, the term 'experiment'
has a more general meaning than in the customary sense.
A random experiment may also contain a deterministic component. For instance, the
measurement of a physical quantity should ideally yield the exact (deterministic)
parameter value. But in view of random measurement errors and other (subjective)
influences, this ideal case does not materialize. Depending on the degree of accuracy
required, different measurements, even if done under identical conditions, may yield
different values of one and the same parameter (length, temperature, pressure,
amperage, ...).
3) In training, a hunter shoots at a cardboard dummy. Given that he never misses the
dummy, the latter is the sample space Ω, and any possible impact mark on the dummy
is an elementary event. Crucial subsets to be hit are, e.g., 'head' or 'heart.'
These three examples already illustrate that often it is not single elementary events
that are of interest, but sets of elementary events. Hence it is not surprising that
concepts and results from set theory play a key role in formally establishing
probability theory. For this reason, the reader is next reminded of some basic
concepts of set theory.
Basic Concepts and Notation from Set Theory A set is given by its elements. We
can consider the set of all real numbers, the set of all rational numbers, the set of all
people attending a performance, the set of buffalos in a national park, and so on. A
set is called discrete if it is a finite or a countably infinite set. By definition,
a countably infinite set can be written as a sequence. In other words, its elements
can be numbered. If a set is infinite, but not countably infinite, then it is called
nondenumerable. Nondenumerable sets are, for instance, the whole real axis, the
positive half-axis,
a finite subinterval of the real axis, or a geometric object (area of a circle, target).
Let A and B be two sets. In what follows we assume that all sets A, B, ... considered
are subsets of a 'universal set' Ω . Hence, for any set A, A ⊆ Ω .
A is called a subset of B if each element of A is also an element of B.
Symbol: A ⊆ B.
The difference of B and A (the relative complement of A in B) contains all those
elements of B which are not elements of A.
Symbol: B\A
In particular, the complement A̅ = Ω\A contains all those elements of Ω which are not
elements of A.
The intersection of A and B contains all those elements which belong both to A and B.
Symbol: A ∩ B
The union of A and B contains all those elements which belong to A or B (or to both).
Symbol: A ∪ B
These relations between two sets are illustrated in Figure 1.1 (Venn diagram). The
whole shaded area is A ∪ B.
Figure 1.1 Venn diagram of two sets A, B ⊆ Ω, showing the regions A ∩ B, B\A, and A\B
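These set operations map directly onto the built-in set type of, say, Python; the sketch below (with made-up sets, not from the text) mirrors each definition above:

```python
# Sketch: the set operations of this section on Python's built-in set type.
omega = set(range(10))        # universal set Omega = {0, 1, ..., 9}
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

print(A <= omega)             # subset test  A ⊆ Ω   -> True
print(A & B)                  # intersection A ∩ B   -> {3, 4}
print(A | B)                  # union        A ∪ B   -> {1, 2, 3, 4, 5, 6}
print(B - A)                  # difference   B \ A   -> {5, 6}
print(omega - A)              # complement of A in Ω -> {0, 5, 6, 7, 8, 9}
```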
Random Events A random event (briefly: event) A is a subset of the set Ω of all
possible outcomes of a random experiment, i.e. A ⊆ Ω.
A random event A is said to have occurred as a result of a random experiment
if the observed outcome ω of this experiment is an element of A: ω ∈ A.
The empty set ∅ is the impossible event: since it contains no elementary event, it
can never occur. Likewise, Ω is the certain event, since it comprises all possible
outcomes of the random experiment. Thus, there is nothing random about the events
∅ and Ω; they are actually deterministic events. Even before a random experiment has
been carried out, we are absolutely sure that Ω will occur and ∅ will not.
Let A and B be two events. Then the set-theoretic operations introduced above can be
interpreted in terms of the occurrence of random events as follows:
A ∩ B is the event that both A and B occur,
A ∪ B is the event that A or B (or both) occur,
If A ⊆ B (A is a subset of B), then the occurrence of A implies the occurrence of B.
A\ B is the set of all those elementary events which are elements of A, but not of B.
Thus, A\ B is the event that A occurs, but not B. Note that (see Figure 1.1)
A\ B = A\ (A ∩ B). (1.3)
The event A̅ = Ω\A is called the complement of A. It consists of all those elementary
events which are not in A.
Two events A and B are called disjoint or (mutually) exclusive if their joint occur-
rence is impossible, i.e. if A ∩ B = ∅. In this case the occurrence of A implies that B
cannot occur and vice versa. In particular, A and A̅ are disjoint for any event A ⊆ Ω.
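As a small numerical sketch (the die-roll events below are illustrative, not from the text), these interpretations, identity (1.3), and the disjointness of A and its complement can be checked directly:

```python
# Sketch: events for one die roll as Python sets.
omega = {1, 2, 3, 4, 5, 6}        # sample space of a die roll
A = {2, 4, 6}                     # event 'even number'
B = {4, 5, 6}                     # event 'at least 4'

# Identity (1.3): A\B = A\(A ∩ B)
print(A - B == A - (A & B))       # True

# A and its complement are disjoint: A ∩ A̅ = ∅
A_compl = omega - A
print(A & A_compl == set())       # True

# An outcome makes an event 'occur' iff it is an element of that event.
omega_obs = 4                     # hypothetical observed outcome
print(omega_obs in A and omega_obs in B)   # True: both A and B occur
```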
Short Terminology
A ∩ B    A and B
A ∪ B    A or B
A ⊆ B    A implies B; B follows from A
A\B      A but not B
A̅        not A