
Université de Gabes, ENIG Module TS. Mr CHIBANI B.

TD. PATD, GCR2 24-25

Discrete Random Processes

Exercise 1. Consider patients visiting a doctor's office. The patients arrive at the office at instants ti, completely at random. Let Xn denote the waiting time (in hours, h) of patient n before he or she is actually seen and diagnosed by the doctor.
1. Describe the random process {Xn, n ≥ 1}.
2. Sketch a typical realization of this process Xn.

Solution.
1. The process Xn is a discrete-time random process; it is, however, continuous-valued. The state space is SX = {x : x ≥ 0}. The index (time) set is I = {1, 2, 3, ...}.
2. An example of a realization (the original figure is not reproduced here):
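Since the original figure is missing, the following minimal simulation sketch (not part of the original solution) generates one such realization. The exponential inter-arrival and consultation times, the rates lam and mu, and the Lindley waiting-time recursion are illustrative assumptions; the exercise only states that arrivals are completely random.

```python
# Illustrative sketch only: one realization of the waiting-time process Xn.
# Assumptions (not from the exercise): single doctor, exponential inter-arrival
# times (rate lam per hour) and consultation times (rate mu per hour).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
lam, mu, N = 4.0, 5.0, 20                            # arbitrary illustrative values

arrivals = np.cumsum(rng.exponential(1 / lam, N))    # arrival instants t_i (hours)
services = rng.exponential(1 / mu, N)                # consultation durations (hours)

wait = np.zeros(N)                                   # X_n: waiting time of patient n
for n in range(1, N):
    # Lindley recursion: carry over any backlog left by the previous patient
    wait[n] = max(0.0, wait[n - 1] + services[n - 1] - (arrivals[n] - arrivals[n - 1]))

plt.step(range(1, N + 1), wait, where="mid")
plt.xlabel("patient index n")
plt.ylabel("waiting time X_n (h)")
plt.title("One realization of the discrete-time, continuous-valued process X_n")
plt.show()
```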

Exercise 2.

1.

2.
NB.
- For a discrete r.v. X, the PMF (probability mass function) is the function f defined by f(x) := P(X = x).
- For a discrete r.v. X, p(x) denotes its probability mass function (p.m.f.).
Solution.
Note that Xn is an i.i.d. random process and that, for each fixed n, Xn is a Bernoulli random variable with p = 1/3. Hence E[Xn] = p = 1/3 and var[Xn] = p(1 − p) = (1/3)(2/3) = 2/9.
E[Xn²] = var(Xn) + E²[Xn] = 2/9 + (1/3)² = 2/9 + 1/9 = 3/9 = 1/3.
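As a quick check (not part of the original solution), these moments can be verified by simulation; the sample size below is arbitrary.

```python
# Illustrative sketch: empirical check of E[Xn] = 1/3, var[Xn] = 2/9, E[Xn^2] = 1/3
# for a Bernoulli(p = 1/3) random variable. The sample size is arbitrary.
import numpy as np

rng = np.random.default_rng(0)
p, n_samples = 1 / 3, 200_000
X = rng.binomial(1, p, n_samples)      # many independent copies of Xn for one fixed n

print("E[Xn]   ~", X.mean(), "(theory 1/3)")
print("var[Xn] ~", X.var(), "(theory 2/9)")
print("E[Xn^2] ~", np.mean(X**2), "(theory 1/3)")
```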


1. For each fixed n, Yn is a binomial random variable with parameters n and p = 1/3. One obtains:
P(Yn = k) = C(n, k) (1/3)^k (2/3)^(n−k), k = 0, 1, ..., n.

2. One obtains: mY(n) = E[Yn] = np = n/3.

In terms of the covariance CYY of Yn, one has (with Yn = X1 + ··· + Xn):
CYY(n1, n2) = min(n1, n2) · p(1 − p) = (2/9) · min(n1, n2).
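A short sketch (illustration only) checking these formulas empirically. It assumes that Yn = X1 + ··· + Xn is the running sum of the i.i.d. Bernoulli variables; since the statement of Exercise 2 is not reproduced above, this construction is an assumption here.

```python
# Illustrative sketch: empirical check of E[Yn] = n/3 and
# C_YY(n1, n2) = min(n1, n2) * p * (1 - p), assuming Yn = X1 + ... + Xn
# with Xi i.i.d. Bernoulli(1/3) (see the note above).
import numpy as np

rng = np.random.default_rng(0)
p, N, trials = 1 / 3, 10, 200_000
X = rng.binomial(1, p, (trials, N))     # rows: realizations, columns: time index n
Y = X.cumsum(axis=1)                    # Y[:, n-1] holds Yn for each realization

n1, n2 = 4, 9
print("E[Y_9]     ~", Y[:, n2 - 1].mean(), "(theory", n2 * p, ")")
print("C_YY(4, 9) ~", np.cov(Y[:, n1 - 1], Y[:, n2 - 1])[0, 1],
      "(theory", min(n1, n2) * p * (1 - p), ")")
```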

Exercise 3.


Exercise 4.

End


-----------------------------
Example — Random Telegraph signal
Let a random signal X(t) have the structure X(t) = (−1)^N(t) · Y, t ≥ 0, where {N(t), t ≥ 0} is a homogeneous Poisson process with intensity λ and Y is a binary random variable with P(Y = 1) = P(Y = −1) = 1/2, independent of N(t) for all t. Signals of this structure are called random telegraph signals; they are basic modules for generating signals with a more complicated structure. Obviously X(t) = 1 or X(t) = −1, and Y determines the sign of X(0).

Since E[|X(t)|²] = 1 < ∞ for all t ≥ 0, the stochastic process {X(t), t ≥ 0} is a second-order process. Letting I(t) = (−1)^N(t), its trend function is m(t) = E[X(t)] = E[Y] E[I(t)]. Since E[Y] = 0, the trend function is identically zero: m(t) ≡ 0.

It remains to show that the covariance function C(s,t) of this process depends only on |t − s|. This requires the probability distribution of I(t). A transition from I(t) = −1 to I(t) = +1 or, conversely, from I(t) = +1 to I(t) = −1 occurs at those time points where Poisson events occur, i.e. where N(t) jumps. Hence

P(I(t) = 1)  = P(even number of jumps in [0, t]) = e^(−λt) Σ_{i≥0} (λt)^(2i) / (2i)!     = e^(−λt) cosh(λt),
P(I(t) = −1) = P(odd number of jumps in [0, t])  = e^(−λt) Σ_{i≥0} (λt)^(2i+1) / (2i+1)! = e^(−λt) sinh(λt).

Hence the expected value of I(t) is

E[I(t)] = 1 · P(I(t) = 1) + (−1) · P(I(t) = −1) = e^(−λt) [cosh(λt) − sinh(λt)] = e^(−2λt).

Since C(s,t) = Cov[X(s), X(t)] = E[X(s) X(t)] = E[Y I(s) · Y I(t)] = E[Y²] E[I(s) I(t)] and E[Y²] = 1, we have C(s,t) = E[I(s) I(t)]. Thus, in order to evaluate C(s,t), the joint distribution of the random vector (I(s), I(t)) must be determined. In view of the homogeneity of the increments of {N(t), t ≥ 0}, for s < t:

p_{1,1} = P(I(s) = 1, I(t) = 1) = P(I(s) = 1) · P(I(t) = 1 | I(s) = 1)
        = e^(−λs) cosh(λs) · P(even number of jumps in (s, t])
        = e^(−λs) cosh(λs) · e^(−λ(t−s)) cosh(λ(t−s)) = e^(−λt) cosh(λs) cosh(λ(t−s)).

Analogously,

p_{1,−1}  = P(I(s) = 1,  I(t) = −1) = e^(−λt) cosh(λs) sinh(λ(t−s)),
p_{−1,1}  = P(I(s) = −1, I(t) = 1)  = e^(−λt) sinh(λs) sinh(λ(t−s)),
p_{−1,−1} = P(I(s) = −1, I(t) = −1) = e^(−λt) sinh(λs) cosh(λ(t−s)).

Since E[I(s) I(t)] = p_{1,1} + p_{−1,−1} − p_{1,−1} − p_{−1,1}, we obtain C(s,t) = e^(−2λ(t−s)) for s < t. The order of s and t can be interchanged, so that C(s,t) = e^(−2λ|t−s|).
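The two results above, m(t) ≡ 0 and C(s,t) = e^(−2λ|t−s|), can be checked with a short Monte Carlo simulation. The sketch below is illustrative only; the values of λ, s, t and the sample size are arbitrary.

```python
# Illustrative sketch: Monte Carlo check of the random telegraph signal,
# X(t) = (-1)**N(t) * Y, verifying m(t) ~ 0 and C(s,t) ~ exp(-2*lam*(t - s)).
import numpy as np

rng = np.random.default_rng(1)
lam, s, t, trials = 2.0, 0.3, 0.8, 200_000           # arbitrary illustrative values

# N(s) ~ Poisson(lam*s); the increment over (s, t] is independent of N(s)
N_s = rng.poisson(lam * s, trials)
N_t = N_s + rng.poisson(lam * (t - s), trials)
Y = rng.choice([-1, 1], trials)                      # P(Y = 1) = P(Y = -1) = 1/2

X_s = (-1.0) ** N_s * Y
X_t = (-1.0) ** N_t * Y

print("empirical m(t)     ~", X_t.mean(), "(theory 0)")
print("empirical C(s, t)  ~", np.mean(X_s * X_t))
print("theoretical C(s,t) =", np.exp(-2 * lam * (t - s)))
```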
Example 5. A random process is defined by X(t) = T + (1 − t), where T is a uniform random variable on (0, 1). (a) Find the cdf of X(t). (b) Find mX(t) and CX(t1, t2).

Solution. Given that X(t) = T + (1 − t) with T uniformly distributed over (0, 1), we have P[X(t) ≤ x] = P[T ≤ x − (1 − t)]. Writing y = x − (1 − t) and using P[T ≤ y] = 0 for y < 0, y for 0 < y < 1, and 1 for y > 1, we get

FX(x) = P[X(t) ≤ x] = 0 for x < 1 − t,  x − (1 − t) for 1 − t < x < 2 − t,  1 for x > 2 − t,

so that fX(x) = d/dx FX(x) = 1 for 1 − t < x < 2 − t and 0 otherwise; i.e. X(t) is uniform on (1 − t, 2 − t).

Note that E[T] = 1/2 and E[T²] = 1/3. Then

mX(t) = ∫_{1−t}^{2−t} x dx = [x²/2]_{1−t}^{2−t} = 3/2 − t,

or, alternatively, mX(t) = E[X(t)] = E[T + (1 − t)] = (1 − t) + E[T] = 3/2 − t.

Define RX(t1, t2) = E[{T + (1 − t1)}{T + (1 − t2)}] = E[T²] + (2 − t1 − t2) E[T] + (1 − t1)(1 − t2), and observe

CX(t1, t2) = RX(t1, t2) − mX(t1) mX(t2) = E[T²] − (E[T])² = 1/3 − 1/4 = 1/12.
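A similar Monte Carlo check (illustration only) for Example 5; the values of t1, t2 and the sample size are arbitrary.

```python
# Illustrative sketch: empirical check of Example 5, X(t) = T + (1 - t) with
# T ~ Uniform(0, 1): m_X(t) = 3/2 - t and C_X(t1, t2) = 1/12.
import numpy as np

rng = np.random.default_rng(2)
T = rng.uniform(0.0, 1.0, 200_000)
t1, t2 = 0.4, 1.7                                    # arbitrary illustrative values

X1 = T + (1 - t1)
X2 = T + (1 - t2)

print("empirical m_X(t1)     ~", X1.mean(), "(theory", 1.5 - t1, ")")
print("empirical C_X(t1, t2) ~", np.cov(X1, X2)[0, 1], "(theory 1/12 ~ 0.0833)")
```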

--------------------------------------------
Cf. https://www.math.hkust.edu.hk/~maykwok/courses/ma246/04_05/04MA246EX_Ran.pdf
