where
$$
(2)\qquad U_{u,t}=\begin{cases} e^{Z_t-Z_u-(1/2)([Z,Z]^c_t-[Z,Z]^c_u)}\displaystyle\prod_{u<s\le t}(1+\Delta Z_s)e^{-\Delta Z_s}, & 0\le u<t,\\[2pt] 1, & 0\le u=t,\end{cases}
$$
and $[Z,Z]$ is the quadratic variation process associated with $Z$. When $Z$ is BV then (2) reduces to
$$
(3)\qquad U_{u,t}=\begin{cases} e^{Z^c_t-Z^c_u}\displaystyle\prod_{u<s\le t}(1+\Delta Z_s), & 0\le u<t,\\[2pt] 1, & 0\le u=t.\end{cases}
$$
Proof. Note that with $T_0=0$ and, for $n\ge 1$, $T_n=\inf\{t>T_{n-1}\,|\,\Delta Z_t=-1\}$, then for $T_n<u\le t<T_{n+1}$
$$
(4)\qquad \frac{U_{T_n,t}}{U_{T_n,u-}}=U_{u,t}(1+\Delta Z_u),\qquad \frac{U_{T_n,t}}{U_{T_n,u}}=U_{u,t}.
$$
Also, since $Y$ is a BV process, the covariation process $[Y,Z]$ is given via $[Y,Z]_t=\sum_{0\le s\le t}\Delta Y_s\,\Delta Z_s$. If one follows the solution in equation (6.9) in Theorem (6.8) on page 194 of [8], then for $T_n\le t<T_{n+1}$ we have that
$$
(5)\qquad
\begin{aligned}
X_t &= U_{T_n,t}\biggl(\Delta Y_{T_n}+\int_{(T_n,t]}U_{T_n,u-}^{-1}\,dY_u-\int_{(T_n,t]}U_{T_n,u}^{-1}\,d[Y,Z]_u\biggr)\\
&= U_{T_n,t}\Delta Y_{T_n}+\int_{(T_n,t]}U_{u,t}(1+\Delta Z_u)\,dY_u-\sum_{T_n<u\le t}U_{u,t}\,\Delta Y_u\,\Delta Z_u\\
&= \int_{[T_n,t]}U_{u,t}\,dY_u,
\end{aligned}
$$
where the second equality is justified since the first integral on the right-hand side of the first equality is a path-wise Stieltjes integral, and the second is a sum which is also defined path-wise. If $Y$ were a general semimartingale, then interchanging $U_{T_n,t}$ with the integral sign like this would not be justified, as the resulting integrand would no longer be adapted. Clearly, if $n\ge 1$, then $U_{u,t}=0$ for $u<T_n$, and thus
$$
(6)\qquad X_t=\int_{[0,t]}U_{u,t}\,dY_u.
$$
Since this holds for all n, the proof for the more general case is complete.
For the case where $Z$ is BV, it is evident that $[Z,Z]^c=0$, and it is easy to check that $\sum_{u<s\le t}\Delta Z_s$ is convergent (actually, absolutely convergent), and hence the result follows.
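As a sanity check (added here; the driving equation is assumed, for this illustration only, to evolve at jump epochs as $X_t=X_{t-}(1+\Delta Z_t)+\Delta Y_t$, i.e., $dX_t=dY_t+X_{t-}\,dZ_t$ started from $0$, a form not displayed in this excerpt), the following Python sketch compares the recursive construction with the explicit formula (6) for pure-jump paths:
\begin{verbatim}
import random

# Pure-jump illustration: at epoch k, Y jumps by dY[k] and Z jumps by dZ[k] >= -1.
random.seed(0)
n = 20
dY = [random.gauss(0.0, 1.0) for _ in range(n)]
dZ = [random.uniform(-1.0, 1.0) for _ in range(n)]
dZ[7] = -1.0                         # force one jump of Z of size -1

# Recursive construction: X <- X*(1 + dZ_k) + dY_k at each epoch.
X = 0.0
for k in range(n):
    X = X * (1.0 + dZ[k]) + dY[k]

# Explicit formula (6): X = sum_u U_u * dY_u, where U_u is the product of
# (1 + dZ_s) over the epochs s strictly after u.
X_formula = 0.0
for u in range(n):
    U = 1.0
    for s in range(u + 1, n):
        U *= 1.0 + dZ[s]
    X_formula += U * dY[u]

print(abs(X - X_formula) < 1e-9)     # True
\end{verbatim}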
which is a.s. finite for all t ≥ 0 and right-continuous (possibly a.s. identically
zero or terminating), and write
$$
(8)\qquad X_t=\int_{[T_{N_t},t]}U_{u,t}\,dY_u.
$$
It is worthwhile to note that for the case where $Z$ is also a BV process, there is a more direct proof involving (path-wise) Stieltjes integration which
and
$$
(10)\qquad 1+B^d_t-B^d_0\le\prod_{0<s\le t}(1+\Delta B_s)\le e^{B^d_t-B^d_0}
$$
as $t\downarrow 0$.
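If the jumps $\Delta B_s$ appearing here are nonnegative (an assumption made only to indicate where such bounds come from), (10) is an instance of the elementary inequalities
$$
1+\sum_i x_i\le\prod_i(1+x_i)\le e^{\sum_i x_i},\qquad x_i\ge 0,
$$
applied to the jumps of $B$ on $(0,t]$.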
Now note that with $C_t=e^{Z^c_t}$ and $D_t=\prod_{0<s\le t}(1+\Delta Z_s)$, ordinary (Stieltjes) integration by parts yields
$$
(12)\qquad U_t\equiv C_tD_t=C_{0+}D_{0+}+\int_{(0,t]}D_{s-}\,dC_s+\int_{(0,t]}C_{s-}\,dD_s+\sum_{0<s\le t}\Delta C_s\,\Delta D_s,
$$
and it is easy to check that the continuity of $C$ and the fact that $dC_t=C_t\,dZ^c_t$ imply that
$$
(13)\qquad U_t=1+\int_{(0,t]}U_{s-}\,dZ_s.
$$
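Spelling out the check [using that (13) presupposes $C_{0+}D_{0+}=1$]: since $C$ is continuous, $\Delta C_s=0$ and the last sum in (12) vanishes; moreover $dC_s=C_s\,dZ^c_s=C_{s-}\,dZ^c_s$ and $\Delta D_s=D_{s-}\Delta Z_s$, so that
$$
\int_{(0,t]}D_{s-}\,dC_s+\int_{(0,t]}C_{s-}\,dD_s=\int_{(0,t]}U_{s-}\,dZ^c_s+\sum_{0<s\le t}U_{s-}\,\Delta Z_s=\int_{(0,t]}U_{s-}\,dZ_s,
$$
using that the BV process $Z$ is the sum of its continuous part and its jumps; this gives (13).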
If $\int_{(-\infty,0]}e^{J_s}\,dY_s$ is a.s. finite (recalling that for $s\le 0$, $J_s\le J_0=0$), then setting $X^*_t=\int_{(-\infty,t]}e^{-(J_t-J_s)}1_{\{N_t-N_s=0\}}\,dY_s$, it is clear that $X^*$ is a stationary process. Moreover, if, in addition, either $\lim_{t\to\infty}N_t\ge 1$ a.s. (equivalently, $T_1=\inf\{t\,|\,\Delta Z_t=1\}$ is a.s. finite) or $J_t\to\infty$ a.s. as $t\to\infty$, then
$|X^*_t-X_t|\to 0$ a.s. as $t\to\infty$, and thus for any a.s. finite initial $X_0$, a limiting distribution exists which is distributed like $X^*_0$.
In fact, when $X_0$ is independent of $(Y,Z)$, then shifting by $-t$, noting that $\theta_{-t}J_s=J_{s-t}-J_{-t}$ (so that $\theta_{-t}J_t=0$) and similarly for $N$ and $Y$, it is clear that $X_t$ has the same distribution as
$$
(21)\qquad X_0e^{J_{-t}}1_{\{N_{-t}=0\}}+\int_{(0,t]}e^{J_{s-t}}1_{\{N_{s-t}=0\}}\,dY_{s-t}
=X_0e^{J_{-t}}1_{\{N_{-t}=0\}}+\int_{(-t,0]}e^{J_s}1_{\{N_s=0\}}\,dY_s.
$$
In particular, this implies that when X0 = 0, then Xt is stochastically in-
creasing in t ≥ 0.
Let us summarize our findings as follows.
Theorem 2. If $\int_{(-\infty,0]}e^{J_s}\,dY_s<\infty$ a.s., and either $T_1<\infty$ a.s. or $J_t\to\infty$ a.s. as $t\to\infty$, then
We note that when $(Y,Z)$ also have independent increments, so that they form a Lévy process, the negative of the time-reversed process is a left-continuous version of the forward process, and thus in this case [when $X_0$ is independent of $(Y,Z)$] $X_t$ is also distributed like
$$
(23)\qquad X_0e^{-J_t}1_{\{N_t=0\}}+\int_{(0,t]}e^{-J_s}1_{\{N_s=0\}}\,dY_s,
$$
which is also a consequence of the usual time-reversal argument for Lévy processes. In what follows we will consider special cases of this structure.
We observe that in the general case N is a simple (i.e., a.s. ∆Nt ∈ {0, 1}
for all t) counting process associated with a time stationary point process.
Special cases of such processes are Poisson processes and delayed renewal
processes where the delay has the stationary excess lifetime distribution as-
sociated with the subsequent i.i.d. inter-renewal times. We will consider this
special case a bit later.
the Lévy measure of which, call it $\nu_j$, is defined via $\nu_j((a,b])=\nu_z((1-e^{-a},1-e^{-b}])$ for $0<a<b<\infty$ and with exponent
$$
(28)\qquad \eta_j(\alpha)=c_z\alpha+\int_{(0,\infty)}(1-e^{-\alpha x})\,\nu_j(dx)
=c_z\alpha+\int_{(0,1)}(1-(1-x)^\alpha)\,\nu_z(dx).
$$
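The second equality in (28) is just the change of variables implicit in the definition of $\nu_j$: since $\nu_j$ is the image of $\nu_z$ under $x\mapsto-\log(1-x)$,
$$
\int_{(0,\infty)}(1-e^{-\alpha y})\,\nu_j(dy)=\int_{(0,1)}\bigl(1-e^{\alpha\log(1-x)}\bigr)\,\nu_z(dx)=\int_{(0,1)}(1-(1-x)^\alpha)\,\nu_z(dx).
$$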
We note that
$$
(30)\qquad \int_{(0,\infty)}\min(x,1)\,\nu_j(dx)=\int_{(0,1)}\min(-\log(1-x),1)\,\nu_z(dx),
$$
and since $-\log(1-x)\le\frac{x}{1-x}\le ex$ for $0<x\le 1-e^{-1}$, while $\min(-\log(1-x),1)=1\le ex$ for $1-e^{-1}<x<1$, the right-hand side is dominated above by $e\int_{(0,1)}x\,\nu_z(dx)<\infty$, so that $\nu_j$ is indeed the proper Lévy measure of a subordinator. Now, for this case, $Ee^{-J_s}=e^{-\eta_j(1)s}$, where
$$
(31)\qquad \eta_j(1)=c_z+\int_{(0,1)}(1-(1-x)^1)\,\nu_z(dx)
=c_z+\int_{(0,1)}x\,\nu_z(dx)=\eta_z'(0)-\lambda,
$$
recalling that $\lambda=\nu_z\{1\}$. Therefore, $Ee^{-J_s}1_{\{N_s=0\}}=e^{-(\eta_z'(0)-\lambda)s}e^{-\lambda s}=e^{-\eta_z'(0)s}$, so that in this case, since $\eta_z'(0)=c_z+\int_{(0,1]}x\,\nu_z(dx)=EZ_1$, (25) becomes
$$
(32)\qquad EX_t=EX_0e^{-EZ_1t}+\frac{EY_1}{EZ_1}(1-e^{-EZ_1t}).
$$
Recall that here Y need not have independent increments.
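Note, as a quick check added here, that the right-hand side of (32) is the unique solution of the linear ordinary differential equation
$$
\frac{d}{dt}\,m(t)=EY_1-EZ_1\,m(t),\qquad m(0)=EX_0,
$$
so that, when $EZ_1>0$, $EX_t\to EY_1/EZ_1$ as $t\to\infty$.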
It thus follows, as in equation (5.9) of [13] for the more general multivariate
case and in Proposition 1 of [19] for the case where Y and Z are compound
Poisson, that
$$
(34)\qquad E\biggl[\exp\biggl(-\alpha\int_{(0,t]}e^{-J_s}1_{\{N_s=0\}}\,dY_s\biggr)\,\Big|\,Z\biggr]
=\exp\biggl(-\int_0^t\eta_y(\alpha e^{-J_s}1_{\{N_s=0\}})\,ds\biggr).
$$
yields
$$
(36)\qquad
\begin{aligned}
Ee^{-\alpha X_t}&=E\biggl[\xi_0(\alpha e^{-J_t}1_{\{N_t=0\}})\exp\biggl(-\int_0^t\eta_y(\alpha e^{-J_s})1_{\{N_s=0\}}\,ds\biggr)\biggr]\\
&=E\biggl[\xi_0(\alpha e^{-J_t})\exp\biggl(-\int_0^t\eta_y(\alpha e^{-J_s})\,ds\biggr)1_{\{T_1>t\}}\biggr]\\
&\quad+E\biggl[\exp\biggl(-\int_0^{T_1}\eta_y(\alpha e^{-J_s})\,ds\biggr)1_{\{T_1\le t\}}\biggr].
\end{aligned}
$$
$$
(42)\qquad =c_zn+\sum_{k=1}^{n}\binom{n}{k}(-1)^{k-1}\int_{(0,1]}x^k\,\nu_z(dx),
$$
and since $\eta_z^{(0)}(0)=\eta_z(0)=0$, $\eta_z'(0)=c_z+\int_{(0,1]}x\,\nu_z(dx)$ and $\eta_z^{(k)}(0)=(-1)^{k-1}\int_{(0,1]}x^k\,\nu_z(dx)$ for $k\ge 2$, it holds that
$$
(43)\qquad \eta_j(n)+\lambda=\sum_{k=0}^{n}\binom{n}{k}\eta_z^{(k)}(0).
$$
In particular, $\eta_j(1)+\lambda=\eta_z'(0)=c_z+\int_{(0,1]}x\,\nu_z(dx)$ and $\eta_j(2)+\lambda=2\eta_z'(0)+\eta_z''(0)$.
$b\ge 1$,
$$
(48)\qquad
\begin{aligned}
Ee^{-aJ_t}\biggl(\int_0^te^{-J_s}\,ds\biggr)^b
&=b\,Ee^{-aJ_t}\int_0^t\biggl(\int_u^te^{-J_s}\,ds\biggr)^{b-1}e^{-J_u}\,du\\
&=b\int_0^tEe^{-a(J_t-J_u)}\biggl(\int_u^te^{-(J_s-J_u)}\,ds\biggr)^{b-1}e^{-(a+b)J_u}\,du\\
&=b\int_0^te^{-\eta_j(a+b)u}\,Ee^{-aJ_{t-u}}\biggl(\int_0^{t-u}e^{-J_s}\,ds\biggr)^{b-1}du.
\end{aligned}
$$
Thus, if T ∼ exp(θ) for some θ > 0 and is independent of Z, then since
the conditional distribution of T − u given T > u is the same as that of T
(memoryless property), it readily follows that
$$
(49)\qquad Ee^{-aJ_T}\biggl(\int_0^Te^{-J_s}\,ds\biggr)^b=\frac{b}{\eta_j(a+b)+\theta}\,Ee^{-aJ_T}\biggl(\int_0^Te^{-J_s}\,ds\biggr)^{b-1}.
$$
For $a=0$ we have that, since $T_1\wedge T\sim\exp(\lambda+\theta)$ and $\int_0^Te^{-J_s}1_{\{N_s=0\}}\,ds=\int_0^{T_1\wedge T}e^{-J_s}\,ds$,
$$
(50)\qquad E\biggl(\int_0^Te^{-J_s}1_{\{N_s=0\}}\,ds\biggr)^b=\frac{b}{\eta_j(b)+\lambda+\theta}\,E\biggl(\int_0^Te^{-J_s}1_{\{N_s=0\}}\,ds\biggr)^{b-1}.
$$
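Iterating (50) down to $b=0$ gives, for the record, the closed form
$$
E\biggl(\int_0^Te^{-J_s}1_{\{N_s=0\}}\,ds\biggr)^b=\frac{b!}{\prod_{i=1}^{b}(\eta_j(i)+\lambda+\theta)},
$$
which, with $\mu_i=\eta_j(i)+\lambda$ as in (55) below, equals $b!/\prod_{i=1}^b(\mu_i+\theta)$.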
For $a>0$ we have, from the fact that $T_1\wedge T$ is independent of $1_{\{T_1>T\}}$, that
$$
(51)\qquad
\begin{aligned}
Ee^{-aJ_T}1_{\{N_T=0\}}\biggl(\int_0^Te^{-J_s}1_{\{N_s=0\}}\,ds\biggr)^b
&=Ee^{-aJ_{T_1\wedge T}}1_{\{T_1>T\}}\biggl(\int_0^{T_1\wedge T}e^{-J_s}\,ds\biggr)^b\\
&=\frac{\theta}{\lambda+\theta}\,Ee^{-aJ_{T_1\wedge T}}\biggl(\int_0^{T_1\wedge T}e^{-J_s}\,ds\biggr)^b
\end{aligned}
$$
and thus
$$
(52)\qquad
\begin{aligned}
Ee^{-aJ_T}1_{\{N_T=0\}}\biggl(\int_0^Te^{-J_s}1_{\{N_s=0\}}\,ds\biggr)^b
&=Ee^{-aJ_T}1_{\{N_T=0\}}\biggl(\int_0^Te^{-J_s}\,ds\biggr)^b\\
&=\frac{b}{\eta_j(a+b)+\lambda+\theta}\,Ee^{-aJ_T}1_{\{N_T=0\}}\biggl(\int_0^Te^{-J_s}\,ds\biggr)^{b-1}.
\end{aligned}
$$
Clearly, when $b=0$ and $a>0$ we have that
$$
(53)\qquad Ee^{-aJ_T}1_{\{N_T=0\}}=Ee^{-(\eta_j(a)+\lambda)T}=\frac{\theta}{\eta_j(a)+\lambda+\theta}.
$$
Now
$$
(54)\qquad
\begin{aligned}
EX_T^n&=E\biggl(xe^{-J_T}1_{\{N_T=0\}}+\int_0^Te^{-J_s}1_{\{N_s=0\}}\,ds\biggr)^n\\
&=\sum_{k=1}^{n}\binom{n}{k}x^k\,Ee^{-kJ_T}1_{\{N_T=0\}}\biggl(\int_0^Te^{-J_s}\,ds\biggr)^{n-k}\\
&\quad+E\biggl(\int_0^Te^{-J_s}1_{\{N_s=0\}}\,ds\biggr)^n,
\end{aligned}
$$
and denoting [recall (43)]
$$
(55)\qquad \mu_i=\eta_j(i)+\lambda=c_zi+\int_{(0,1]}(1-(1-x)^i)\,\nu_z(dx)=\sum_{k=0}^{i}\binom{i}{k}\eta_z^{(k)}(0),
$$
it follows from (50), (52), (53) and (54), with some manipulations, that
$$
(56)\qquad EX_T^n=\frac{n!}{\prod_{i=1}^n\mu_i}\Biggl(\sum_{k=1}^{n}\frac{x^k\prod_{i=1}^k\mu_i}{k!}\Biggl(\prod_{i=k+1}^{n}\frac{\mu_i}{\mu_i+\theta}-\prod_{i=k}^{n}\frac{\mu_i}{\mu_i+\theta}\Biggr)+\prod_{i=1}^{n}\frac{\mu_i}{\mu_i+\theta}\Biggr).
$$
Theorem 3. Let $p_{ij}(t)$ be the transition matrix function of a pure death process $D=\{D_t\,|\,t\ge 0\}$ with death rates $\mu_i$, $i\ge 1$ ($0$ is absorbing). Then
$$
(59)\qquad
EX_t^n=\frac{n!}{\prod_{i=1}^n\mu_i}\Biggl(p_{n0}(t)+\sum_{k=1}^{n}p_{nk}(t)\frac{x^k\prod_{i=1}^k\mu_i}{k!}\Biggr)
=\frac{n!}{\prod_{i=1}^n\mu_i}\,E\Biggl[\,\prod_{i=1}^{D_t}\frac{x\mu_i}{i}\,\Bigg|\,D_0=n\Biggr].
$$
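A minimal numerical sketch of (59) for the case $x=0$ [where only the $p_{n0}(t)$ term survives], added here with illustrative, hypothetical death rates; the transition probabilities are obtained from the matrix exponential of the pure-death generator:
\begin{verbatim}
import numpy as np
from math import factorial
from scipy.linalg import expm

# Illustrative (hypothetical) death rates mu_1,...,mu_n and a time point t.
mu = np.array([1.3, 2.1, 3.7])      # mu[i-1] corresponds to mu_i
n = len(mu)
t = 0.5

# Generator of the pure death process on {0,1,...,n}: jump from i to i-1 at rate mu_i.
Q = np.zeros((n + 1, n + 1))
for i in range(1, n + 1):
    Q[i, i] = -mu[i - 1]
    Q[i, i - 1] = mu[i - 1]

P = expm(Q * t)                      # P[i, j] = p_{ij}(t)

# Theorem 3 with x = 0: E X_t^n = n!/(mu_1*...*mu_n) * p_{n0}(t).
print(factorial(n) / np.prod(mu) * P[n, 0])
\end{verbatim}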
In fact, one may also give a simple finite algorithm with which to compute $EX_t^n$. For the sake of brevity we do it only for the case $x=0$. This can be done similarly to the Brownian motion case in the proof of Theorem 1 on page 31 of [22] or, equivalently, directly from (60) as follows. Set $f_0=1$ and, for $n\ge 1$ and $0<a_1<a_2<\cdots<a_n$, let
$$
(61)\qquad
\begin{aligned}
f_n(a_1,\ldots,a_n)&=\int\cdots\int_{\substack{x_1,\ldots,x_n\ge 0\\ \sum_{i=1}^nx_i\le 1}}\exp\Biggl(-\sum_{i=1}^na_ix_i\Biggr)dx_1\cdots dx_n\\
&=\int\cdots\int_{\substack{x_2,\ldots,x_n\ge 0\\ \sum_{i=2}^nx_i\le 1}}\Biggl(\int_0^{1-\sum_{i=2}^nx_i}e^{-a_1x_1}\,dx_1\Biggr)\exp\Biggl(-\sum_{i=2}^na_ix_i\Biggr)dx_2\cdots dx_n.
\end{aligned}
$$
Then
$$
(63)\qquad g_n(b_1,\ldots,b_n)=\frac{g_{n-1}(b_1+b_2,b_3,\ldots,b_n)-e^{-b_1}g_{n-1}(b_2,b_3,\ldots,b_n)}{b_1}.
$$
From the above, it is also clear (see also [22], Theorem 1, page 31 for the
case of a Brownian motion) that, in fact,
$$
(64)\qquad EX_t^n=t^nn!\,f_n(\mu_1t,\ldots,\mu_nt)=t^nn!\,g_n(\mu_1t,(\mu_2-\mu_1)t,\ldots,(\mu_n-\mu_{n-1})t)
$$
is a linear combination of exponentials. An algorithm for computing the coef-
ficients of this linear combination is equivalent to the above simple algorithm
which involves only a finite number of additions and multiplications.
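A minimal sketch of the recursion (63)-(64) for $x=0$, added here with the same illustrative rates as in the sketch following Theorem 3 and with the base case $g_0=1$ (corresponding to $f_0=1$); its output agrees, up to rounding, with the death-process computation there:
\begin{verbatim}
from math import exp, factorial

def g(b):
    # Recursion (63); the empty tuple plays the role of g_0 = 1 (i.e., f_0 = 1).
    if not b:
        return 1.0
    b1, rest = b[0], b[1:]
    upper = g((b1 + rest[0],) + rest[1:]) if rest else 1.0
    return (upper - exp(-b1) * g(rest)) / b1

# E X_t^n for x = 0 via (64), with the same illustrative rates as above.
mu = [1.3, 2.1, 3.7]
t = 0.5
n = len(mu)
b = tuple((mu[i] - (mu[i - 1] if i > 0 else 0.0)) * t for i in range(n))
print(t**n * factorial(n) * g(b))    # matches n!/(mu_1*mu_2*mu_3) * p_{n0}(t) above
\end{verbatim}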
We emphasize that the validity of Theorem 3 for all $n\ge 1$, as well as of the moment-computation algorithm for all $n\ge 1$, is special to the case where $Z$ is a nonzero subordinator: this is the only case where $\eta_j(n)$ is finite, strictly positive for all $n\ge 1$ and strictly increasing.
REFERENCES
[1] Bertoin, J., Lindner, A. and Maller, R. (2008). On continuity properties of the
law of integrals of Lévy processes. In Séminaire de Probabilités XLI. Lecture
Notes in Math. 1934 137–159. Springer, Berlin. MR2483729
[2] Bertoin, J. and Yor, M. (2005). Exponential functionals of Lévy processes. Probab.
Surv. 2 191–212 (electronic). MR2178044
[3] Bertoin, J., Biane, P. and Yor, M. (2004). Poissonian exponential functionals,
q-series, q-integrals, and the moment problem for log-normal distributions. In
Seminar on Stochastic Analysis, Random Fields and Applications IV. Progress
in Probability 58 45–56. Birkhäuser, Basel. MR2096279
[4] Carmona, P., Petit, F. and Yor, M. (1997). On the distribution and asymptotic
results for exponential functionals of Lévy processes. In Exponential Functionals
and Principal Values Related to Brownian Motion 73–130. Rev. Math. Iberoam.,
Madrid. MR1648657
[5] Carmona, P., Petit, F. and Yor, M. (2001). Exponential functionals of Lévy
processes. In Lévy Processes: Theory and Applications (O. E. Barndorff-
Nielsen, T. Mikosch and S. I. Resnick, eds.) 41–55. Birkhäuser, Boston,
MA. MR1833691
[6] Erickson, K. B. and Maller, R. A. (2005). Generalised Ornstein–Uhlenbeck
processes and the convergence of Lévy integrals. In Séminaire de Probabilités
XXXVIII. Lecture Notes in Math. 1857 70–94. Springer, Berlin. MR2126967
[7] Guillemin, F., Robert, P. and Zwart, B. (2004). AIMD algorithms and exponen-
tial functionals. Ann. Appl. Probab. 14 90–117. MR2023017
[8] Jacod, J. (1979). Calcul Stochastique et Problèmes de Martingales. Lecture Notes in
Mathematics 714. Springer, Berlin. MR542115
[9] Jaschke, S. (2003). A note on the inhomogeneous linear stochastic differential equa-
tion. Insurance Math. Econom. 32 461–464. MR1994504
[10] Kella, O. (1998). An exhaustive Lévy storage process with intermittent output.
Comm. Statist. Stochastic Models 14 979–992. MR1631475
[11] Kella, O. (2009). On growth collapse processes with stationary structure and their
shot-noise counterparts. J. Appl. Probab. 46 363–371.
[12] Kella, O., Perry, D. and Stadje, W. (2003). A stochastic clearing model with a
Brownian and a compound Poisson component. Probab. Engrg. Inform. Sci. 17
1–22. MR1959382
[13] Kella, O. and Whitt, W. (1999). Linear stochastic fluid networks. J. Appl. Probab.
36 244–260. MR1699623
[14] Klüppelberg, C., Lindner, A. and Maller, R. (2004). A continuous-time
GARCH process driven by a Lévy process: Stationarity and second-order be-
haviour. J. Appl. Probab. 41 601–622. MR2074811
[15] Lachal, A. (2003). Some probability distributions in modeling DNA replication.
Ann. Appl. Probab. 13 1207–1230. MR1994049
[16] Löpker, A. H. and van Leeuwaarden, J. S. H. (2008). Transient moments of the
TCP window size process. J. Appl. Probab. 45 163–175. MR2409318
[17] Lindner, A. and Maller, R. (2005). Lévy integrals and the stationarity of gener-
alised Ornstein–Uhlenbeck processes. Stochastic Process. Appl. 115 1701–1722.
MR2165340
[18] Lindner, A. and Sato, K.-i. (2009). Continuity properties and infinite divisibility
of stationary distributions of some generalized Ornstein–Uhlenbeck processes.
Ann. Probab. 37 250–274. MR2489165
[19] Nilsen, T. and Paulsen, J. (1996). On the distribution of a randomly discounted
compound Poisson process. Stochastic Process. Appl. 61 305–310. MR1386179
[20] Protter, P. E. (2004). Stochastic Integration and Differential Equations, 2nd ed.
Stochastic Modelling and Applied Probability 21. Springer, Berlin. MR2020294
[21] Yoeurp, C. and Yor, M. (1977). Espace orthogonal à une semi-martingale. Unpublished manuscript.
[22] Yor, M. (2001). Exponential Functionals of Brownian Motion and Related Processes.
Springer, Berlin. MR1854494