Stochastic Processes Examples
Lecture 3: Useful Stochastic Processes
Date: June 6, 2021
Lecturer: Dr Jane Aduda

4.1. Brownian Motion

The observation first made by the botanist Robert Brown in 1827, that small pollen grains suspended in water have a very irregular and unpredictable state of motion, led to the definition of Brownian motion, which is formalized in the following.

Definition 4.1.1 (Brownian motion). A Brownian motion process is a stochastic process B_t, t ≥ 0, which satisfies:
1. The process starts at the origin, B_0 = 0;
2. B_t has stationary, independent increments;
3. The process B_t is continuous in t;
4. The increments B_t − B_s are normally distributed with mean zero and variance |t − s|, i.e.

    B_t − B_s ~ N(0, |t − s|).

The process X_t = x_0 + B_t has all the properties of a Brownian motion that starts at x_0. Since B_t − B_s is stationary, its distribution function depends only on the time interval t − s, i.e.

    P(B_{t+s} − B_s ≤ a) = P(B_t − B_0 ≤ a) = P(B_t ≤ a).

It is worth noting that even though B_t is continuous, it is nowhere differentiable. From condition 4 we get that B_t is normally distributed with mean E[B_t] = 0 and variance Var(B_t) = t:

    B_t ~ N(0, t).

This also implies that the second moment is E[B_t²] = t. Let 0 < s < t. Since the increments are independent, we can write

    E[B_s B_t] = E[(B_s − B_0)(B_t − B_s) + B_s²]
               = E[B_s − B_0] E[B_t − B_s] + E[B_s²] = s.

Consequently, B_s and B_t are not independent.

Condition 4 also has a physical explanation. A pollen grain suspended in water is kicked by a very large number of water molecules. The influence of each molecule on the grain is independent of the other molecules. These effects average out into a resultant increment of the grain coordinate. According to the Central Limit Theorem, this increment has to be normally distributed.

Proposition 4.1.1. A Brownian motion process B_t is a martingale with respect to the information set F_t = σ(B_s; s ≤ t).

Proof. The integrability of B_t follows from Jensen's inequality:

    E[|B_t|]² ≤ E[B_t²] = Var(B_t) = |t| < ∞.

B_t is obviously F_t-predictable. Let s < t and write B_t = B_s + (B_t − B_s).
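The moment computations above are easy to sanity-check by simulation. The sketch below (an illustration, not part of the lecture; it assumes only NumPy) samples B_s and B_t via independent normal increments and verifies that Var(B_t) ≈ t and E[B_s B_t] ≈ s for s < t:

```python
# Monte Carlo check of Var(B_t) = t and E[B_s B_t] = s (with s < t).
import numpy as np

rng = np.random.default_rng(0)
n_paths, s, t = 200_000, 1.0, 3.0

B_s = rng.normal(0.0, np.sqrt(s), n_paths)            # B_s ~ N(0, s)
B_t = B_s + rng.normal(0.0, np.sqrt(t - s), n_paths)  # add an independent increment

print(B_t.var())           # ≈ t = 3
print(np.mean(B_s * B_t))  # ≈ s = 1
```

With 200,000 samples both estimates land within a few hundredths of the exact values, consistent with E[B_s B_t] = min(s, t) = s.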
Then

    E[B_t | F_s] = E[B_s + (B_t − B_s) | F_s]
                 = E[B_s | F_s] + E[B_t − B_s | F_s]
                 = B_s + E[B_t − B_s]
                 = B_s + E[B_{t−s} − B_0] = B_s,

where we used that B_s is F_s-predictable (from where E[B_s | F_s] = B_s) and that the increment B_t − B_s is independent of the previous values of the process contained in the information set F_s = σ(B_u; u ≤ s). □

A process with similar properties to the Brownian motion was introduced by Wiener.

Definition 4.1.2 (Wiener process). A Wiener process W_t is a process adapted to a filtration F_t such that
1. The process starts at the origin, W_0 = 0;
2. W_t is an F_t-martingale with E[W_t²] < ∞ for all t ≥ 0, and E[(W_t − W_s)²] = t − s for s ≤ t;
3. The process W_t is continuous in t.

Since W_t is a martingale, its increments are unpredictable and hence E[W_t − W_s] = 0; in particular E[W_t] = 0. It is easy to show that

    Var(W_t − W_s) = |t − s|,   Var(W_t) = t.

The only property B_t has that W_t seems not to have is that the increments are normally distributed. However, there is no distinction between these two processes, as the following result states.

Theorem 4.1.3 (Lévy). A Wiener process is a Brownian motion process.

In stochastic calculus we often need to use infinitesimal notation and its properties. If dW_t denotes the infinitesimal increment of a Wiener process in the time interval dt, the aforementioned properties become

    dW_t ~ N(0, dt),   E[dW_t] = 0,   E[(dW_t)²] = dt.

Proposition 4.1.2. If W_t is a Wiener process with respect to the information set F_t, then Y_t = W_t² − t is a martingale.

Proof. Y_t is integrable since

    E[|Y_t|] ≤ E[W_t² + t] = 2t < ∞,   t > 0.

Let s < t. Using that the increments W_t − W_s and (W_t − W_s)² are independent of the information set F_s, we have

    E[W_t² | F_s] = E[(W_s + W_t − W_s)² | F_s]
                  = E[W_s² + 2W_s(W_t − W_s) + (W_t − W_s)² | F_s]
                  = E[W_s² | F_s] + 2W_s E[W_t − W_s] + E[(W_t − W_s)²]
                  = W_s² + 0 + (t − s),

and hence E[W_t² − t | F_s] = W_s² − s for s < t. □

4.2. Geometric Brownian Motion

The process X_t = e^{W_t}, t ≥ 0, is called geometric Brownian motion. The following result will be useful in what follows.

Lemma 4.2.1. E[e^{αW_t}] = e^{α²t/2}, for α ≥ 0.

Proof. Using the definition of expectation,

    E[e^{αW_t}] = (1/√(2πt)) ∫_{−∞}^{∞} e^{αx} e^{−x²/(2t)} dx = e^{α²t/2},

where we have used the integral formula

    ∫_{−∞}^{∞} e^{−ax² + bx} dx = √(π/a) e^{b²/(4a)},   a > 0,

with a = 1/(2t) and b = α. □
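Lemma 4.2.1 lends itself to a quick Monte Carlo check. The following sketch (illustrative only, not part of the lecture) compares the sample mean of e^{αW_t} against the closed form e^{α²t/2}:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, t = 0.5, 2.0

# W_t ~ N(0, t), so sample it directly and average e^{alpha * W_t}.
W_t = rng.normal(0.0, np.sqrt(t), 500_000)
mc = np.exp(alpha * W_t).mean()
exact = np.exp(alpha**2 * t / 2)  # value predicted by Lemma 4.2.1

print(mc, exact)  # the two numbers agree to about two decimal places
```

Note that with α = 1 this is exactly the mean of the geometric Brownian motion, E[e^{W_t}] = e^{t/2}.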
Proposition 4.2.1. The geometric Brownian motion X_t = e^{W_t} is log-normally distributed with mean e^{t/2} and variance e^{2t} − e^t.

Proof. Since W_t is normally distributed, X_t = e^{W_t} has a log-normal distribution. Using Lemma 4.2.1 we get

    E[X_t] = E[e^{W_t}] = e^{t/2},
    E[X_t²] = E[e^{2W_t}] = e^{2t},

and hence

    Var(X_t) = E[X_t²] − E[X_t]² = e^{2t} − e^t. □

The distribution function of X_t can be obtained by reducing it to the distribution function of a Brownian motion:

    F_{X_t}(x) = P(X_t ≤ x) = P(e^{W_t} ≤ x) = P(W_t ≤ ln x)
               = F_{W_t}(ln x) = (1/√(2πt)) ∫_{−∞}^{ln x} e^{−u²/(2t)} du.

The density function of the geometric Brownian motion X_t = e^{W_t} is therefore

    p(x) = d/dx F_{X_t}(x) = (1/(x√(2πt))) e^{−(ln x)²/(2t)} for x > 0, and p(x) = 0 elsewhere.

4.3. Integrated Brownian Motion

The stochastic process

    Z_t = ∫_0^t W_s ds,   t ≥ 0,

is called integrated Brownian motion. Obviously Z_0 = 0.

Let 0 = s_0 < s_1 < ··· < s_n = t, with s_k = kt/n. Then Z_t can be written as a limit of Riemann sums,

    Z_t = lim_{n→∞} (W_{s_1} + ··· + W_{s_n}) Δs,   Δs = t/n.

We are tempted to apply the Central Limit Theorem at this point, but the W_{s_k} are not independent, so we first need to transform the sum into a sum of independent, normally distributed random variables. A straightforward computation shows that

    W_{s_1} + ··· + W_{s_n} = n(W_{s_1} − W_{s_0}) + (n − 1)(W_{s_2} − W_{s_1}) + ··· + (W_{s_n} − W_{s_{n−1}})
                            = X_1 + X_2 + ··· + X_n.   (4.1)

Since the increments of a Brownian motion are independent and normally distributed, we have

    X_1 ~ N(0, n²Δs),  X_2 ~ N(0, (n − 1)²Δs),  X_3 ~ N(0, (n − 2)²Δs),  ...,  X_n ~ N(0, Δs).

Recall now the following variant of the Central Limit Theorem.

Theorem 4.3.1. If X_j are independent random variables normally distributed with mean μ_j and variance σ_j², then the sum X_1 + ··· + X_n is also normally distributed, with mean μ_1 + ··· + μ_n and variance σ_1² + ··· + σ_n².

Then

    X_1 + ··· + X_n ~ N(0, (1² + 2² + ··· + n²)Δs) = N(0, n(n + 1)(2n + 1)Δs/6),

with Δs = t/n. Using (4.1) yields

    (W_{s_1} + ··· + W_{s_n}) Δs ~ N(0, n(n + 1)(2n + 1)t³/(6n³)).

"Taking the limit" as n → ∞ we get Z_t ~ N(0, t³/3), which is the content of the following result.

Proposition 4.3.1. The integrated Brownian motion Z_t has a normal distribution with mean 0 and variance t³/3.

Remark 4.3.2. The aforementioned limit was taken heuristically, without specifying the type of convergence.
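The Riemann-sum construction of Z_t can be reproduced directly in a simulation. The sketch below (illustrative, assuming only NumPy) builds W on a grid from independent N(0, Δs) increments, forms the Riemann sum, and checks the moments of Proposition 4.3.1:

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps, t = 20_000, 400, 2.0
ds = t / n_steps

# Cumulate independent N(0, ds) increments to get W_{s_1}, ..., W_{s_n},
# then form the Riemann sum (W_{s_1} + ... + W_{s_n}) * ds approximating Z_t.
dW = rng.normal(0.0, np.sqrt(ds), (n_paths, n_steps))
W = np.cumsum(dW, axis=1)
Z = W.sum(axis=1) * ds

print(Z.mean())  # ≈ 0
print(Z.var())   # ≈ t³/3 = 8/3 ≈ 2.67
```

The finite-grid variance is n(n+1)(2n+1)t³/(6n³), so for n = 400 the estimate sits slightly above t³/3, in line with the pre-limit formula above.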
In order to make this rigorous, the following result is usually used: if X_n is a sequence of normal random variables that converges in mean square to X, then the limit X is normally distributed, with E[X_n] → E[X] and Var(X_n) → Var(X) as n → ∞.

The mean and the variance can also be computed in a direct way as follows. By Fubini's theorem we have

    E[Z_t] = E[∫_0^t W_s ds] = ∫_0^t E[W_s] ds = 0,

since E[W_s] = 0. Then the variance is given by

    Var(Z_t) = E[Z_t²] − E[Z_t]²
             = E[∫_0^t W_u du ∫_0^t W_v dv]
             = ∫_0^t ∫_0^t E[W_u W_v] du dv
             = ∫_0^t ∫_0^t min{u, v} du dv   (4.2)
             = ∫∫_{D_1} min{u, v} du dv + ∫∫_{D_2} min{u, v} du dv,

where D_1 = {(u, v); u > v, 0 ≤ u ≤ t} and D_2 = {(u, v); u < v, 0 ≤ u ≤ t}. The first integral can be evaluated using Fubini's theorem:

    ∫∫_{D_1} min{u, v} du dv = ∫∫_{D_1} v du dv = ∫_0^t v(t − v) dv = t³/6.

By symmetry the second integral has the same value, and hence Var(Z_t) = t³/3.

4.6. Brownian Motion with Drift

The process Y_t = μt + W_t, t ≥ 0, with μ > 0, is called Brownian motion with drift. The process Y_t tends to drift off at a rate μ. It starts at Y_0 = 0 and it is a Gaussian process with mean

    E[Y_t] = μt + E[W_t] = μt

and variance

    Var(Y_t) = Var(μt + W_t) = Var(W_t) = t.

4.7. Bessel Process

This section deals with the process satisfied by the Euclidean distance from the origin to a particle following a Brownian motion in Rⁿ. More precisely, if W_1(t), ..., W_n(t) are independent Brownian motions, let W(t) = (W_1(t), ..., W_n(t)) be a Brownian motion in Rⁿ, n ≥ 2. The process

    R_t = dist(O, W(t)) = √(W_1(t)² + ··· + W_n(t)²)

is called the n-dimensional Bessel process. The probability density of this process is given by the following result.

Proposition 4.7.1. The probability density function of R_t, t > 0, is given by

    p_t(ρ) = (2 / ((2t)^{n/2} Γ(n/2))) ρ^{n−1} e^{−ρ²/(2t)},   ρ ≥ 0,

with

    Γ(n/2) = (n/2 − 1)!  for n even,
    Γ(n/2) = (n/2 − 1)(n/2 − 2) ··· (1/2) √π  for n odd.

Proof. Since the Brownian motions W_1(t), ..., W_n(t) are independent, their joint density function is

    f_{W_1···W_n}(x) = f_{W_1}(x_1) ··· f_{W_n}(x_n) = (1/(2πt)^{n/2}) e^{−(x_1² + ··· + x_n²)/(2t)},   t > 0.

In the next computation we shall use the following integration formula, which follows from the use of polar coordinates:

    ∫_{|x| ≤ ρ} f(x) dx = σ(S^{n−1}) ∫_0^ρ r^{n−1} g(r) dr,

where f(x) = g(|x|) is a function with spherical symmetry, and where σ(S^{n−1}) = 2π^{n/2}/Γ(n/2) is the area of the (n − 1)-dimensional unit sphere in Rⁿ.

The distribution function of R_t is

    F_R(ρ) = P(R_t ≤ ρ) = ∫_{|x| ≤ ρ} f_{W_1···W_n}(x) dx_1 ··· dx_n
           = (σ(S^{n−1})/(2πt)^{n/2}) ∫_0^ρ r^{n−1} e^{−r²/(2t)} dr.
Differentiating yields the density

    p_t(ρ) = d/dρ F_R(ρ) = (σ(S^{n−1})/(2πt)^{n/2}) ρ^{n−1} e^{−ρ²/(2t)}
           = (2 / ((2t)^{n/2} Γ(n/2))) ρ^{n−1} e^{−ρ²/(2t)}. □

It is worth noting that in the 2-dimensional case the aforementioned density becomes a particular case of a Weibull distribution with parameters m = 2 and α = 2t, called Wald's distribution:

    p_t(ρ) = (ρ/t) e^{−ρ²/(2t)},   ρ > 0, t > 0.

4.8. The Poisson Process

4.8.1. General Introduction

A Poisson process describes the number of occurrences of a certain event before time t, such as:
1. the number of electrons arriving at an anode until time t;
2. the number of cars arriving at a gas station until time t;
3. the number of phone calls received on a certain day until time t;
4. the number of visitors entering a museum on a certain day until time t;
5. the number of earthquakes that occurred in Chile during the time interval [0, t];
6. the number of shocks in the stock market from the beginning of the year until time t;
7. the number of twisters that might hit Alabama from the beginning of the century until time t.

4.8.2. Definition and Properties

Definition 4.8.1 (Poisson process). A Poisson process is a stochastic process N_t, t ≥ 0, which satisfies
1. The process starts at the origin, N_0 = 0;
2. N_t has stationary, independent increments;
3. The process N_t is right continuous in t, with left-hand limits;
4. The increments N_t − N_s, with 0 < s < t, have a Poisson distribution with parameter λ(t − s), i.e.

    P(N_t − N_s = k) = (λ^k (t − s)^k / k!) e^{−λ(t−s)}.

It can be shown that condition 4 in the previous definition can be replaced by the following two conditions:

    P(N_t − N_s = 1) = λ(t − s) + o(t − s),   (4.3)
    P(N_t − N_s ≥ 2) = o(t − s),   (4.4)

where o(h) denotes a quantity such that lim_{h→0} o(h)/h = 0. Then the probability that a jump of size 1 occurs in the infinitesimal interval dt is equal to λ dt, and the probability that at least 2 events occur in the same small interval is zero. This implies that the random variable dN_t may take only two values, 0 and 1, and hence satisfies

    P(dN_t = 1) = λ dt,   P(dN_t = 0) = 1 − λ dt.
The fact that N_t − N_s is stationary can be stated as

    P(N_{t+s} − N_s ≤ n) = P(N_t − N_0 ≤ n) = P(N_t ≤ n).

From condition 4 we also get the mean and variance of the increments:

    E[N_t − N_s] = λ(t − s),   (4.5)
    Var(N_t − N_s) = λ(t − s).   (4.6)

4.8.3. Interarrival Times

Let T_1 denote the time of the first jump and, for n ≥ 2, let T_n denote the time elapsed between the (n − 1)th and the nth jumps. The events {T_1 > t} and {N_t = 0} are the same, since both describe the situation that no events occurred until after time t. Then

    P(T_1 > t) = P(N_t = 0) = e^{−λt},

and hence the distribution function of T_1 is

    F_{T_1}(t) = P(T_1 ≤ t) = 1 − e^{−λt}.

Differentiating yields the density function

    f_{T_1}(t) = d/dt F_{T_1}(t) = λ e^{−λt}.

It follows that T_1 has an exponential distribution, with E[T_1] = 1/λ.

In order to show that the random variables T_1 and T_2 are independent, it suffices to show that

    P(T_2 ≤ t) = P(T_2 ≤ t | T_1 = s),

i.e., the distribution function of T_2 is independent of the value of T_1. We note first that, from the independent increments property,

    P(0 jumps in (s, s + t], 1 jump in (0, s]) = P(N_{s+t} − N_s = 0, N_s − N_0 = 1)
        = P(N_{s+t} − N_s = 0) P(N_s − N_0 = 1)
        = P(0 jumps in (s, s + t]) P(1 jump in (0, s]).

Then the conditional distribution of T_2 is

    F(t | s) = P(T_2 ≤ t | T_1 = s) = 1 − P(T_2 > t | T_1 = s)
             = 1 − P(0 jumps in (s, s + t], 1 jump in (0, s]) / P(1 jump in (0, s])
             = 1 − P(N_{s+t} − N_s = 0) = 1 − e^{−λt},

which is independent of s. Then T_2 is independent of T_1 and exponentially distributed. A similar argument for any T_n leads to the desired result.

4.8.4. Waiting Times

The random variable S_n = T_1 + T_2 + ··· + T_n is called the waiting time until the nth jump. The event {S_n ≤ t} means that there are n jumps that occurred before or at time t, i.e. there are at least n events that happened up to time t; this event coincides with {N_t ≥ n}. Hence the distribution function of S_n is given by

    F_{S_n}(t) = P(S_n ≤ t) = P(N_t ≥ n) = Σ_{k ≥ n} e^{−λt} (λt)^k / k!.

Differentiating, we obtain the density function of the waiting time S_n:

    f_{S_n}(t) = d/dt F_{S_n}(t) = λ^n t^{n−1} e^{−λt} / (n − 1)!.

Writing this as

    f_{S_n}(t) = t^{n−1} e^{−t/(1/λ)} / ((1/λ)^n Γ(n)),

it turns out that S_n has a gamma distribution with parameters α = n and β = 1/λ. It follows that

    E[S_n] = n/λ,   Var(S_n) = n/λ².

The relation lim_{n→∞} E[S_n] = ∞ states that the expected waiting time for the nth jump is unbounded as n → ∞.

[Figure: The Poisson process N_t and the waiting times S_1, S_2, ..., S_n. The shaded rectangle has area n(S_{n+1} − t).]

4.8.5. The Integrated Poisson Process

The function u ↦ N_u is continuous with the exception of a countable set of jumps of size 1.
Such functions are Riemann integrable, so it makes sense to define the process

    U_t = ∫_0^t N_u du,

called the integrated Poisson process. The next result provides a relation between the process U_t and the partial sums of the waiting times S_k.

Proposition 4.8.1. The integrated Poisson process can be expressed as

    U_t = t N_t − Σ_{k=1}^{N_t} S_k.

Proof. Let N_t = n, so that S_n ≤ t < S_{n+1}. The integral U_t is equal to the area of the subgraph of N_u between 0 and t. Since N_u is equal to k between the waiting times S_k and S_{k+1}, this area can be expressed as a sum of rectangles:

    U_t = 1·(S_2 − S_1) + 2·(S_3 − S_2) + ··· + (n − 1)(S_n − S_{n−1}) + n(t − S_n),

where the last term represents the area of the final rectangle, of length t − S_n and height n. Using associativity, a computation yields

    1·(S_2 − S_1) + 2·(S_3 − S_2) + ··· + (n − 1)(S_n − S_{n−1}) = nS_n − (S_1 + S_2 + ··· + S_n).

Substituting into the aforementioned relation yields

    U_t = nS_n − (S_1 + S_2 + ··· + S_n) + n(t − S_n)
        = nt − (S_1 + S_2 + ··· + S_n)
        = t N_t − Σ_{k=1}^{N_t} S_k,

where we replaced n by N_t. □

The conditional distribution of the waiting times is provided by the following useful result.

Theorem 4.8.2. Given that N_t = n, the waiting times S_1, S_2, ..., S_n have the joint density function

    f(s_1, s_2, ..., s_n) = n!/tⁿ,   0 < s_1 ≤ s_2 ≤ ··· ≤ s_n < t.

4.9. Submartingales

Definition 4.9.1. A process X_t, t ≥ 0, adapted to a filtration F_t, is called a submartingale if it is integrable and

    E[X_{t+s} | F_t] ≥ X_t   for all s, t > 0

(future predictions exceed the present value).

Example 4.9.1. We shall prove that the process X_t = μt + σW_t, with μ, σ > 0, is a submartingale. The integrability follows from the inequality |X_t(ω)| ≤ μt + σ|W_t(ω)| and the integrability of W_t. The adaptability of X_t is obvious, and the submartingale property follows from the computation

    E[X_{t+s} | F_t] = E[μ(t + s) + σW_{t+s} | F_t]
                     = μ(t + s) + σE[W_{t+s} | F_t]
                     = μ(t + s) + σW_t
                     > μt + σW_t = X_t,

where we used that W_t is a martingale.

Example 4.9.2. We shall show that the square of the Brownian motion, W_t², is a submartingale. Using that W_t² − t is a martingale, we have

    E[W_{t+s}² | F_t] = E[W_{t+s}² − (t + s) | F_t] + t + s = W_t² − t + t + s = W_t² + s > W_t².

The following result supplies examples of submartingales starting from martingales or submartingales.

Proposition 4.9.1.
1. If X_t is a martingale and φ a convex function such that φ(X_t) is integrable, then the process Y_t = φ(X_t) is a submartingale.
2. If X_t is a submartingale and φ an increasing convex function such that φ(X_t) is integrable, then the process Y_t = φ(X_t) is a submartingale.

Proof. 1. Using Jensen's inequality for conditional expectations (Exercise 1.12.6), we have

    E[Y_{t+s} | F_t] = E[φ(X_{t+s}) | F_t] ≥ φ(E[X_{t+s} | F_t]) = φ(X_t) = Y_t. □

2. From the submartingale property and the monotonicity of φ we have

    φ(E[X_{t+s} | F_t]) ≥ φ(X_t) = Y_t.

Then apply a similar computation as in part 1. □

Corollary 4.9.3.
1. Let X_t be a martingale. Then X_t², |X_t|, e^{X_t} are submartingales.
2. Let μ, σ > 0. Then e^{μt + σW_t} is a submartingale.

Proof. 1. Follows from part 1 of Proposition 4.9.1.
2. Follows from part 2 of Proposition 4.9.1, applied to the submartingale X_t = μt + σW_t and the increasing convex function φ(x) = eˣ. □
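A submartingale has a nondecreasing mean, so Proposition 4.9.1 can be illustrated numerically: for the martingale W_t and the convex functions φ(x) = |x| and φ(x) = x², the means E[φ(W_t)] should increase with t. A sketch, assuming only NumPy (not part of the lecture):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 400_000
times = [0.5, 1.0, 2.0, 4.0]

means_abs, means_sq = [], []
for t in times:
    W = rng.normal(0.0, np.sqrt(t), n)   # W_t ~ N(0, t)
    means_abs.append(np.abs(W).mean())   # E|W_t| = sqrt(2t/π), increasing in t
    means_sq.append((W**2).mean())       # E[W_t²] = t, increasing in t

# Both sequences increase with t, as the submartingale property predicts.
print(means_abs)
print(means_sq)
```

The exact values E|W_t| = √(2t/π) and E[W_t²] = t follow from W_t ~ N(0, t), so the simulation doubles as a check on the moments used throughout this lecture.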
