
Solutions to the Exercises in Stochastic Analysis

Lecturer: Xue-Mei Li

1 Problem Sheet 1
In these solutions I avoid using conditional expectations. But do try to give alternative proofs once we have learnt conditional expectations.
Exercise 1 For $x, y \in \mathbb R^d$ define $p_t(x,y) = (2\pi t)^{-\frac d2} e^{-\frac{|x-y|^2}{2t}}$. Prove that $P_t(x, dy) = p_t(x,y)\,dy$ satisfies the Chapman-Kolmogorov equation: for $\Gamma$ a Borel subset of $\mathbb R^d$,
\[
P_{t+s}(x,\Gamma) = \int_{\mathbb R^d} P_s(y,\Gamma)\, P_t(x, dy).
\]

Solution: On the one hand,
\[
P_{t+s}(x,\Gamma) = \int_\Gamma (2\pi(t+s))^{-\frac d2}\, e^{-\frac{|x-z|^2}{2(t+s)}}\, dz.
\]

On the other hand,


\[
\int_{\mathbb R^d} P_s(y,\Gamma)\, P_t(x, dy)
= \int_{\mathbb R^d}\int_\Gamma (2\pi t)^{-\frac d2}(2\pi s)^{-\frac d2}\, e^{-\frac{|y-z|^2}{2s}}\, e^{-\frac{|y-x|^2}{2t}}\, dz\, dy.
\]

We now complete the square in y:

\[
-\frac{|y-z|^2}{2s} - \frac{|y-x|^2}{2t} = -\frac12\,\frac{t+s}{st}\,\Big|y - \frac{tz+sx}{t+s}\Big|^2 - \frac12\,\frac{|x-z|^2}{t+s}.
\]

Stochastic Analysis (2014). Lecturer: Xue-Mei Li, Support Class: Andris Gerasimovics 2

Next we change the variable $y - \frac{tz+sx}{t+s}$ to $\tilde y$; then
\[
\int_{\mathbb R^d} P_s(y,\Gamma)\, P_t(x, dy)
= (2\pi t)^{-\frac d2}(2\pi s)^{-\frac d2}\int_{\mathbb R^d}\int_\Gamma e^{-\frac12\frac{t+s}{st}|\tilde y|^2}\, e^{-\frac12\frac{|x-z|^2}{t+s}}\, dz\, d\tilde y
\]
\[
= (2\pi t)^{-\frac d2}(2\pi s)^{-\frac d2}\int_{\mathbb R^d} e^{-\frac12\frac{t+s}{st}|\tilde y|^2}\, d\tilde y \int_\Gamma e^{-\frac{|x-z|^2}{2(t+s)}}\, dz
= \int_\Gamma (2\pi(t+s))^{-\frac d2}\, e^{-\frac{|x-z|^2}{2(t+s)}}\, dz,
\]
since the Gaussian integral equals $\big(2\pi\frac{st}{t+s}\big)^{\frac d2}$. This coincides with $P_{t+s}(x,\Gamma)$. □
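As a numerical sanity check of this identity (not part of the original solution), one can verify Chapman-Kolmogorov in d = 1 by approximating the convolution integral on a grid; the grid width and tolerance below are ad hoc choices, assuming NumPy is available.

```python
import numpy as np

def p(t, x, y):
    # Heat kernel p_t(x, y) on the real line
    return (2 * np.pi * t) ** -0.5 * np.exp(-((x - y) ** 2) / (2 * t))

t, s, x, z = 0.7, 0.4, 0.3, -1.1
y = np.linspace(-20.0, 20.0, 200001)          # wide, fine grid
dy = y[1] - y[0]
conv = np.sum(p(t, x, y) * p(s, y, z)) * dy   # approximates ∫ p_t(x,y) p_s(y,z) dy
direct = p(t + s, x, z)
print(conv, direct)
```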

Exercise 2 Let (Xt , t ≥ 0) be a Markov process with X0 = 0 and transition func-


tion pt (x, y)dy where pt (x, y) is the heat kernel. Prove the following statements.

(1) For any s < t, Xt − Xs ∼ Pt−s (0, dy);

(2) Prove that (Xt ) has independent increments.

(3) For every number p > 0 there exists a constant c(p) such that
\[
E|X_t - X_s|^p = c(p)|t-s|^{\frac p2}.
\]

(4) State Kolmogorov's continuity theorem and conclude that, for almost all $\omega$, $X_t(\omega)$ is locally Hölder continuous with exponent $\alpha$ for any number $\alpha < 1/2$.

(5) Prove that this is a Brownian motion on Rd .

Solution: Let f be a bounded measurable function.


(1) Since $(X_s, X_t) \sim P_s(0,dx)P_{t-s}(x,dy)$,
\[
Ef(X_t - X_s) = \int_{\mathbb R^d}\int_{\mathbb R^d} f(y-x)\, p_s(0,x)\, p_{t-s}(x,y)\, dx\, dy
= \int_{\mathbb R^d}\int_{\mathbb R^d} f(z)\, p_s(0,x)\, p_{t-s}(0,z)\, dx\, dz
= \int_{\mathbb R^d} f(z)\, p_{t-s}(0,z)\, dz \int_{\mathbb R^d} p_s(0,x)\, dx = \int_{\mathbb R^d} f(z)\, p_{t-s}(0,z)\, dz,
\]
using the change of variable $z = y - x$ and $p_{t-s}(x,y) = p_{t-s}(0, y-x)$. Hence $X_t - X_s \sim P_{t-s}(0, dz)$.



(2) Let us fix $0 = t_0 < t_1 < t_2 < \dots < t_n$ and Borel sets $A_i \in \mathcal B(\mathbb R^d)$, $i = 1, \dots, n$. Let $f_i(x) = 1_{x\in A_i}$. Then we obtain
\[
P(X_{t_1}\in A_1,\, X_{t_2}-X_{t_1}\in A_2,\, \dots,\, X_{t_n}-X_{t_{n-1}}\in A_n)
= \int_{\mathbb R^d}\!\!\cdots\!\int_{\mathbb R^d} f_1(x_1) f_2(x_2-x_1)\cdots f_n(x_n-x_{n-1})\, p_{t_1}(0,x_1)\, p_{t_2-t_1}(x_1,x_2)\cdots p_{t_n-t_{n-1}}(x_{n-1},x_n)\, dx_n\cdots dx_1,
\]
where in the last line we have used the formula for the finite-dimensional distributions of the Markov process (cf. Exercise 4 (ii)). Introducing new variables $y_1 = x_1$, $y_2 = x_2 - x_1$, \dots, $y_n = x_n - x_{n-1}$, we obtain
\[
P(X_{t_1}\in A_1, \dots, X_{t_n}-X_{t_{n-1}}\in A_n)
= \int_{\mathbb R^d}\!\!\cdots\!\int_{\mathbb R^d} f_1(y_1)\cdots f_n(y_n)\, p_{t_1}(0,y_1)\, p_{t_2-t_1}(0,y_2)\cdots p_{t_n-t_{n-1}}(0,y_n)\, dy_n\cdots dy_1
= \prod_{i=1}^n \int_{\mathbb R^d} f_i(y_i)\, p_{t_i-t_{i-1}}(0,y_i)\, dy_i.
\]
This means that $\{X_{t_i} - X_{t_{i-1}},\ i = 1, \dots, n\}$ are independent random variables.


(3)
\[
E|X_t - X_s|^p = \frac{1}{(2\pi(t-s))^{\frac d2}}\int_{\mathbb R^d} |z|^p\, e^{-\frac{|z|^2}{2(t-s)}}\, dz
= \frac{1}{(2\pi(t-s))^{\frac d2}}\,(t-s)^{\frac p2}\int_{\mathbb R^d} |y|^p\, e^{-\frac{|y|^2}{2}}\, (t-s)^{\frac d2}\, dy
= (t-s)^{\frac p2}\,\frac{1}{(2\pi)^{\frac d2}}\int_{\mathbb R^d} |y|^p\, e^{-\frac{|y|^2}{2}}\, dy,
\]
by the change of variable $z = \sqrt{t-s}\, y$. The integral is finite. Let
\[
c(p) = \int_{\mathbb R^d} |y|^p\, p_1(0,y)\, dy,
\]
which is the p-th moment of an $N(0, I_{d\times d})$ distributed variable.


(4) and (5) are straightforward applications of Kolmogorov's theorem. □
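The scaling identity in part (3) can be checked numerically in d = 1. The constant c(p) for a standard Gaussian is taken below from the standard closed form $2^{p/2}\Gamma((p+1)/2)/\sqrt\pi$ for Gaussian absolute moments, which is a known fact not derived in these notes; the quadrature grid is an ad hoc choice.

```python
import numpy as np
from math import gamma, sqrt, pi

def abs_moment(p, var):
    # E|Z|^p for Z ~ N(0, var), by quadrature against the density
    z = np.linspace(-40.0, 40.0, 400001)
    dens = (2 * np.pi * var) ** -0.5 * np.exp(-(z ** 2) / (2 * var))
    return np.sum(np.abs(z) ** p * dens) * (z[1] - z[0])

t, s, p_exp = 2.0, 0.5, 3.0
c_p = 2 ** (p_exp / 2) * gamma((p_exp + 1) / 2) / sqrt(pi)  # E|N(0,1)|^p
lhs = abs_moment(p_exp, t - s)
rhs = c_p * (t - s) ** (p_exp / 2)
print(lhs, rhs)
```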

Exercise 3 If (Bt ) is a Brownian motion prove that (Bt ) is a Markov process with
transition function pt (x, y)dy.

Solution: Let us denote for simplicity fi (x) = 1x∈Ai . Furthermore, we define the
random variables Yi = Xti − Xti−1 , for i = 1, . . . , n, where we have postulated
t0 = 0. From the properties of the Brownian motion we obtain that the random
variables {Yi } are independent and moreover Yi ∼ N (0, ti − ti−1 ). Thus, we have
\[
P[X_{t_1}\in A_1, \dots, X_{t_n}\in A_n] = E\prod_{i=1}^n f_i(X_{t_i})
= E[f_1(Y_1) f_2(Y_2+Y_1)\cdots f_n(Y_n+\dots+Y_1)]
= \int_{\mathbb R}\!\!\cdots\!\int_{\mathbb R} f_1(y_1) f_2(y_2+y_1)\cdots f_n(y_n+\dots+y_1)\, p_{t_1}(0,y_1)\, p_{t_2-t_1}(0,y_2)\cdots p_{t_n-t_{n-1}}(0,y_n)\, dy_n\cdots dy_1.
\]

Now we introduce new variables: x1 = y1 , x2 = y2 + y1 , . . ., xn = yn + . . . + y1 ,


and obtain that the last integral equals
\[
\int_{\mathbb R}\!\!\cdots\!\int_{\mathbb R} f_1(x_1) f_2(x_2)\cdots f_n(x_n)\, p_{t_1}(0,x_1)\, p_{t_2-t_1}(0,x_2-x_1)\cdots p_{t_n-t_{n-1}}(0,x_n-x_{n-1})\, dx_n\cdots dx_1.
\]

Noticing that $p_t(0, y-x) = p_t(x,y)$ and recalling the definition of the functions $f_i$, we see that the finite dimensional distributions agree with those of the Markov process determined by the heat kernels. The two processes must therefore agree.


Exercise 4 Let (Xt , t ≥ 0) be a continuous real-valued stochastic process with


X0 = 0 and let pt (x, y) be the heat kernel on R. Prove that the following state-
ments are equivalent:

(i) (Xt , t ≥ 0) is a one dimensional Brownian motion.

(ii) For any number n ∈ N, any sets Ai ∈ B(R), i = 1, . . . , n, and any 0 <
t1 < t2 < . . . < tn ,

\[
P[X_{t_1}\in A_1, \dots, X_{t_n}\in A_n]
= \int_{A_1}\!\!\cdots\!\int_{A_n} p_{t_1}(0,y_1)\, p_{t_2-t_1}(y_1,y_2)\cdots p_{t_n-t_{n-1}}(y_{n-1},y_n)\, dy_n\cdots dy_1.
\]

Solution: This follows from the previous exercises. 



Exercise 5 A zero mean Gaussian process $B_t^H$ is a fractional Brownian motion of Hurst parameter H, $H \in (0,1)$, if its covariance is
\[
E(B_t^H B_s^H) = \frac12\big(t^{2H} + s^{2H} - |t-s|^{2H}\big).
\]
Then $E|B_t^H - B_s^H|^p = C|t-s|^{pH}$. If H = 1/2 this is Brownian motion (otherwise this process is not even a semi-martingale). Show that $(B_t^H)$ has a continuous modification whose sample paths are Hölder continuous of order $\alpha < H$.
Solution: Since E|BtH − BsH |p = C|t − s|pH = C|t − s|1+(pH−1) , we can apply
the Kolmogorov continuity criterion to obtain that BtH has a modification whose
sample paths are Hölder continuous of order α < (pH − 1)/p. This means that for
any α < H we can take p large enough to have α < (pH − 1)/p. This finishes the
proof. 
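The moment identity used here follows from the covariance: the increment variance is $E|B_t^H - B_s^H|^2 = |t-s|^{2H}$, and for a Gaussian variable every p-th absolute moment is a constant multiple of the variance to the power p/2. A minimal check of the variance identity, computed directly from the stated covariance:

```python
def fbm_cov(t, s, H):
    # Covariance E[B^H_t B^H_s] of fractional Brownian motion
    return 0.5 * (t ** (2 * H) + s ** (2 * H) - abs(t - s) ** (2 * H))

H, t, s = 0.3, 1.7, 0.6
incr_var = fbm_cov(t, t, H) - 2 * fbm_cov(t, s, H) + fbm_cov(s, s, H)
print(incr_var, abs(t - s) ** (2 * H))
```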

Exercise 6 Let $(B_t)$ be a Brownian motion on $\mathbb R^d$. Let T be a positive number. For $t \in [0,T]$ set $Y_t = B_t - \frac tT B_T$. Compute the probability distribution of $Y_t$.
Solution:
\[
Ef\Big(B_t - \frac tT B_T\Big) = \int_{\mathbb R^d}\int_{\mathbb R^d} f\Big(x - \frac tT y\Big)\,(2\pi t)^{-\frac d2} e^{-\frac{|x|^2}{2t}}\,(2\pi(T-t))^{-\frac d2} e^{-\frac{|y-x|^2}{2(T-t)}}\, dx\, dy.
\]

We observe that
\[
|x|^2 = \Big|x - \frac tT y\Big|^2 + \frac{2t}{T}\langle x, y\rangle - \frac{t^2}{T^2}|y|^2
\]
and that
\[
|x - y|^2 = \Big|x - \frac tT y\Big|^2 - \frac{2(T-t)}{T}\langle x, y\rangle + \frac{2t(T-t)}{T^2}|y|^2 + \frac{(T-t)^2}{T^2}|y|^2.
\]

Thus,
\[
\frac{|x|^2}{t} + \frac{|y-x|^2}{T-t} = \frac1t\Big|x - \frac tT y\Big|^2 + \frac{1}{T-t}\Big|x - \frac tT y\Big|^2 + \frac1T |y|^2.
\]

Finally
\[
Ef\Big(B_t - \frac tT B_T\Big)
= (2\pi t)^{-\frac d2}(2\pi(T-t))^{-\frac d2}\int_{\mathbb R^d}\int_{\mathbb R^d} f\Big(x - \frac tT y\Big)\, e^{-\frac{1}{2t}|x-\frac tT y|^2}\, e^{-\frac{1}{2(T-t)}|x-\frac tT y|^2}\, e^{-\frac{|y|^2}{2T}}\, dx\, dy
\]
\[
= (2\pi t)^{-\frac d2}(2\pi(T-t))^{-\frac d2}\int_{\mathbb R^d} f(z)\, e^{-\frac{|z|^2}{2t}}\, e^{-\frac{|z|^2}{2(T-t)}}\, dz \int_{\mathbb R^d} e^{-\frac{|y|^2}{2T}}\, dy
= \int_{\mathbb R^d} f(z)\, p_t(0,z)\, p_{T-t}(z,0)\, dz \cdot (2\pi T)^{\frac d2},
\]
with the change of variable $z = x - \frac tT y$. Finally we see
\[
B_t - \frac tT B_T \sim \frac{p_t(0,z)\, p_{T-t}(z,0)}{p_T(0,0)}\, dz.
\]
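In d = 1 this density can be compared pointwise with the $N(0, t(T-t)/T)$ density, which is what the exponent computation predicts; a quick check, with an ad hoc grid:

```python
import numpy as np

def p(t, x, y):
    # Heat kernel p_t(x, y) on the real line
    return (2 * np.pi * t) ** -0.5 * np.exp(-((x - y) ** 2) / (2 * t))

T, t = 3.0, 1.2
z = np.linspace(-10.0, 10.0, 2001)
bridge = p(t, 0.0, z) * p(T - t, z, 0.0) / p(T, 0.0, 0.0)
var = t * (T - t) / T                       # predicted variance of B_t - (t/T) B_T
gauss = (2 * np.pi * var) ** -0.5 * np.exp(-(z ** 2) / (2 * var))
err = float(np.max(np.abs(bridge - gauss)))
print(err)
```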


2 Brownian motion, conditional expectation, and uniform integrability
Exercise 7 Let (Bt ) be a standard Brownian motion. Prove that

(a) (i) for any t > 0, E[Bt ] = 0 and E[Bt2 ] = t;


(ii) for any s, t ≥ 0, E[Bs Bt ] = s ∧ t, where s ∧ t = min(s, t).

(b) (scaling invariance) For any a > 0, $\frac{1}{\sqrt a} B_{at}$ is a Brownian motion;

(c) (Translation Invariance) For any t0 ≥ 0, Bt0 +t − Bt0 is a standard Brownian


motion;

(d) If $X_t = B_t - tB_1$, $0 \le t \le 1$, then $E(X_sX_t) = s(1-t)$ for $s \le t$. Compute the probability distribution of $X_t$.
Hint: break $X_t$ down as the sum of two independent Gaussian random variables, then compute its characteristic function.

Solution: (a) (i) Since the distribution of Bt is N (0, t), we have E[Bt ] = 0 and
E[Bt2 ] = t.
(a) (ii) We fix any t ≥ s ≥ 0. Then we have

E[Bs Bt ] = E[(Bt − Bs )Bs ] + E[Bs2 ] .

Since the Brownian motion has independent increments, the random variables Bt −
Bs and Bs are independent and we have

E[(Bt − Bs )Bs ] = E[Bt − Bs ]E[Bs ] = 0 .

Furthermore, from (i) we know that E[Bs2 ] = s. Hence, we conclude

E[Bs Bt ] = s ,

which is the required identity.


(b) Let us denote $W_t = \frac{1}{\sqrt a} B_{at}$. Then W has continuous sample paths and independent increments, which follows from the same properties of B. Furthermore, for any $t > s \ge 0$, we have
\[
W_t - W_s = \frac{1}{\sqrt a}\big(B_{at} - B_{as}\big) \sim N\Big(0, \frac{a(t-s)}{a}\Big) = N(0, t-s),
\]

which finishes the proof.



(c) If we denote Wt = Bt0 +t − Bt0 , then W has continuous sample paths and
independent increments, which follows from the same properties of B. Moreover,
for any t > s ≥ 0, we have

Wt − Ws = Bt0 +t − Bt0 +s ∼ N (0, (t0 + t) − (t0 + s)) = N (0, t − s) ,

which finishes the proof.


(d) For t = 0 or t = 1 we have Xt = 0. For 1 > t ≥ s > 0 we have

\[
E[X_tX_s] = E[(B_t - tB_1)(B_s - sB_1)]
= E[B_tB_s] - sE[B_tB_1] - tE[B_1B_s] + stE[B_1^2]
= s - st - st + st = s(1-t).
\]

Let us take t ∈ (0, 1). Then we have Xt = (1 − t)Bt − t(B1 − Bt ). Since


the random variables Bt and B1 − Bt are independent, the distribution of Xt is
N (0, (1 − t)2 t + t2 (1 − t)) = N (0, t(1 − t)). 
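The covariance $s(1-t)$ can also be checked by simulation, building $(B_s, B_t, B_1)$ from independent increments; this is Monte Carlo, so the tolerance is statistical rather than exact.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400_000
s, t = 0.3, 0.8
d1 = rng.normal(0.0, np.sqrt(s), n)        # B_s
d2 = rng.normal(0.0, np.sqrt(t - s), n)    # B_t - B_s
d3 = rng.normal(0.0, np.sqrt(1.0 - t), n)  # B_1 - B_t
Bs = d1
Bt = d1 + d2
B1 = Bt + d3
Xs, Xt = Bs - s * B1, Bt - t * B1          # the bridge X_t = B_t - t B_1
est = float(np.mean(Xs * Xt))
print(est, s * (1 - t))
```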

Exercise 8 Let $X \in L^1(\Omega, \mathcal F, P)$. Prove that the family of random variables $\{E\{X|\mathcal G\} : \mathcal G \subset \mathcal F\}$ is $L^1$ bounded, i.e. $\sup_{\mathcal G\subset\mathcal F} E\big(|E\{X|\mathcal G\}|\big) < \infty$.
Solution: Let us take any sub-$\sigma$-algebra $\mathcal G \subset \mathcal F$. Then, using Jensen's inequality, we have

E (|E{X|G}|) ≤ E (E{|X||G}) = E|X| < ∞ ,

which proves the claim. 

Exercise 9 Let X ∈ L1 (Ω; R). Prove that the family of functions

{E{X|G} : G is a sub σ-algebra of F}

is uniformly integrable.
Solution: We first note that if a family of measurable sets $A_C$ satisfies $\lim_{C\to\infty} P[A_C] = 0$, then $\lim_{C\to\infty} E[1_{A_C}|X|] = 0$ (since $X \in L^1$; dominated convergence).
Let us take any G ⊂ F and consider the family of events

A(C, G) = {ω : |E[X|G](ω)| ≥ C} .

Applying the Markov’s and Jensen’s inequalities we obtain

P(A(C, G)) ≤ C −1 E (|E[X|G]|) ≤ C −1 E[E[|X||G]] = C −1 E|X| → 0 ,

as C → ∞ ( since E|X| < ∞).



For any $\varepsilon > 0$, there exists a $\delta > 0$ such that if $P[A] < \delta$, then $E(1_A|X|) < \varepsilon$. For this $\delta$, take $C > \frac{E|X|}{\delta}$; then $P[A(C,\mathcal G)] < \delta$ for any $\mathcal G \subset \mathcal F$, which implies
\[
\sup_{\mathcal G\subset\mathcal F} E[1_{A(C,\mathcal G)}|X|] < \varepsilon.
\]

Finally, we conclude

\[
\sup_{\mathcal G\subset\mathcal F} E\big[1_{A(C,\mathcal G)}\,|E[X|\mathcal G]|\big] \le \sup_{\mathcal G\subset\mathcal F} E\big[1_{A(C,\mathcal G)}\,|X|\big] < \varepsilon,
\]

which proves the claim. 

Exercise 10 Let $(\mathcal G_t, t \ge 0)$, $(\mathcal F_t, t \ge 0)$ be filtrations with the property that $\mathcal G_t \subset \mathcal F_t$ for each $t \ge 0$. Suppose that $(X_t)$ is adapted to $(\mathcal G_t)$. If $(X_t)$ is an $(\mathcal F_t)$-martingale, prove that $(X_t)$ is a $(\mathcal G_t)$-martingale.
Solution: The fact that (Xt ) is (Gt )-adapted, follows from the inclusion Gt ⊂ Ft
for each t ≥ 0. Furthermore, for any t > s ≥ 0, using the tower property of the
conditional expectation, we obtain

E[Xt |Gs ] = E[E[Xt |Fs ]|Gs ] = E[Xs |Gs ] = Xs .

This shows that $(X_t)$ is a $(\mathcal G_t)$-martingale. □

Exercise 11 (Elementary processes) Let $0 = t_0 < \dots < t_n < t_{n+1}$, let $H_i$ be bounded $\mathcal F_{t_i}$-measurable functions, and
\[
H_t(\omega) = H_0(\omega)1_{\{0\}}(t) + \sum_{i=1}^n H_i(\omega)1_{(t_i, t_{i+1}]}(t). \tag{1}
\]

Prove that $H: \mathbb R_+\times\Omega \to \mathbb R$ is Borel measurable. Define the stochastic integral
\[
I_t \equiv \int_0^t H_s\, dM_s \equiv \sum_{i=1}^n H_i\big(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}\big)
\]
and prove that (here $(B_t)$ is a Brownian motion)
\[
E\Big(\int_0^t H_s\, dB_s\Big) = 0, \qquad E\Big(\int_0^t H_s\, dB_s\Big)^2 = E\Big(\int_0^t (H_s)^2\, ds\Big). \tag{2}
\]

Solution: First, we will prove that the function (1) is Borel-measurable. To this
end, we take any Borel set A ∈ B(R) and we have to show that the set {(t, ω) :
H(t, ω) ∈ A} is measurable in the product space (R+ , B(R+ )) × (Ω, F). We can
rewrite this set in the following way:
\[
\{(t,\omega): H(t,\omega)\in A\} = \big(\{0\}\times\{\omega: H_0(\omega)\in A\}\big) \cup \bigcup_{i=1}^n \big((t_i, t_{i+1}]\times\{\omega: H_i(\omega)\in A\}\big).
\]
Since the sets {0} and (ti , ti+1 ], i = 1, . . . , n, belong to B(R+ ), and {ω : Hi (ω) ∈
A} ∈ Fti ⊂ F, the claim now follows from the fact that the product of two
measurable sets is measurable in the product space.
Next, we will show the identities (2). Let us denote $I_t = \int_0^t H_s\, dB_s$. Then we have
\[
E[I_t] = \sum_{i=1}^n E\big[H_i(B_{t_{i+1}\wedge t} - B_{t_i\wedge t})\big] = \sum_{i=1}^n E[H_i]\, E\big[B_{t_{i+1}\wedge t} - B_{t_i\wedge t}\big] = 0,
\]
where in the second equality we have used the independence of $B_{t_{i+1}\wedge t} - B_{t_i\wedge t}$ from $\mathcal F_{t_i}$, which follows from the properties of the Brownian motion, together with the fact that $B_{t_{i+1}\wedge t} - B_{t_i\wedge t} = 0$ if $t_i \ge t$. In the last equality we have used that $E\big[B_{t_{i+1}\wedge t} - B_{t_i\wedge t}\big] = 0$.
For the variance of the stochastic integral we have
\[
E[I_t^2] = \sum_{i=1}^n E\big[H_i^2 (B_{t_{i+1}\wedge t} - B_{t_i\wedge t})^2\big]
+ \sum_{i\ne j} E\big[H_i H_j (B_{t_{i+1}\wedge t} - B_{t_i\wedge t})(B_{t_{j+1}\wedge t} - B_{t_j\wedge t})\big].
\]
In the same way as before, using the independence of the increments of the Brownian motion, we obtain that the second sum is 0. Thus, we have
\[
E[I_t^2] = \sum_{i=1}^n E\big[H_i^2 (B_{t_{i+1}\wedge t} - B_{t_i\wedge t})^2\big]
= \sum_{i=1}^n E\big[H_i^2\big]\, E\big[(B_{t_{i+1}\wedge t} - B_{t_i\wedge t})^2\big]
= \sum_{i=1}^n E\big[H_i^2\big]\,(t_{i+1}\wedge t - t_i\wedge t) = E\Big(\int_0^t (H_s)^2\, ds\Big),
\]
where in the second equality we have used the fact that $H_i^2$ and $(B_{t_{i+1}\wedge t} - B_{t_i\wedge t})^2$ are independent. □
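The two identities in (2) can be illustrated by Monte Carlo with a simple elementary integrand, here $H_i = \mathrm{sign}(B_{t_i})$, which is bounded and $\mathcal F_{t_i}$-measurable; the choice of integrand and the tolerances are ad hoc.

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps, t = 200_000, 8, 1.0
dt = t / n_steps
dB = rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps))
B = np.cumsum(dB, axis=1)
# H = sign(B_{t_i}) on (t_i, t_{i+1}]; on the first interval take H = 1 (F_0-measurable)
H = np.concatenate([np.ones((n_paths, 1)), np.sign(B[:, :-1])], axis=1)
I = np.sum(H * dB, axis=1)                 # elementary stochastic integral I_t
mean_I = float(np.mean(I))
mean_I2 = float(np.mean(I ** 2))           # should be E ∫ H^2 ds = t, since H^2 = 1
print(mean_I, mean_I2)
```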

3 Martingales and Conditional Expectations


A given but otherwise unspecified filtration $\{\mathcal F_t\}$ is used unless otherwise stated.

Exercise 12 Let µ be a probability measure. If {Xn } is u.i. and Xn → X in


measure, prove that Xn is L1 bounded and X ∈ L1 .
Solution: By the u.i., there exists a number C such that $\int_{\{|X_n|\ge C\}} |X_n|\, d\mu \le 1$ for all n. Then
\[
\int |X_n|\, d\mu \le \int_{\{|X_n|\ge C\}} |X_n|\, d\mu + \int_{\{|X_n|<C\}} |X_n|\, d\mu \le 1 + C.
\]
Taking an almost surely convergent sub-sequence if necessary, we may and will assume that $X_n \to X$ almost surely. Then, by Fatou's lemma,
\[
E[|X|] = E\big[\liminf_{n\to\infty} |X_n|\big] \le \liminf_{n\to\infty} E|X_n| \le \sup_n E|X_n|,
\]
which is finite. □

Exercise 13 If X is an L1 function, prove that Xt := E{X|Ft } is an Ft -martingale.


Solution: By the definition of conditional expectations, each $X_t \in L^1$. By definition, $X_t$ is $\mathcal F_t$-measurable for each t. By the tower property, if $s < t$, $E(X_t|\mathcal F_s) = E(E(X|\mathcal F_t)|\mathcal F_s) = E(X|\mathcal F_s) = X_s$. □

Exercise 14 (Discrete Martingales) If $(M_n)$ is a martingale, $n \in \mathbb N$, its quadratic variation is the unique process (discrete time Doob-Meyer decomposition theorem) $\langle M\rangle_n$ such that $\langle M\rangle_0 = 0$ and $M_n^2 - \langle M\rangle_n$ is a martingale. Let $(X_i)$ be a family of i.i.d. random variables with $EX_i = 0$ and $E(X_i^2) = 1$. Then $S_n = \sum_{i=1}^n X_i$ is a martingale w.r.t. its natural filtration. This is the random walk.
Prove that $S_n$ is a martingale and that its quadratic variation is $\langle S\rangle_n = n$.
Solution: Let $\mathcal F_n$ denote the natural filtration of $\{X_n\}$. Then
\[
E\{S_n|\mathcal F_{n-1}\} = E\{S_{n-1}|\mathcal F_{n-1}\} + E\{X_n|\mathcal F_{n-1}\} = S_{n-1} + 0.
\]
We used the fact that $X_n$ is independent of $\mathcal F_{n-1}$. Similarly,
\[
E\{S_n^2 - n|\mathcal F_{n-1}\}
= E\{S_{n-1}^2|\mathcal F_{n-1}\} + 2E\{S_{n-1}X_n|\mathcal F_{n-1}\} + E\{X_n^2|\mathcal F_{n-1}\} - n
= S_{n-1}^2 + 2S_{n-1}E\{X_n|\mathcal F_{n-1}\} + E[X_n^2] - n
= S_{n-1}^2 - (n-1).
\]
So $S_n^2 - n$ is an $\mathcal F_n$-martingale, and hence $\langle S\rangle_n = n$. □
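A direct simulation of the ±1 random walk illustrates both conclusions, $ES_n = 0$ and $ES_n^2 = n$ (Monte Carlo, so the tolerances are statistical):

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n = 100_000, 50
X = rng.choice([-1.0, 1.0], size=(n_paths, n))  # i.i.d. steps, mean 0, variance 1
S_n = np.sum(X, axis=1)                         # S_n for each simulated path
mean_S = float(np.mean(S_n))
mean_S2 = float(np.mean(S_n ** 2))
print(mean_S, mean_S2)
```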

Exercise 15 Let φ : Rd → R be a convex function. Show that


(a) If (Xt ) is a sub-martingale and φ is increasing s.t. φ(Xt ) ∈ L1 , then (φ(Xt ))
is a sub-martingale.
(b) If (Xt ) is a martingale and φ(Xt ) ∈ L1 , then φ(Xt ) is a sub-martingale.
(c) If (Xt ) is an Lp integrable martingale, for a number p ≥ 1, prove that (|Xt |p )
is a sub-martingale.
(d) If (Xt ) is a real valued martingale, prove that (Xt ∨ 0) is a sub-martingale.
Solution: Since φ is convex, it is continuous and hence it is measurable. This
means that in what follows the process φ(Xt ) is adapted.
(a) For t > s, using Jensen's inequality we obtain
E[φ(Xt )|Fs ] ≥ φ(E[Xt |Fs ]) ≥ φ(Xs ) ,
where in the last inequality we have used E[Xt |Fs ] ≥ Xs and the fact that φ is
increasing.
(b) The claim can be shown in the same way as (a), but now
φ(E[Xt |Fs ]) = φ(Xs ) .
(c), (d) The claims follow from (b) and the fact that the maps x 7→ |x|p and
x 7→ x ∨ 0 are convex. 

Exercise 16 (Limit of Martingales) Let $\{(M_n(t), t \in [0,1]), n \in \mathbb N\}$ be a family of martingales. Suppose that for each t, $\lim_{n\to\infty} M_n(t) = M(t)$ almost surely and that $\{M_n(t), n \in \mathbb N\}$ is uniformly integrable for each t. Prove that $(M(t))$ is a martingale.
Solution: For any $t \in [0,1]$ and any $A \in \mathcal F$ fixed, the family $\{M_n(t)1_A, n \in \mathbb N\}$ is uniformly integrable. Indeed,
\[
\lim_{C\to\infty}\sup_n \int_{\{|M_n(t)|1_A \ge C\}} |M_n(t)|1_A\, dP \le \lim_{C\to\infty}\sup_n \int_{\{|M_n(t)|\ge C\}} |M_n(t)|\, dP = 0,
\]
where the inequality follows from $|M_n(t)|1_A \le |M_n(t)|$ and the last equality follows from the fact that $\{M_n(t), n \in \mathbb N\}$ is uniformly integrable.
Since $\lim_{n\to\infty} M_n(t)1_A = M(t)1_A$ almost surely, the uniform integrability implies convergence in $L^1$. Taking $A = \Omega$, we see that $M(t)$ is integrable. For any $0 \le s < t \le 1$ and any $A \in \mathcal F_s$ we have
\[
E[M(t)1_A] = \lim_{n\to\infty} E[M_n(t)1_A] = \lim_{n\to\infty} E[M_n(s)1_A] = E[M(s)1_A],
\]

where the second equality holds, because Mn is a martingale. This implies that

E[M (t)|Fs ] = M (s) ,

which finishes the proof. 

Exercise 17 Let a > 0 be a real number. For $0 = t_1 < \dots < t_n < t_{n+1} < \dots$ and $i = 1, \dots, n, \dots$ let $H_i$ be bounded $\mathcal F_{t_i}$-measurable functions and let $H_0$ be a bounded $\mathcal F_0$-measurable function. Define
\[
H_t(\omega) = H_0(\omega)1_{\{0\}}(t) + \sum_{i=1}^\infty H_i(\omega)1_{(t_i, t_{i+1}]}(t).
\]

Let $(M_t, t \le a)$ be a martingale with $M_0 = 0$, and $(H_t)$ an elementary process. Define the elementary integral
\[
I_t \equiv \int_0^t H_s\, dM_s \equiv \sum_{i=1}^\infty H_i\big(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}\big).
\]
Prove that $(I_t, 0 \le t \le a)$ is an $\mathcal F_t$-martingale.


Solution: Note: the sum is always a finite sum. Please refer also to Exercise 11.
It is clear that $M_{t_{i+1}\wedge t} - M_{t_i\wedge t} \in \mathcal F_t$. There exists an n s.t. $t < t_{n+1}$. Then $M_{t_{i+1}\wedge t} - M_{t_i\wedge t} = 0$ for all $i \ge n+1$, so
\[
\int_0^t H_s\, dM_s \equiv \sum_{i=1}^n H_i\big(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}\big),
\]
and for $1 \le i \le n$, $H_i \in \mathcal F_{t_i} \subset \mathcal F_t$. In particular, $I_t \in \mathcal F_t$ for each t and the process $(I_t)$ is $(\mathcal F_t)$-adapted.
The finitely many bounded random variables $\{|H_i|, i = 1, \dots, n\}$ are bounded by a common constant C. Furthermore, since for each s, $M_s$ is integrable,
\[
E\Big|\int_0^t H_s\, dM_s\Big| \le \sum_{i=1}^n E\big[|H_i|\,|M_{t_{i+1}\wedge t} - M_{t_i\wedge t}|\big] \le C\sum_{i=1}^n \big(E|M_{t_{i+1}\wedge t}| + E|M_{t_i\wedge t}|\big) < \infty,
\]
proving that for each t, $I_t$ is integrable.


Finally, we prove the martingale property. First note that if $(M_s, s \ge 0)$ is a martingale then
\[
E\{M_t \mid \mathcal F_s\} = M_{s\wedge t}, \qquad \forall s, t \ge 0. \tag{3}
\]

Take $t \ge s \ge 0$ and assume that $s \in [t_k, t_{k+1})$ for some $k \in \{0, \dots, n\}$.
Explanation: we only need to consider two cases: (1) $i \le k$, in which case $H_i \in \mathcal F_s$ and we can take $H_i$ out of the conditional expectation; and (2) $i \ge k+1$, in which case $s < t_i$ and we may use the tower property to condition in addition w.r.t. $\mathcal F_{t_i}$ and take $H_i$ out.
Then we have
\[
E[I_t|\mathcal F_s] = \sum_{i=1}^n E\big[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t})\big|\mathcal F_s\big]
= \sum_{i=1}^k E\big[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t})\big|\mathcal F_s\big] + \sum_{i=k+1}^n E\big[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t})\big|\mathcal F_s\big].
\]
For $i \le k$ we have $t_i \le s$, so the random variable $H_i \in \mathcal F_s$ and, by (3),
\[
\sum_{i=1}^k E\big[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t})\big|\mathcal F_s\big]
= \sum_{i=1}^k H_i\, E\big[M_{t_{i+1}\wedge t} - M_{t_i\wedge t}\big|\mathcal F_s\big]
= \sum_{i=1}^k H_i\big(M_{t_{i+1}\wedge s} - M_{t_i\wedge s}\big)
= \sum_{i=1}^\infty H_i\big(M_{t_{i+1}\wedge s} - M_{t_i\wedge s}\big) = I_s,
\]
the last equality holding because $M_{t_{i+1}\wedge s} - M_{t_i\wedge s} = 0$ for $i \ge k+1$.
If $i \ge k+1$, then $s \le t_i$ and we may use the tower property of the conditional expectation:
\[
\sum_{i=k+1}^n E\big[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t})\big|\mathcal F_s\big]
= \sum_{i=k+1}^n E\Big[E\big[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t})\big|\mathcal F_{t_i}\big]\,\Big|\,\mathcal F_s\Big]
= \sum_{i=k+1}^n E\Big[H_i\, E\big[M_{t_{i+1}\wedge t} - M_{t_i\wedge t}\big|\mathcal F_{t_i}\big]\,\Big|\,\mathcal F_s\Big]
= \sum_{i=k+1}^n E\Big[H_i\,\big(\underbrace{M_{t_{i+1}\wedge t\wedge t_i} - M_{t_i\wedge t\wedge t_i}}_{=0}\big)\,\Big|\,\mathcal F_s\Big] = 0,
\]
again by (3). Combining all these equalities we conclude
\[
E[I_t|\mathcal F_s] = I_s,
\]
and $(I_t)$ is a martingale. □



4 Stopping times, progressive measurability


The filtration Ft satisfies the usual conditions.

Exercise 18 Let S, T be stopping times. Prove that

(1) S ∧ T , S ∨ T , aS where a > 1, are stopping times.

(2) If T is stopping time, then there exists a sequence of stopping times Tn such
that Tn takes only a finite number of values and Tn decreases to T .

Solution: (1) For any t ≥ 0 we have

{S ∧ T ≤ t} = {S ≤ t} ∪ {T ≤ t} ∈ Ft ,
{S ∨ T ≤ t} = {S ≤ t} ∩ {T ≤ t} ∈ Ft ,
{aS ≤ t} = {S ≤ t/a} ∈ Ft/a ⊂ Ft ,

which proves the claim.


(2) For any $n \in \mathbb N$ we define the stopping time
\[
T_n = \begin{cases} i\,2^{-n}, & \text{if } (i-1)2^{-n} \le T < i\,2^{-n} \text{ for some } i \le n2^n,\\ +\infty, & \text{if } T \ge n. \end{cases}
\]
These stopping times satisfy the required conditions. □
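A quick check of this dyadic construction on a sample value of T (multiplication by $2^n$ is exact in binary floating point, so the comparisons below are reliable):

```python
import math

def dyadic_upper(T, n):
    # T_n = i 2^{-n} on {(i-1) 2^{-n} <= T < i 2^{-n}}, and +infinity if T >= n
    if T >= n:
        return math.inf
    return (math.floor(T * 2 ** n) + 1) / 2 ** n

T = 0.7234
vals = [dyadic_upper(T, n) for n in range(1, 30)]
print(vals[:4], vals[-1])
```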

Exercise 19 Prove that T is a stopping time iff {T < t} ∈ Ft , for any t > 0.
Solution: If T is a stopping time, then $\{T < t\} = \bigcup_{n\ge1}\{T \le t - \frac1n\} \in \mathcal F_t$, because $\{T \le t - \frac1n\} \in \mathcal F_{t-1/n} \subset \mathcal F_t$.
Conversely, if $\{T < t\} \in \mathcal F_t$ for any $t > 0$, then
\[
\{T \le t\} = \bigcap_{n\ge1}\Big\{T < t + \frac1n\Big\} \in \mathcal F_{t+} = \mathcal F_t,
\]
because the filtration is right-continuous. □

Exercise 20 Let (Mt , t ∈ I) be an (Ft )-martingale. Let τ be a bounded stopping


time that takes countably many values. Let Y be a bounded Fτ measurable random
variable. Let Nt = Y (Mt − Mt∧τ ). Prove that (Nt ) is a martingale.

Solution: Since Y is bounded, Nt is an integrable process. Furthermore, it is


adapted, since, for any t ≥ 0 and any Borel set B, we have

{Nt ∈ B} = ({Y (Mt − Mt∧τ ) ∈ B} ∩ {τ ≤ t}) ∪ ({0 ∈ B} ∩ {τ > t}) ∈ Ft .

Let us now take any 0 ≤ s < t. Then we have

E[Nt |Fs ] = E[Y (Mt − Mt∧τ )1τ >s |Fs ] + E[Y (Mt − Mt∧τ )1τ ≤s |Fs ]
= I1 + I2 .

For the first term we have, by the optional stopping theorem,
\[
I_1 = E\big[E[Y(M_t - M_{t\wedge\tau})1_{\tau>s}\mid\mathcal F_\tau]\mid\mathcal F_s\big]
= E\big[Y\,\underbrace{E[M_t - M_{t\wedge\tau}\mid\mathcal F_\tau]}_{=0}\,1_{\tau>s}\mid\mathcal F_s\big] = 0.
\]
For the second term we get, again by the optional stopping theorem,
\[
I_2 = E[Y(M_t - M_\tau)1_{\tau\le s}\mid\mathcal F_s] = Y1_{\tau\le s}\, E[M_t - M_\tau\mid\mathcal F_s]
= 1_{\tau\le s}\, Y(M_s - M_{\tau\wedge s}) = N_s.
\]
This finishes the proof. □

Exercise 21 Show that for s < t and A ∈ Fs , τ = s1A + t1Ac is a stopping time.
Solution: Indeed, for any $r \ge 0$ we have
\[
\{\tau \le r\} = \begin{cases} \emptyset, & \text{if } r < s,\\ A, & \text{if } s \le r < t,\\ \Omega, & \text{if } r \ge t, \end{cases}
\]
which in every case belongs to $\mathcal F_r$. □

Exercise 22 Let $0 = t_1 < \dots < t_{n+1} < \dots$ with $\lim_{n\to\infty} t_n = \infty$. For each $i = 1, 2, \dots$, let $H_i$ be a real-valued $\mathcal F_{t_i}$-measurable random variable and $H_0$ an $\mathcal F_0$-measurable random variable. For $t > 0$, we define
\[
X_t(\omega) = H_0(\omega)1_{\{0\}}(t) + \sum_{i=1}^\infty H_i(\omega)1_{(t_i, t_{i+1}]}(t).
\]
Prove that $(X_t, t \ge 0)$ is progressively measurable.



Solution: We take any Borel set $A \in \mathcal B(\mathbb R)$ and show that, for any $t \ge 0$, the set $\{(s,\omega)\in[0,t]\times\Omega : X_s(\omega)\in A\}$ is measurable in the product space $([0,t], \mathcal B([0,t]))\times(\Omega, \mathcal F_t)$. We can rewrite this set in the following way:
\[
\{(s,\omega)\in[0,t]\times\Omega : X_s(\omega)\in A\}
= \big(\{0\}\times\{\omega: H_0(\omega)\in A\}\big) \cup \bigcup_{i:\,t_i\le t}\big((t_i\wedge t,\, t_{i+1}\wedge t]\times\{\omega: H_i(\omega)\in A\}\big).
\]
Since the sets $\{0\}$ and $(t_i\wedge t, t_{i+1}\wedge t]$ belong to $\mathcal B([0,t])$, and $\{\omega: H_i(\omega)\in A\} \in \mathcal F_{t_i} \subset \mathcal F_t$ whenever $t_i \le t$, the claim follows from the fact that the product of two measurable sets is measurable in the product space. □

Exercise 23 Let $s < t \le u < v$ and let $(H_s)$ and $(K_s)$ be two elementary processes. We define $\int_s^t H_r\, dB_r = \int_0^t H_r\, dB_r - \int_0^s H_r\, dB_r$. Prove that
\[
E\Big(\int_s^t H_r\, dB_r \int_u^v K_r\, dB_r\Big) = 0.
\]
Solution: Recall that the stochastic process $\big(\int_0^t H_r\, dB_r,\ t \ge 0\big)$ is measurable w.r.t. $\mathcal F_t$ and is a martingale. We use the tower property to obtain the following:
\[
E\Big(\int_s^t H_r\, dB_r \int_u^v K_r\, dB_r\Big)
= E\Big(E\Big[\int_s^t H_r\, dB_r \int_u^v K_r\, dB_r\,\Big|\,\mathcal F_u\Big]\Big)
= E\Big(\int_s^t H_r\, dB_r\; E\Big[\int_u^v K_r\, dB_r\,\Big|\,\mathcal F_u\Big]\Big) = 0,
\]
because the stochastic integral is a martingale, and hence the inner conditional expectation vanishes. □

Exercise 24 Let $f: \mathbb R_+ \to \mathbb R$ be a differentiable function with $f' \in L^1([0,1])$. Let $\Delta_n : 0 = t_0^{(n)} < t_1^{(n)} < \dots < t_{N_n}^{(n)} = t$ be a sequence of partitions of $[0,t]$ with $\lim_{n\to\infty}|\Delta_n| = 0$. Prove that
\[
\lim_{n\to\infty}\sum_{j=1}^{N_n}\Big(f(t_j^{(n)}) - f(t_{j-1}^{(n)})\Big)^2 = 0.
\]
Hint: f is uniformly continuous on $[0,t]$ and $f(t) - f(s) = \int_s^t f'(r)\, dr$.
Solution: We have a simple estimate
\[
\sum_{j=1}^{N_n}\Big(f(t_j^{(n)}) - f(t_{j-1}^{(n)})\Big)^2 \le \max_j\big|f(t_j^{(n)}) - f(t_{j-1}^{(n)})\big|\, \sum_{j=1}^{N_n}\big|f(t_j^{(n)}) - f(t_{j-1}^{(n)})\big|.
\]
Firstly, $\max_j |f(t_j^{(n)}) - f(t_{j-1}^{(n)})| \to 0$ as $n \to \infty$, because of the uniform continuity of f on $[0,t]$. Secondly, we can estimate
\[
\sum_{j=1}^{N_n}\big|f(t_j^{(n)}) - f(t_{j-1}^{(n)})\big|
= \sum_{j=1}^{N_n}\Big|\int_{t_{j-1}^{(n)}}^{t_j^{(n)}} f'(r)\, dr\Big|
\le \sum_{j=1}^{N_n}\int_{t_{j-1}^{(n)}}^{t_j^{(n)}} |f'(r)|\, dr
= \int_0^t |f'(r)|\, dr < \infty,
\]
which follows from the properties of f. Thus, from these facts the claim follows.


5 Martingales and Optional Stopping Theorem


Let $(\mathcal F_t)$ be a filtration satisfying the usual assumptions, unless otherwise stated.

Exercise 25 Use the super-martingale convergence theorem to prove the following


statement. Let (Xn ) be a sub-martingale sequence and supn E(Xn+ ) < ∞. Then
limn→∞ Xn exists almost surely.
Solution: The process Yn = −Xn is a super-martingale. Furthermore, supn E(Yn− ) =
supn E(Xn+ ) < ∞. Thus, from the super-martingale convergence theorem, the
limit limn→∞ Yn = − limn→∞ Xn exists almost surely. 

Exercise 26 Let (Mt , 0 ≤ t ≤ t0 ) be a continuous local martingale with M0 = 0.


Prove that (Mt , 0 ≤ t ≤ t0 ) is a martingale if supt≤t0 Mt ∈ L1 .
Solution: Since $(M_t)$ is a continuous local martingale with $M_0 = 0$, there is a sequence of stopping times $T_n \uparrow \infty$ almost surely such that each stopped process $(M_t^{T_n})$ is a martingale. Hence, for any $t > s$, we have
\[
E[M_t^{T_n}\mid\mathcal F_s] = M_s^{T_n} \to M_s,
\]
almost surely, as $n \to \infty$. Observe that $M_t^{T_n} \le \sup_{t\le t_0} M_t$; the condition $\sup_{t\le t_0} M_t \in L^1$ allows us to apply the dominated convergence theorem to obtain
\[
E[M_t^{T_n}\mid\mathcal F_s] \to E[M_t\mid\mathcal F_s],
\]
almost surely, as $n \to \infty$. This shows that $E[M_t|\mathcal F_s] = M_s$, which means that $(M_t)$ is a martingale. □

Exercise 27 Let {Bt }t≥0 be one dimensional Brownian Motion starting at x (B0 =
x), and let a < x < b. In this question we are going to find the probability of the
Brownian Motion hitting b before a using the Optional Stopping Theorem (OST).
Set

Ta = inf {t : Bt = a}, Tb = inf {t : Bt = b}, and T = Ta ∧ Tb .

(a) Give an easy argument why $T_a$, $T_b$ and T are all stopping times with respect to the natural filtration of the Brownian Motion.

(b) One would like to compute $E[B_T]$ using the OST, but $(B_t, t \ge 0)$ is not a uniformly integrable martingale, and applying the OST directly would require T to be a bounded stopping time. Instead we use a limiting argument.

(b1) For $n \in \mathbb N$, use the OST to prove that $E[B_{T\wedge n}] = x$.



(b2) Conclude that E[BT ] = x.


(c) Compute P(Ta > Tb ).
Solution: (a) $T_a$ is a stopping time since it is the hitting time by $(B_t)$ of the closed set $\{a\}$, and the same can be said for $T_b$. That $T = T_a \wedge T_b$ is a stopping time follows from Exercise 18.
(b1) Note that 0 and $T \wedge n$ are both bounded stopping times with $0 \le T \wedge n$. Hence by the OST, $E[B_{T\wedge n}|B_0] = B_0 = x$ almost surely. Taking the expectation of both sides and using the tower law we get the result.
(b2) It is clear that $B_{T\wedge n} \to B_T$ pointwise as $n \to \infty$. Moreover $B_{T\wedge n}$ is bounded between a and b, hence we can use the DCT to conclude:
\[
E[B_T] = \lim_{n\to\infty} E[B_{T\wedge n}] = x.
\]

(c) Denote $P(T_a > T_b)$ by p; then
\[
x = E[B_T] = E\big[B_{T_b}1_{\{T_a>T_b\}} + B_{T_a}1_{\{T_b>T_a\}}\big] = E\big[b\,1_{\{T_a>T_b\}} + a\,1_{\{T_b>T_a\}}\big] = bp + a(1-p).
\]
From here we deduce that $P(T_a > T_b) = \frac{x-a}{b-a}$. □
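The same optional-stopping argument applies verbatim to the simple symmetric random walk, which allows a fast simulation check of $P(T_a > T_b) = \frac{x-a}{b-a}$ with integer barriers (so there is no discretization overshoot); the path count and tolerance are ad hoc.

```python
import numpy as np

rng = np.random.default_rng(3)
a, b, x = 0, 5, 2
n_paths = 20_000
hits_b = 0
for _ in range(n_paths):
    pos = x
    while a < pos < b:                        # run until the walk hits a or b
        pos += 1 if rng.random() < 0.5 else -1
    hits_b += (pos == b)
p_hat = hits_b / n_paths
print(p_hat, (x - a) / (b - a))
```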

Exercise 28 If (Mt , t ∈ I) is a martingale and S and T are two stopping times


with T bounded, prove that

MS∧T = E{MT |FS }.

Solution: We can write

E{MT |FS } = E{1T <S MT |FS } + E{1T ≥S MT |FS } .

Note, that 1T <S MT is FS -measurable. Thus, for the first term we have

E{1T <S MT |FS } = 1T <S MT = 1T <S MS∧T .

One can see that {T ≥ S} ∈ FS∧T . Indeed, for any t ≥ 0 one has

{T ≥ S} ∩ {T ∧ S ≤ t} = {S ≤ T ∧ t} = {S ≤ T ≤ t} ∪ ({S ≤ t} ∩ {T > t}) ∈ Ft ,

because every set belongs to Ft . Thus, we conclude

E{1T ≥S MT |FS } = E{1T ≥S MT |FS∧T } = 1T ≥S E{MT |FS∧T } = 1T ≥S MS∧T ,

where we have used Doob's optional stopping theorem. Combining all these equalities together, we obtain the claim. □

Exercise 29 Prove the following: (1) A positive right continuous local martingale
(Mt , t ≥ 0) with M0 = 1 is a super-martingale. (2) A positive right continuous
local martingale (Mt , t ≥ 0) is a martingale if E|M0 | < ∞ and EMt = EM0 for
all t > 0.
Solution: (1) Since E|M0 | < ∞, there is a sequence of increasing stopping times
Tn , such that the process MtTn is a martingale. In particular, MtTn is integrable.
For s < t, we use Fatou’s lemma to obtain

E[Mt |Fs ] = E[lim inf Mt∧Tn |Fs ] ≤ lim inf E[Mt∧Tn |Fs ] = lim inf Ms∧Tn = Ms .
n→∞ n→∞ n→∞

(2) Let $S \le T$ be two stopping times, bounded by a constant K. Then, from (1), we have
\[
EM_0 \ge EM_S \ge EM_T \ge EM_K = EM_0,
\]
which implies $EM_T = EM_S$. We know that $EM_S = EM_T$ for any two bounded stopping times $S \le T$ implies that $(M_t)$ is a martingale, completing the proof. □

Exercise 30 Let (Mt , t ≥ 0) and (Nt , t ≥ 0) be continuous local martingales with


M0 = 0 and N0 = 0.

(1) Let (At ) and (A0t ) be two continuous stochastic processes of finite variation
with initial values 0 and such that (Mt Nt − At ) and (Mt Nt − A0t ) are local
martingales. Prove that (At ) and (A0t ) are indistinguishable.

(2) Prove that $\langle M, N\rangle_t$ is symmetric in $(M_t)$ and $(N_t)$ and is bilinear.

(3) Prove that $\langle M, N\rangle_t = \frac14\big(\langle M+N, M+N\rangle_t - \langle M-N, M-N\rangle_t\big)$.

(4) Prove that $\langle M - M_0, N - N_0\rangle_t = \langle M, N\rangle_t$.

(5) Let T be a stopping time; prove that
\[
\langle M^T, N^T\rangle = \langle M, N\rangle^T = \langle M, N^T\rangle.
\]

Solution: (1) We use the following theorem: if $(M_t, 0 \le t \le T)$ is a continuous local martingale with $M_0 = 0$ and with finite total variation, then $M_t = 0$ for all $0 \le t \le T$.
The process $A'_t - A_t = (M_tN_t - A_t) - (M_tN_t - A'_t)$ is a continuous local martingale and, at the same time, a process of finite total variation. Hence $A'_t - A_t = A'_0 - A_0 = 0$.

(2) The symmetry equality $\langle M, N\rangle = \langle N, M\rangle$ comes from that of the product: $MN = NM$. Next, we will prove
\[
\langle M_1 + M_2, N\rangle = \langle M_1, N\rangle + \langle M_2, N\rangle.
\]
The process
\[
M_1N + M_2N - \langle M_1, N\rangle - \langle M_2, N\rangle
\]
is a local martingale. Thus, the claim follows from the uniqueness of the bracket process. Similarly, for $k \in \mathbb R$, $kMN - k\langle M, N\rangle$ is a local martingale and $\langle kM, N\rangle = k\langle M, N\rangle$.
(3) is a consequence of the bilinearity, proved earlier.
(4) Since $(M_tN_0, t \ge 0)$ is a local martingale, the bracket process $\langle M, N_0\rangle$ vanishes. By the same reason we have $\langle M_0, N\rangle = \langle M_0, N_0\rangle = 0$. Hence, the claim follows from the bilinearity.
(5) By the definition $M^TN^T - \langle M^T, N^T\rangle$ and $(MN)^T - \langle M, N\rangle^T$ are local martingales. This implies $\langle M^T, N^T\rangle = \langle M, N\rangle^T$, from the uniqueness of the bracket process.
Furthermore, both $M^TN^T - \langle M^T, N^T\rangle$ and $(M - M^T)N^T$ are local martingales, hence their sum $MN^T - \langle M^T, N^T\rangle$ is a local martingale as well. This implies $\langle M, N^T\rangle = \langle M^T, N^T\rangle$, from the uniqueness of the bracket process. □

Exercise 31 Let $(M_t, t \in [0,1])$ and $(N_t, t \in [0,1])$ be bounded continuous martingales with $M_0 = N_0 = 0$. If $(M_t)$ and $(N_t)$ are furthermore independent, prove that their quadratic variation process $(\langle M, N\rangle_t)$ vanishes.
Solution: Let us take a partition $0 = t_0 < t_1 < \dots < t_n = 1$. Then, using the independence of M and N and the martingale property,
\[
E\Big[\sum_{i=1}^n (M_{t_i}-M_{t_{i-1}})(N_{t_i}-N_{t_{i-1}})\Big]^2
= \sum_{i,j=1}^n E\big[(M_{t_i}-M_{t_{i-1}})(M_{t_j}-M_{t_{j-1}})\big]\, E\big[(N_{t_i}-N_{t_{i-1}})(N_{t_j}-N_{t_{j-1}})\big]
\]
\[
= \sum_{i=1}^n E\big[(M_{t_i}-M_{t_{i-1}})^2\big]\, E\big[(N_{t_i}-N_{t_{i-1}})^2\big]
= \sum_{i=1}^n E\big[M_{t_i}^2 - M_{t_{i-1}}^2\big]\, E\big[N_{t_i}^2 - N_{t_{i-1}}^2\big]
\]
\[
\le \sum_{i=1}^n E\big[M_{t_i}^2 - M_{t_{i-1}}^2\big]\;\sup_j E\big[N_{t_j}^2 - N_{t_{j-1}}^2\big]
\le E\big[M_1^2\big]\; E\Big[\sup_j\big|N_{t_j}^2 - N_{t_{j-1}}^2\big|\Big],
\]
where the last step uses the telescoping identity $\sum_{i=1}^n E[M_{t_i}^2 - M_{t_{i-1}}^2] = E[M_1^2] < \infty$. Because $t\mapsto N_t^2$ is uniformly continuous, $\sup_j |N_{t_j}^2 - N_{t_{j-1}}^2| \to 0$ almost surely as the mesh of the partition tends to 0, and since $N^2$ is bounded, $E[\sup_j |N_{t_j}^2 - N_{t_{j-1}}^2|] \to 0$. Since the bracket process is the limit of $\sum_{i=1}^n (M_{t_i}-M_{t_{i-1}})(N_{t_i}-N_{t_{i-1}})$ (in probability), this implies $\langle M, N\rangle_t = 0$. □
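The same cancellation is visible numerically for two independent Brownian motions; these are unbounded, so this is only an illustration of the limit behaviour, not of the exercise's exact hypotheses. The discrete cross-variation concentrates at 0 while the discrete quadratic variation concentrates at t:

```python
import numpy as np

rng = np.random.default_rng(4)
n_paths, n_steps, t = 5_000, 500, 1.0
dt = t / n_steps
dM = rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps))
dN = rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps))  # independent of dM
cross = np.sum(dM * dN, axis=1)   # discrete <M, N>_t per path
quad = np.sum(dM * dM, axis=1)    # discrete <M, M>_t per path
m_cross2 = float(np.mean(cross ** 2))
m_quad = float(np.mean(quad))
print(m_cross2, m_quad)
```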


Exercise 32 Prove that: (1) for almost all $\omega$, the Brownian path $t \mapsto B_t(\omega)$ has infinite total variation on any interval $[a,b]$, $a < b$. (2) For almost all $\omega$, $B_t(\omega)$ cannot have a Hölder continuous path of order $\alpha > \frac12$.
Solution: (1) We know that t is the quadratic variation of (Bt ). We can choose a
(n)
sequence of partitions {tj }j=1,...,Mn such that for almost surely all ω,
n 
X 2
Bt(n) (ω) − Bt(n) (ω) → (b − a) , (4)
j j−1
j=1

as n → ∞. Also for every ω,


Mn
X Mn
X
(Bt(n) (ω) − Bt(n) (ω))2 ≤ max |Bt(n) (ω) − Bt(n) (ω)| |Bt(n) (ω) − Bt(n) (ω)|
j j−1 j j j−1 j j−1
j=1 j=1

≤ max |Bt(n) (ω) − Bt(n) (ω)||B(ω)|TV[a,b] .


j j j−1

If |B(ω)|TV[a,b] is finite then, since t 7→ Bt (ω) is uniformly continuous on [a, b],

Mn
X
(Bt(n) (ω) − Bt(n) (ω))2 ≤ max |Bt(n) (ω) − Bt(n) (ω)|TV[a,b] (B(ω)) → 0 .
j j−1 j j j−1
j=1

But for almost surely all ω, this limit is b − a, and hence TV[a,b] (B(ω)) = ∞ for
almost surely all ω.
(2) Suppose that for some α > 12 and some number C, both may depend on ω,

|Bt (ω) − Bs (ω)| ≤ C(ω)|t − s|α .


(n)
We take the partitions {tj }j=1,...,Mn , as in (1). For any ω we have

Mn Mn Mn
(n) (n) (n) (n)
X X X
(Bt(n) (ω) − Bt(n) (ω))2 ≤ C 2 |tj − tj−1 |2α ≤ C 2 |∆n |2α−1 |tj − tj−1 |
j j−1
j=1 j=1 j=1
2 2α−1
= C (b − a)|∆n | →0,

what contradicts with (4), unless b − a = 0. Hence for almost surely all ω, the
Brownian path is not Hölder continuous of order α > 1/2 in any interval [a, b]. 
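Both phenomena show up on a simulated path. The sketch below (a Python/NumPy discretisation of my own, not from the notes) coarsens one fine Brownian path on $[0,1]$: the sum of squared increments stabilises near $b - a = 1$, while the sum of absolute increments keeps growing as the partition is refined, consistent with infinite total variation.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 2**16                                 # finest grid on [0, 1]
B = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(1.0 / N), N))])

qvs, tvs = [], []
for k in (2**8, 2**12, 2**16):            # coarser to finer partitions
    inc = np.diff(B[:: N // k])           # increments over k subintervals
    qvs.append(np.sum(inc**2))            # sums of squares -> b - a = 1
    tvs.append(np.sum(np.abs(inc)))       # sums of |increments| keep growing
print(qvs, tvs)
```

The absolute sums grow roughly like $\sqrt{k}$, so no finite total variation bound can hold.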

6 Local Martingales and Stochastic Integration


If M ∈ H 2 , L2 (M ) denotes the L2 space of progressively measurable processes
with the following L2 norm:
\[
|f|_{L^2(M)}^2 := E\Big(\int_0^t (f_s)^2\, d\langle M, M\rangle_s\Big) < \infty.
\]

If $(M_t)$ is a continuous local martingale, we denote by $L^2_{loc}(M)$ the space of progressively measurable processes with
\[
\int_0^t (K_s)^2\, d\langle M, M\rangle_s < \infty, \qquad \forall t.
\]

Exercise 33 Let $(B_t)$ be a standard Brownian motion and let $f_s, g_s \in L^2_{loc}(B)$. Compute the indicated bracket processes; the final expression should not involve stochastic integration.

1. $\big\langle \int_0^\cdot f_s\, dB_s,\ \int_0^\cdot g_s\, dB_s \big\rangle_t$.

2. $\big\langle \int_0^\cdot B_s\, dB_s,\ \int_0^\cdot B_s^3\, dB_s \big\rangle_t$.
Solution: (1a) By the definition,
Z · Z · Z t Z t
h fs dBs , gs dBs it = fs gs dhBis = fs gs ds .
0 0 0 0

(1b) By the bracket formula for stochastic integrals:


Z · Z · Z t Z t
h Bs dBs , Bs3 dBs it = Bs4 dhBis = Bs4 ds .
0 0 0 0

Exercise 34 We say that a stochastic process belongs to $C^{\alpha-}$ if its sample paths are locally Hölder continuous of order $\gamma$ for every $\gamma < \alpha$. Let $(H_t, t \le 1)$ be an adapted, continuous and $L^p$-bounded stochastic process where $p > 2$, with $H \in L^2([0,1] \times \Omega)$. Let $(B_t)$ be a one dimensional Brownian motion and set $M_t = \int_0^t H_s\, dB_s$.

(a) Prove that $(M_t)$ belongs to $C^{\frac{1}{2} - \frac{1}{p}-}$.

(b) If $(H_t)$ is a bounded process, i.e. $\sup_{(t,\omega)} |H_t(\omega)| < \infty$, prove that $(M_t)$ belongs to $C^{\frac{1}{2}-}$.
Rt Rt
Proof: First note that the bracket process of $\int_s^t H_r\, dB_r$ is $\int_s^t (H_r)^2\, dr$.
(1) Since $H$ is $L^p$-bounded, there exists $C$ s.t. $\sup_{s \in [0,1]} E(|H_s|^p) < C$. By the Burkholder-Davis-Gundy inequality and Jensen's inequality (applied to the normalised measure $\frac{dr}{t-s}$),
\[
E|M_t - M_s|^p \le c_p\, E\Big(\int_s^t (H_r)^2\, dr\Big)^{\frac{p}{2}}
\le c_p (t-s)^{\frac{p}{2}}\, E\Big[\frac{1}{t-s}\int_s^t |H_r|^p\, dr\Big]
\le C c_p\, (t-s)^{\frac{p}{2}} .
\]
By Kolmogorov's continuity criterion, $(M_t)$ has a modification which is Hölder continuous of order $\gamma$ for any $\gamma < \frac{1}{p}\big(\frac{p}{2} - 1\big) = \frac{1}{2} - \frac{1}{p}$.
(2) Similarly, for any $p \ge 2$,
\[
E|M_t - M_s|^p \le c_p\, E\Big(\int_s^t (H_r)^2\, dr\Big)^{\frac{p}{2}} \le c_p\, (t-s)^{\frac{p}{2}} \sup_{r \in [0,1],\, \omega \in \Omega} |H_r(\omega)|^p ,
\]
so that $(M_t)$ has a modification which is Hölder continuous of order $\gamma$ for any $\gamma < \frac{1}{2} - \frac{1}{p}$, for every $p \ge 2$. The second assertion follows, again by Kolmogorov's continuity theorem.

Exercise 35 Let $T > 0$. Let $f$ be a left continuous and adapted process such that $E\int_0^T (f_s)^2\, ds < \infty$. Prove that $\big(\int_0^t f_s\, dB_s,\ 0 \le t \le T\big)$ is a martingale.
Solution: We know that $\big(\int_0^t f_s\, dB_s,\ 0 \le t \le T\big)$ is a local martingale. Furthermore, by the Burkholder-Davis-Gundy inequality,
\[
E\Big(\int_0^t f_s\, dB_s\Big)^2 \le c_2\, E\int_0^t (f_s)^2\, ds \le c_2\, E\int_0^T (f_s)^2\, ds < \infty.
\]
It is therefore an $L^2$-bounded local martingale, and so a martingale.


Exercise 36 Let $M \in H^2$ and $H \in L^2(M)$. Let $\tau$ be a stopping time. Prove that
\[
\int_0^{t\wedge\tau} H_s\, dM_s = \int_0^t 1_{s\le\tau}\, H_s\, dM_s = \int_0^t H_s\, dM_s^{\tau}.
\]

Solution: For any $N \in H^2$, we use Exercise 30 to obtain the following identities:
\[
\int_0^t H_s\, d\langle M^{\tau}, N\rangle_s = \int_0^t H_s\, d\langle M, N\rangle_{\tau\wedge s}
= \int_0^t 1_{s\le\tau}\, H_s\, d\langle M, N\rangle_s = \int_0^{\tau\wedge t} H_s\, d\langle M, N\rangle_s .
\]
The claim now follows from the fact that the stochastic integral $I_t = \int_0^t H_s\, dM_s^{\tau}$ is characterised uniquely by the identities
Z t
hI, N it = Hs dhM τ , N is , t ≥ 0 ,
0

for any N ∈ H 2. 

Exercise 37 Let $M \in H^2$ and $K \in \mathcal{E}$. Prove that the elementary integral $I_t := \int_0^t K_s\, dM_s$ satisfies
Z t
hI, N it = Ks dhM, N is , ∀t ≥ 0 ,
0

for any N ∈ H 2.
Solution: See lecture notes. 

Exercise 38 Let $B_t = (B_t^1, \ldots, B_t^n)$ be an $n$-dimensional Brownian motion. Prove that
\[
|B_t|^2 = 2\sum_{i=1}^n \int_0^t B_s^i\, dB_s^i + nt.
\]
Solution: Either use $|B_t|^2 = \sum_{i=1}^n |B_t^i|^2$ and apply the product formula to each component $(B_t^i)$, then sum up, or use the multi-dimensional version of Itô's formula.
Let us consider the function $f: \mathbb{R}^n \to \mathbb{R}$, $f(x) = |x|^2$. Its derivatives are given by
\[
\partial_i f(x) = 2x_i, \qquad \partial_{ij}^2 f(x) = 2\delta_{i,j}.
\]
Applying the Itô formula, we obtain
\[
|B_t|^2 = f(B_t) = f(B_0) + \sum_{i=1}^n \int_0^t \partial_i f(B_s)\, dB_s^i + \frac{1}{2}\sum_{i,j=1}^n \int_0^t \partial_{ij}^2 f(B_s)\, d\langle B^i, B^j\rangle_s
= 0 + 2\sum_{i=1}^n \int_0^t B_s^i\, dB_s^i + \sum_{i=1}^n \int_0^t ds
= 2\sum_{i=1}^n \int_0^t B_s^i\, dB_s^i + nt ,
\]
which is the required identity.
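This Itô identity can be checked on a simulated path. The sketch below (Python/NumPy; step count, seed and tolerance are my own discretisation choices) approximates the Itô integrals by left-point Riemann sums for a three-dimensional Brownian motion and compares the two sides.

```python
import numpy as np

rng = np.random.default_rng(2)
n, steps, t = 3, 50_000, 1.0
dt = t / steps
dB = rng.normal(0.0, np.sqrt(dt), size=(steps, n))   # n-dim BM increments
B = np.vstack([np.zeros(n), np.cumsum(dB, axis=0)])

ito = 2.0 * np.sum(B[:-1] * dB)     # left-point sums for 2 sum_i int B^i dB^i
lhs = np.sum(B[-1] ** 2)            # |B_t|^2
rhs = ito + n * t
print(lhs, rhs)
```

The left-point evaluation is essential: right-point sums would converge to a different (backward) integral and the $nt$ correction would be counted twice.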



Exercise 39 Give an expression for $\int_0^t s\, dB_s$ that does not involve stochastic integration.
Solution: Applying the classical integration by parts formula, we obtain
Z t Z t
s dBs = tBt − Bs ds.
0 0

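The integration by parts formula is easy to verify on a discretised path. The sketch below (my own Python/NumPy discretisation, not from the notes) compares the left-point Itô sum for $\int_0^t s\, dB_s$ with $tB_t - \int_0^t B_s\, ds$, the latter integral taken as a Riemann sum.

```python
import numpy as np

rng = np.random.default_rng(3)
steps, t = 100_000, 1.0
dt = t / steps
s = np.linspace(0.0, t, steps + 1)
dB = rng.normal(0.0, np.sqrt(dt), steps)
B = np.concatenate([[0.0], np.cumsum(dB)])

lhs = np.sum(s[:-1] * dB)               # Ito sum for int_0^t s dB_s
rhs = t * B[-1] - np.sum(B[:-1]) * dt   # t B_t - int_0^t B_s ds (Riemann sum)
print(lhs, rhs)
```

Because the integrand $s$ is deterministic and of finite variation, there is no Itô correction term, so the classical formula holds exactly.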
Exercise 40 Write $\int_0^t (2B_s + 1)\, d\big(\int_0^s B_r\, d(B_r + r)\big)$ as a function of $(B_t)$, not involving stochastic integrals.
Solution: Firstly,
\[
I_t := \int_0^t (2B_s + 1)\, d\Big(\int_0^s B_r\, d(B_r + r)\Big) = \int_0^t (2B_s + 1) B_s\, d(B_s + s)
= 2\int_0^t B_s^2\, dB_s + \int_0^t B_s\, dB_s + 2\int_0^t B_s^2\, ds + \int_0^t B_s\, ds.
\]

From the Itô formula for the function $x^3$ we obtain
\[
\int_0^t B_s^2\, dB_s = \frac{1}{3}B_t^3 - \int_0^t B_s\, ds .
\]
And the Itô formula for the function $x^2$ gives
\[
\int_0^t B_s\, dB_s = \frac{1}{2}(B_t^2 - t) .
\]
Substituting these stochastic integrals into the formula obtained above, we get
\[
I_t = \frac{2}{3}B_t^3 + \frac{1}{2}(B_t^2 - t) + 2\int_0^t B_s^2\, ds - \int_0^t B_s\, ds .
\]

Exercise 41 If $(H_t)$ and $(K_t)$ are continuous martingales, prove that
\[
\langle HK, B\rangle_t = \int_0^t H_r\, d\langle K, B\rangle_r + \int_0^t K_r\, d\langle H, B\rangle_r .
\]

Solution: The product formula gives
\[
H_t K_t = \int_0^t H_r\, dK_r + \int_0^t K_r\, dH_r + H_0 K_0 + \langle H, K\rangle_t .
\]
The last two terms are of finite variation and do not contribute to the bracket. Hence, by the bracket formula for stochastic integrals, we obtain
\[
\langle HK, B\rangle_t = \Big\langle \int_0^\cdot H_r\, dK_r + \int_0^\cdot K_r\, dH_r,\ B\Big\rangle_t
= \int_0^t H_r\, d\langle K, B\rangle_r + \int_0^t K_r\, d\langle H, B\rangle_r ,
\]
which is the required identity.



7 Itô’s Formula
Exercise 42 If $(N_t)$ is a continuous local martingale with $N_0 = 0$, show that $\big(e^{N_t - \frac{1}{2}\langle N,N\rangle_t}\big)$ is a local martingale, and $E\, e^{N_t - \frac{1}{2}\langle N,N\rangle_t} \le 1$.
1
Solution: Let us denote Xt = eNt − 2 hN,N it . The process (Nt − 12 hN, N it ) is a
semi-martingale, and we can apply the Itô formula to the function ex :
Z t
Xt = 1 + Xs dNs .
0

Since the stochastic integral is a local martingale, we conclude that (Xt ) is a local
martingale.
Moreover, let $T_n$ be an increasing sequence of stopping times, with $T_n \to \infty$, such that $X_t^{T_n}$ is a uniformly integrable martingale. Then, applying Fatou's lemma, we get
\[
E[X_t] = E\big[\lim_{n\to\infty} X_t^{T_n}\big] \le \liminf_{n\to\infty} E\big[X_t^{T_n}\big] = E[X_0] = 1 ,
\]

which is the required bound. 
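When $(N_t)$ is a true martingale, the inequality is an equality. The sketch below (Python/NumPy Monte Carlo with a sample size and tolerance of my own choosing) checks this for the simplest case $N_t = B_t$, where $\langle N, N\rangle_t = t$ and $E\, e^{B_t - t/2} = 1$.

```python
import numpy as np

rng = np.random.default_rng(5)
t, paths = 1.0, 1_000_000
B = rng.normal(0.0, np.sqrt(t), paths)      # samples of B_t
m = np.mean(np.exp(B - 0.5 * t))            # E exp(B_t - t/2), equals 1 here
print(m)
```

A strict inequality can occur only when the exponential local martingale fails to be a martingale, which no finite-horizon Brownian example of this simple form exhibits.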

Exercise 43 Show that a positive continuous local martingale (Nt ) with N0 =


1 can be written in the form of Nt = exp (Mt − 12 hM, M it ) where (Mt ) is a
continuous local martingale.
Solution: Let us define $M_t = \int_0^t N_s^{-1}\, dN_s$, so that $\langle M\rangle_t = \int_0^t N_s^{-2}\, d\langle N\rangle_s$. Applying the Itô formula to the function $\log(x)$, we obtain
\[
\log(N_t) = \log(N_0) + \int_0^t N_s^{-1}\, dN_s - \frac{1}{2}\int_0^t N_s^{-2}\, d\langle N\rangle_s = M_t - \frac{1}{2}\langle M\rangle_t ,
\]
which implies the claim.

Exercise 44 Let $(B_t)$ be a one dimensional Brownian motion on $(\Omega, \mathcal{F}, \mathcal{F}_t, P)$ and let $g: \mathbb{R} \to \mathbb{R}$ be a bounded Borel measurable function. Find a solution to
\[
x_t = x_0 + \int_0^t x_s\, g(B_s)\, dB_s .
\]
Solution: We define $M_t = \int_0^t g(B_s)\, dB_s$. Then the equation reads $dx_t = x_t\, dM_t$, whose solution is a multiple of the exponential martingale
\[
e^{M_t - \frac{1}{2}\langle M\rangle_t} .
\]

Let $f(x) = e^x$ and $y_t = M_t - \frac{1}{2}\langle M\rangle_t$. We see that
\[
e^{y_t} = 1 + \int_0^t e^{y_s}\, dy_s + \frac{1}{2}\int_0^t e^{y_s}\, d\langle M\rangle_s = 1 + \int_0^t e^{y_s}\, dM_s ,
\]
so $x_t = x_0\, e^{M_t - \frac{1}{2}\langle M\rangle_t}$ solves the equation, proving the claim.

Exercise 45 1. Let $f, g: \mathbb{R} \to \mathbb{R}$ be $C^2$ functions; prove that
\[
\langle f(B), g(B)\rangle_t = \int_0^t f'(B_s)\, g'(B_s)\, ds .
\]

2. Compute $\big\langle \exp\big(M_t - \frac{1}{2}\langle M\rangle_t\big),\ \exp\big(N_t - \frac{1}{2}\langle N\rangle_t\big)\big\rangle$, where $(M_t)$ and $(N_t)$ are continuous local martingales.
Solution: (1) By the Itô formula we have
\[
f(B_t) = f(0) + \int_0^t f'(B_s)\, dB_s + \frac{1}{2}\int_0^t f''(B_s)\, ds ,
\qquad
g(B_t) = g(0) + \int_0^t g'(B_s)\, dB_s + \frac{1}{2}\int_0^t g''(B_s)\, ds .
\]
The terms of finite variation do not contribute to the bracket. Hence, we obtain
\[
\langle f(B), g(B)\rangle_t = \Big\langle \int_0^\cdot f'(B_s)\, dB_s,\ \int_0^\cdot g'(B_s)\, dB_s \Big\rangle_t = \int_0^t f'(B_s)\, g'(B_s)\, ds ,
\]
which is the required identity.


(2) Let us denote $X_t = \exp\big(M_t - \frac{1}{2}\langle M\rangle_t\big)$ and $Y_t = \exp\big(N_t - \frac{1}{2}\langle N\rangle_t\big)$. Then, applying the Itô formula to the function $e^x$, we obtain
\[
X_t = \exp(M_0) + \int_0^t X_s\, dM_s , \qquad Y_t = \exp(N_0) + \int_0^t Y_s\, dN_s .
\]
Hence, we conclude
\[
\langle X, Y\rangle_t = \Big\langle \int_0^\cdot X_s\, dM_s,\ \int_0^\cdot Y_s\, dN_s \Big\rangle_t = \int_0^t X_s Y_s\, d\langle M, N\rangle_s .
\]



Exercise 46 Let $B_t = (B_t^1, \ldots, B_t^n)$ be an $n$-dimensional Brownian motion on $(\Omega, \mathcal{F}, \mathcal{F}_t, P)$, with $n > 1$. We know that for any $t > 0$, $\{r_s \ne 0 \text{ for all } s \le t\}$ has probability one. Prove that the process $r_t = |B_t|$ is a solution of the equation
\[
r_t = r_0 + \tilde{B}_t + \int_0^t \frac{n-1}{2 r_s}\, ds , \qquad r_0 = 0 ,
\]
where $(\tilde{B}_t)$ is a one dimensional Brownian motion.

Solution: Let us take the function $f(x) = |x|$, for $x \in \mathbb{R}^n$. Its derivatives are given by
\[
\frac{\partial f}{\partial x_i}(x) = \frac{x_i}{|x|} , \qquad \frac{\partial^2 f}{\partial x_i^2}(x) = \frac{|x|^2 - x_i^2}{|x|^3} ,
\]
for $x \ne 0$. Note that $f$ is $C^2$ on $\mathbb{R}^n \setminus \{0\}$. Let $\tau$ be the first time that $X_t := x_0 + B_t$ hits $0$. Then $\tau = \infty$ almost surely. Applying the Itô formula to $f$ and $X_t^{\tau}$, where the latter equals $X_t$ almost surely, we obtain
\[
r_t = r_0 + \sum_{i=1}^n \int_0^t \frac{X_s^i}{|X_s|}\, dB_s^i + \frac{1}{2}\sum_{i=1}^n \int_0^t \frac{|X_s|^2 - (X_s^i)^2}{|X_s|^3}\, ds
= r_0 + \sum_{i=1}^n \int_0^t \frac{X_s^i}{|X_s|}\, dB_s^i + \frac{1}{2}\int_0^t \frac{n|X_s|^2 - |X_s|^2}{|X_s|^3}\, ds .
\]
To finish the proof, we have to show that $\tilde{B}_t := \sum_{i=1}^n \int_0^t \frac{X_s^i}{|X_s|}\, dB_s^i$ is a Brownian motion. Firstly, it is a continuous local martingale starting at $0$. It has quadratic variation
\[
\langle \tilde{B}, \tilde{B}\rangle_t = \sum_{i=1}^n \int_0^t \frac{(X_s^i)^2}{|X_s|^2}\, ds = t .
\]
Hence, the claim follows from the Lévy characterisation theorem.
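The decomposition can be verified pathwise on a simulation. The sketch below (Python/NumPy; starting point, step count and tolerances are my own assumptions — I start away from the origin so the drift stays well behaved numerically) reconstructs the increments of $\tilde{B}$ from a 3-dimensional Brownian path and checks both the integral equation for $r_t = |X_t|$ and the quadratic variation $\langle \tilde B\rangle_1 \approx 1$.

```python
import numpy as np

rng = np.random.default_rng(6)
n, steps, t = 3, 200_000, 1.0
dt = t / steps
x0 = np.array([1.0, 0.0, 0.0])              # start away from the origin
dB = rng.normal(0.0, np.sqrt(dt), size=(steps, n))
X = x0 + np.vstack([np.zeros(n), np.cumsum(dB, axis=0)])
r = np.linalg.norm(X, axis=1)

dBtilde = np.sum(X[:-1] / r[:-1, None] * dB, axis=1)   # increments of B~
drift = np.cumsum((n - 1) / (2.0 * r[:-1])) * dt       # int (n-1)/(2 r_s) ds
err = np.max(np.abs(r[1:] - r[0] - np.cumsum(dBtilde) - drift))
qv = np.sum(dBtilde**2)                                # quadratic variation of B~
print(err, qv)
```

The unit quadratic variation is exactly what the Lévy characterisation argument in the proof exploits.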

Exercise 47 Let $f: \mathbb{R} \to \mathbb{R}$ be $C^2$. Suppose that there is a constant $C > 0$ s.t. $f \ge C$. Let
\[
h(x) = \int_0^x \frac{1}{f(y)}\, dy .
\]
Suppose that $\lim_{x \to +\infty} h(x) = \infty$. Denote by $h^{-1}$ its inverse. Prove that $h^{-1}\big(h(x_0) + B_t\big)$ satisfies:
\[
x_t = x_0 + \int_0^t f(x_s)\, dB_s + \frac{1}{2}\int_0^t f(x_s) f'(x_s)\, ds .
\]

 
Solution: Let us denote $g(x) = h^{-1}\big(h(x_0) + x\big)$. Then its derivatives are given by
\[
g'(x) = f(g(x)) , \qquad g''(x) = f'(g(x))\, f(g(x)) .
\]
The claim now follows by applying the Itô formula to $g(B_t)$.

Exercise 48 Let $(B_t)$ be a Brownian motion and $\tau$ its hitting time of the set $[2, \infty)$. Is the stochastic process $\big(\frac{1}{B_t - 2},\ t < \tau\big)$ a solution to a stochastic differential equation? (The time $\tau$ is the natural life time of the process $\frac{1}{B_t - 2}$.)
Solution: Note that $\tau = \inf\{t : B_t \in [2, \infty)\}$. Let $\tau_n$ be a sequence of stopping times increasing to $\tau$. Let $f(x) = (x-2)^{-1}$. Where $f$ is differentiable, $f'(x) = -(x-2)^{-2} = -f^2$ and $f''(x) = 2(x-2)^{-3} = 2f^3$. Applying the Itô formula to $f$ and the stopped process $X_t := (B_t - 2)^{-1}$ (note $X_0 = -\frac{1}{2}$), we obtain
\[
X_t^{\tau_n} = -\frac{1}{2} - \int_0^{t\wedge\tau_n} (X_s)^2\, dB_s + \int_0^{t\wedge\tau_n} (X_s)^3\, ds .
\]
We see that on the set $\{t < \tau_n\}$,
\[
X_t = -\frac{1}{2} - \int_0^t X_s^2\, dB_s + \int_0^t X_s^3\, ds ,
\]
which is the required equation. Since $\{t < \tau\} = \cup_n \{t < \tau_n\}$, the equation holds on $\{t < \tau\}$.

Exercise 49 Let $(X_t)$ be a continuous local martingale starting from $0$. Prove that $\langle X^2, X\rangle_t = 2\int_0^t X_s\, d\langle X, X\rangle_s$.
Solution: Note that $X_t = \int_0^t dX_s$ and, by the Itô formula,
\[
X_t^2 = 2\int_0^t X_s\, dX_s + \langle X, X\rangle_t .
\]
Consequently, since the bracket term is of finite variation,
\[
\langle X^2, X\rangle_t = \Big\langle 2\int_0^\cdot X_s\, dX_s,\ \int_0^\cdot dX_s \Big\rangle_t = 2\int_0^t X_s\, d\langle X, X\rangle_s .
\]



8 Lévy’s Characterisation Theorem, SDEs


Exercise 50 Let $\sigma_k = (\sigma_k^1, \ldots, \sigma_k^d)$, $k = 1, \ldots, m$, and $b = (b^1, \ldots, b^d)$ be $C^2$ functions from $\mathbb{R}^d$ to $\mathbb{R}^d$. Let
\[
L = \frac{1}{2}\sum_{i,j=1}^d \Big(\sum_{k=1}^m \sigma_k^i \sigma_k^j\Big) \frac{\partial^2}{\partial x_i \partial x_j} + \sum_{l=1}^d b^l\, \frac{\partial}{\partial x_l} .
\]
Suppose that for some $c > 0$,
\[
|\sigma(x)|^2 \le c(1 + |x|^2) , \qquad \langle b(x), x\rangle \le c(1 + |x|^2) .
\]

(a) Let $f(x) = |x|^2 + 1$. Prove that $Lf \le af$ for some constant $a$.

(b) If $x_t = (x_t^1, \ldots, x_t^d)$ is a stochastic process with values in $\mathbb{R}^d$, satisfying the following relations:
\[
x_t^i = x_0^i + \sum_{k=1}^m \int_0^t \sigma_k^i(x_s)\, dB_s^k + \int_0^t b^i(x_s)\, ds ,
\]
and $f: \mathbb{R}^d \to \mathbb{R}$ is $C^2$, prove that $f(x_t) - f(x_0) - \int_0^t Lf(x_s)\, ds$ is a local martingale and give the semi-martingale decomposition of $f(x_t)$.

Solution: (a) The partial derivatives of $f$ are given by
\[
\frac{\partial f}{\partial x_i}(x) = 2x_i , \qquad \frac{\partial^2 f}{\partial x_i \partial x_j}(x) = 2\delta_{i,j} .
\]
Hence, we obtain the following bound:
\[
Lf(x) = \sum_{i=1}^d \sum_{k=1}^m (\sigma_k^i)^2 + 2\sum_{i=1}^d b^i x_i = |\sigma(x)|^2 + 2\langle b(x), x\rangle \le c f(x) + 2c f(x) = 3c f(x) ,
\]
which is the required bound.


(b) Firstly, $\langle B^i, B^j\rangle_s = s$ if $i = j$ and vanishes otherwise. Hence,
\[
\langle x^i, x^j\rangle_t = \Big\langle \sum_{k=1}^m \int_0^\cdot \sigma_k^i(x_s)\, dB_s^k,\ \sum_{l=1}^m \int_0^\cdot \sigma_l^j(x_s)\, dB_s^l \Big\rangle_t
= \sum_{k=1}^m \int_0^t \sigma_k^i(x_s)\, \sigma_k^j(x_s)\, ds .
\]
By the Itô formula,
\[
\begin{aligned}
f(x_t) &= f(x_0) + \sum_{i=1}^d \int_0^t \frac{\partial f}{\partial x_i}(x_s)\, dx_s^i + \frac{1}{2}\sum_{i,j=1}^d \int_0^t \frac{\partial^2 f}{\partial x_i \partial x_j}(x_s)\, d\langle x^i, x^j\rangle_s \\
&= f(x_0) + \sum_{k=1}^m \int_0^t \sum_{i=1}^d \frac{\partial f}{\partial x_i}(x_s)\, \sigma_k^i(x_s)\, dB_s^k \\
&\quad + \int_0^t \Big( \frac{1}{2}\sum_{i,j=1}^d \frac{\partial^2 f}{\partial x_i \partial x_j}(x_s) \sum_{k=1}^m \sigma_k^i(x_s)\,\sigma_k^j(x_s) + \sum_{i=1}^d \frac{\partial f}{\partial x_i}(x_s)\, b^i(x_s) \Big)\, ds .
\end{aligned}
\]
Thus $f(x_t) - f(x_0) - \int_0^t Lf(x_s)\, ds = \sum_{k=1}^m \int_0^t \sum_{i=1}^d \frac{\partial f}{\partial x_i}(x_s)\, \sigma_k^i(x_s)\, dB_s^k$ is a local martingale, and the semi-martingale decomposition of $f(x_t)$ is as given in the identity above.

Exercise 51 Let T be a bounded stopping time and (Bt ) a one dimensional Brow-
nian motion. Prove that (BT +s − BT , s ≥ 0) is a Brownian motion.
Solution: Let us denote Ws = BT +s − BT . It is obvious, that W0 = 0 and Wt has
continuous sample paths. Moreover, by the OST we have, for any s < t,

E[Wt |FT +s ] = E[BT +t |FT +s ] − BT = BT +s − BT = Ws .

Let $\mathcal{G}_t = \mathcal{F}_{T+t}$. Then $(W_t)$ is a $(\mathcal{G}_t)$-martingale. Next, we take $s < t$ and derive
\[
\begin{aligned}
E[W_t^2 \mid \mathcal{F}_{T+s}] &= E[B_{T+t}^2 + B_T^2 - 2B_T B_{T+t} \mid \mathcal{F}_{T+s}] \\
&= E[B_{T+t}^2 - (T+t) \mid \mathcal{F}_{T+s}] + (T+t) + B_T^2 - 2B_T B_{T+s} \\
&= \big(B_{T+s}^2 - (T+s)\big) + (T+t) + B_T^2 - 2B_T B_{T+s} \\
&= W_s^2 + t - s ,
\end{aligned}
\]
where we have used the fact that $B_{T+t}^2 - (T+t)$ is a martingale, together with the optional stopping theorem. This implies that $W_t^2 - t$ is a $(\mathcal{G}_t)$-martingale, and hence $\langle W, W\rangle_t = t$. Using the Lévy characterisation theorem, we conclude that $(W_t)$ is a $(\mathcal{G}_t)$-Brownian motion.

Exercise 52 Let {Bt , Wt1 , Wt2 } be independent one dimensional Brownian mo-
tions.

1. Let (xs ) be an adapted continuous stochastic process. Prove that (Wt ) de-
fined below is a Brownian motion,
Z t Z t
Wt = cos(xs )dWs1 + sin(xs )dWs2 .
0 0

2. Let sgn(x) = 1 if x > 0 and sgn(x) = −1 if x ≤ 0. Prove that (Wt ) is a


Brownian motion if it satisfies the following relation
Z t
Wt = sgn(Ws )dBs .
0

3. Prove that the process (Xt , Yt ) is a Brownian motion if they satisfy the fol-
lowing relations,
Z t Z t
1
Xt = 1Xs >Ys dWs + 1Xs ≤Ys dWs2
0 0
Z t Z t
Yt = 1Xs ≤Ys dWs1 + 1Xs >Ys dWs2 .
0 0

Solution: (1) The process $(W_t)$ is a continuous local martingale with quadratic variation
\[
\langle W\rangle_t = \int_0^t \big( (\cos(x_s))^2 + (\sin(x_s))^2 \big)\, ds = t .
\]
Hence, by the Lévy characterisation theorem, we conclude that $(W_t)$ is a Brownian motion.
(2) This can be shown in the same way, since $(\mathrm{sgn}(W_s))^2 = 1$.
(3) The processes $(X_t)$ and $(Y_t)$ are continuous local martingales, and their quadratic variations are
\[
\langle X\rangle_t = \int_0^t \big( 1_{X_s > Y_s}^2 + 1_{X_s \le Y_s}^2 \big)\, ds = t , \qquad
\langle Y\rangle_t = \int_0^t \big( 1_{X_s \le Y_s}^2 + 1_{X_s > Y_s}^2 \big)\, ds = t .
\]
The bracket process is equal to
\[
\langle X, Y\rangle_t = \int_0^t 1_{X_s > Y_s}\, 1_{X_s \le Y_s}\, ds + \int_0^t 1_{X_s \le Y_s}\, 1_{X_s > Y_s}\, ds = 0 ,
\]
where we have used the fact that $1_{X_s > Y_s}\, 1_{X_s \le Y_s} = 0$. The claim now follows from the Lévy characterisation theorem.
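The quadratic variation computation in part (1) is easy to reproduce numerically. The sketch below (Python/NumPy; the choice $x_s = W_s^1$ for the adapted angle process, the seed and the tolerance are my own) builds the increments of $W$ from left-endpoint (adapted) evaluations of the integrands and checks $\langle W\rangle_1 \approx 1$, which is the hypothesis the Lévy characterisation theorem needs.

```python
import numpy as np

rng = np.random.default_rng(7)
steps, t = 100_000, 1.0
dt = t / steps
dW1 = rng.normal(0.0, np.sqrt(dt), steps)
dW2 = rng.normal(0.0, np.sqrt(dt), steps)
x = np.concatenate([[0.0], np.cumsum(dW1)])[:-1]  # adapted integrand (left points)
dW = np.cos(x) * dW1 + np.sin(x) * dW2            # increments of W
qv = np.sum(dW**2)                                # <W>_1, close to 1
print(qv)
```

Any other adapted continuous choice of $(x_s)$ gives the same unit quadratic variation, which is exactly the point of the exercise.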

Exercise 53 Let $t \in [0, 1)$. Define $x_0 = 0$ and
\[
x_t = (1 - t)\int_0^t \frac{1}{1-s}\, dB_s .
\]

(1) Prove that
\[
E\Big(\int_0^t \frac{dB_s}{1-s}\Big)^2 = \frac{t}{1-t} .
\]

(2) Prove that
\[
\Big\langle \int_0^\cdot \frac{dB_s}{1-s} \Big\rangle_t = \frac{t}{1-t} .
\]
Define $A(t) = \frac{t}{1-t}$. Then $A^{-1}(r) = \frac{r}{r+1}$.

(3) Define
\[
W_r = \int_0^{A^{-1}(r)} \frac{dB_s}{1-s} , \qquad 0 \le r < \infty .
\]
Let $\mathcal{G}_r = \mathcal{F}_{A^{-1}(r)}$. Prove that $(W_r)$ is a $(\mathcal{G}_r)$-martingale.

(4) For a standard one dimensional Brownian motion $B_t$, $\lim_{r\to 0} r B_{1/r} = 0$. Use this to prove that $\lim_{t\to 1} x_t = 0$.

(5) Prove that $x_t$ solves
\[
x_t = B_t - \int_0^t \frac{x_s}{1-s}\, ds .
\]

(6) Prove that $\int_0^t \frac{x_s}{1-s}\, ds$ is of finite total variation on $[0,1]$.
Hint: it is sufficient to prove that
\[
\int_0^1 \frac{|x_s|}{1-s}\, ds < \infty
\]
almost surely.

Solution: (1) By the Itô isometry we obtain
\[
E\Big(\int_0^t \frac{dB_s}{1-s}\Big)^2 = \int_0^t \frac{ds}{(1-s)^2} = \frac{t}{1-t} .
\]
(2) We have
\[
\Big\langle \int_0^\cdot \frac{dB_s}{1-s} \Big\rangle_t = \int_0^t \frac{d\langle B\rangle_s}{(1-s)^2} = \frac{t}{1-t} .
\]

(3) It is obvious that $(W_r)$ is integrable and adapted to $(\mathcal{G}_r)$. Moreover, for any $t > s$, we have
\[
E[W_t \mid \mathcal{G}_s] = E\Big[\int_0^{t/(t+1)} \frac{dB_r}{1-r} \,\Big|\, \mathcal{F}_{s/(s+1)}\Big] = \int_0^{s/(s+1)} \frac{dB_r}{1-r} = W_s ,
\]
because $s/(s+1) < t/(t+1)$ and the Itô integral is a martingale with respect to $(\mathcal{F}_t)$. This shows that $(W_r)$ is a martingale with respect to $(\mathcal{G}_r)$.
(4) By (2), we can calculate
\[
\langle W\rangle_t = \frac{A^{-1}(t)}{1 - A^{-1}(t)} = t .
\]
By the Lévy characterisation theorem, we conclude that $(W_t)$ is a Brownian motion, so $\lim_{r\to 0} r W_{1/r} = 0$. Writing $t = A^{-1}(1/r) = \frac{1}{1+r}$, so that $t \to 1$ as $r \to 0$ and $x_t = (1-t)\, W_{A(t)} = \frac{r}{1+r}\, W_{1/r}$, this implies
\[
\lim_{t\to 1} x_t = \lim_{r\to 0} \frac{r}{1+r}\, W_{1/r} = 0 .
\]

(5) Since the function $s \mapsto \frac{1}{1-s}$ has finite total variation on $[0, c]$ for every $c < 1$, the integral $\int_0^t B_s\, d\big(\frac{1}{1-s}\big)$, for $t \le c$, can be defined in the Stieltjes sense; using integration by parts, we see that $W_{A(t)}$ is defined in the Stieltjes sense as well. We can use classical analysis to derive the following equalities:
\[
\begin{aligned}
\int_0^t \frac{x_s}{1-s}\, ds &= \int_0^t W_{A(s)}\, ds = \big[s\, W_{A(s)}\big]_0^t - \int_0^t s\, dW_{A(s)}
= t\int_0^t \frac{dB_s}{1-s} - \int_0^t \frac{s}{1-s}\, dB_s \\
&= t\int_0^t \frac{dB_s}{1-s} - \int_0^t \frac{dB_s}{1-s} + \int_0^t dB_s
= -x_t + B_t ,
\end{aligned}
\]
which is the claimed equality for any $t \le c$. Letting $c \to 1$, we conclude that the equality holds for any $t \in [0, 1)$.
(6) Since $x_s$ is defined as an integral of a non-random process with respect to Brownian motion, it has a normal distribution, whose parameters are easily seen to be $0$ and $s(1-s)$ (the latter follows from (1)). Thus, we apply Fubini's theorem to derive
\[
E\int_0^1 \frac{|x_s|}{1-s}\, ds = \int_0^1 \frac{E|x_s|}{1-s}\, ds = C\int_0^1 \frac{\sqrt{s(1-s)}}{1-s}\, ds < \infty ,
\]
for some constant $C > 0$. This implies that $\int_0^1 \frac{|x_s|}{1-s}\, ds < \infty$ almost surely. Defining the process $Y_t = \int_0^t \frac{x_s}{1-s}\, ds$ and taking any partition $0 = t_0 < t_1 < \ldots < t_n = 1$, we derive
\[
\sum_{i=0}^{n-1} |Y_{t_{i+1}} - Y_{t_i}| = \sum_{i=0}^{n-1} \Big|\int_{t_i}^{t_{i+1}} \frac{x_s}{1-s}\, ds\Big|
\le \sum_{i=0}^{n-1} \int_{t_i}^{t_{i+1}} \frac{|x_s|}{1-s}\, ds = \int_0^1 \frac{|x_s|}{1-s}\, ds < \infty ,
\]
from which the claim follows.
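The integral equation of part (5) can be checked on a simulated path. The sketch below (Python/NumPy; I stop at $T = 0.99$ to avoid the singularity of $\frac{1}{1-s}$ at the endpoint — the cutoff, step count and tolerance are my own choices) constructs $x_t = (1-t)\int_0^t \frac{dB_s}{1-s}$ and measures the discrepancy in $x_t + \int_0^t \frac{x_s}{1-s}\, ds = B_t$.

```python
import numpy as np

rng = np.random.default_rng(8)
steps, T = 100_000, 0.99          # stop short of the singular endpoint t = 1
dt = T / steps
s = np.linspace(0.0, T, steps + 1)
dB = rng.normal(0.0, np.sqrt(dt), steps)
B = np.concatenate([[0.0], np.cumsum(dB)])

stoch = np.concatenate([[0.0], np.cumsum(dB / (1.0 - s[:-1]))])
x = (1.0 - s) * stoch                               # x_t = (1-t) int_0^t dB/(1-s)
drift = np.concatenate([[0.0], np.cumsum(x[:-1] / (1.0 - s[:-1])) * dt])
err = np.max(np.abs(x + drift - B))                 # x_t + int x/(1-s) ds - B_t
print(err)
```

This is the standard Brownian bridge construction: the drift $-\frac{x_s}{1-s}$ pulls the path back to $0$ at time $1$.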

Problem Sheet 9: SDE and Girsanov


Exercise 54 Let us consider an SDE $dx_t = \sum_{k=1}^m \sigma_k(x_t)\, dB_t^k + \sigma_0(x_t)\, dt$ on $\mathbb{R}^d$ with infinitesimal generator $L$. Let $D$ be a bounded open subset of $\mathbb{R}^d$ with closure $\bar{D}$ and $C^2$ boundary $\partial D$. Let $x_0 \in D$. Suppose that there is a global solution to the SDE with initial value $x_0$ and denote by $\tau$ its first exit time from $D$. Suppose that $\tau < \infty$ almost surely. Let $g: D \to \mathbb{R}$ be a continuous function, and $u: \bar{D} \to \mathbb{R}$ be a solution to the Dirichlet problem $Lu = -g$ on $D$ and $u = 0$ on the boundary. Prove that $u(x_0) = E\int_0^\tau g(x_s)\, ds$. [Hint: Use a sequence of increasing stopping times $\tau_n$ with limit $\tau$.]
Solution: Let $D_n$ be an increasing sequence of open sets with $D_n \subset D_{n+1} \subset D$ and $\cup_n D_n = D$, and let $\tau_n$ be the first exit time of $x_t$ from $D_n$. By Itô's formula,
\[
u(x_{t\wedge\tau_n}) = u(x_0) + \int_0^{t\wedge\tau_n} Lu(x_s)\, ds + \sum_{k=1}^m \int_0^{t\wedge\tau_n} du\big(\sigma_k(x_s)\big)\, dB_s^k
= u(x_0) - \int_0^{t\wedge\tau_n} g(x_s)\, ds + \sum_{k=1}^m \int_0^{t\wedge\tau_n} du\big(\sigma_k(x_s)\big)\, dB_s^k .
\]
Since $u$, $g$ are continuous on $\bar{D}_n$, the stochastic integral is a martingale. We have
\[
E[u(x_{t\wedge\tau_n})] = u(x_0) - E\int_0^{t\wedge\tau_n} g(x_s)\, ds .
\]
Since $\tau_n \le \tau < \infty$, $\lim_{t\uparrow\infty} (t\wedge\tau_n) = \tau_n$ a.s. We take $t \to \infty$ followed by $n \to \infty$, and use the dominated convergence theorem to take the limit inside the expectation; we see that
\[
\lim_{n\to\infty}\lim_{t\to\infty} E[u(x_{t\wedge\tau_n})] = E[u(x_\tau)] = 0
\]
and
\[
\lim_{n\to\infty}\lim_{t\to\infty} E\int_0^{t\wedge\tau_n} g(x_s)\, ds = E\int_0^\tau g(x_s)\, ds .
\]
This completes the proof.

Exercise 55 A sample continuous Markov process on $\mathbb{R}^2$ is a Brownian motion on the hyperbolic space (the upper half plane model) if its infinitesimal generator is $\frac{1}{2}y^2\big(\frac{\partial^2}{\partial x^2} + \frac{\partial^2}{\partial y^2}\big)$. Prove that the solution to the following equation is a hyperbolic Brownian motion:
\[
dx_t = y_t\, dB_t^1 , \qquad dy_t = y_t\, dB_t^2 .
\]

[Hint: Show that if y0 > 0 then yt is positive and hence the SDE can be considered
to be defined on the upper half plane. Compute its infinitesimal generator L.]
Solution: Just observe that $y_t = y_0\, e^{B_t^2 - t/2}$, so $y_t > 0$ if $y_0 > 0$. Compute the generator by Itô's formula.

Exercise 56 Discuss the uniqueness and existence problem for the SDE
dxt = sin(xt )dBt1 + cos(xt )dBt2 .
Solution: The functions $\sin(x)$ and $\cos(x)$ are $C^1$ with bounded derivatives, and hence globally Lipschitz continuous. For each initial value there is a unique global strong solution.
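A solution can be simulated by the Euler-Maruyama scheme. The sketch below (Python/NumPy; scheme, step size and tolerance are my own assumptions) also illustrates a structural fact: since $\sin^2(x) + \cos^2(x) = 1$, the solution has quadratic variation $t$ and is therefore itself a Brownian motion by the Lévy characterisation theorem.

```python
import numpy as np

rng = np.random.default_rng(9)
steps, t = 50_000, 1.0
dt = t / steps
dB1 = rng.normal(0.0, np.sqrt(dt), steps)
dB2 = rng.normal(0.0, np.sqrt(dt), steps)

x = np.zeros(steps + 1)
for k in range(steps):   # Euler-Maruyama step for dx = sin(x)dB1 + cos(x)dB2
    x[k + 1] = x[k] + np.sin(x[k]) * dB1[k] + np.cos(x[k]) * dB2[k]

qv = np.sum(np.diff(x)**2)   # quadratic variation, close to t
print(qv)
```

The Lipschitz coefficients guarantee that this scheme converges strongly to the unique solution as the step size shrinks.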

Exercise 57 Let $(x_t)$ and $(y_t)$ be solutions to the following respective SDEs (in Stratonovich form),
Z t Z t
xt = x0 − ys ◦ dBs , yt = y0 + xs ◦ dBs .
0 0
Show that x2t + yt2 is independent of t.
Solution: Let us rewrite the SDEs in Itô form:
\[
x_t = x_0 - \int_0^t y_s\, dB_s - \frac{1}{2}\langle y, B\rangle_t , \qquad
y_t = y_0 + \int_0^t x_s\, dB_s + \frac{1}{2}\langle x, B\rangle_t .
\]
We can calculate the bracket processes in these expressions:
\[
\langle y, B\rangle_t = \Big\langle \int_0^\cdot x_s\, dB_s,\ B\Big\rangle_t = \int_0^t x_s\, ds , \qquad
\langle x, B\rangle_t = \Big\langle -\int_0^\cdot y_s\, dB_s,\ B\Big\rangle_t = -\int_0^t y_s\, ds .
\]
Moreover, the quadratic variations of $(x_t)$ and $(y_t)$ are
\[
\langle x\rangle_t = \int_0^t y_s^2\, ds , \qquad \langle y\rangle_t = \int_0^t x_s^2\, ds .
\]
Applying now the Itô formula to the function $f(x, y) = x^2 + y^2$, we obtain
\[
\begin{aligned}
f(x_t, y_t) &= f(x_0, y_0) + 2\int_0^t x_s\, dx_s + 2\int_0^t y_s\, dy_s + \langle x\rangle_t + \langle y\rangle_t \\
&= f(x_0, y_0) - 2\int_0^t x_s y_s\, dB_s - \int_0^t x_s\, d\langle y, B\rangle_s + 2\int_0^t x_s y_s\, dB_s + \int_0^t y_s\, d\langle x, B\rangle_s + \langle x\rangle_t + \langle y\rangle_t \\
&= f(x_0, y_0) - \int_0^t x_s^2\, ds - \int_0^t y_s^2\, ds + \langle x\rangle_t + \langle y\rangle_t = f(x_0, y_0) ,
\end{aligned}
\]
which is independent of $t$.
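The conservation of $x_t^2 + y_t^2$ is visible numerically if a Stratonovich-consistent scheme is used. The sketch below (Python/NumPy; the Heun predictor-corrector scheme, step size and tolerance are my own choices) integrates the pair of SDEs with a single driving noise and tracks the radius.

```python
import numpy as np

rng = np.random.default_rng(10)
steps, t = 50_000, 1.0
dt = t / steps
x, y = 1.0, 0.0
r0 = x**2 + y**2
for _ in range(steps):
    dB = rng.normal(0.0, np.sqrt(dt))
    xp, yp = x - y * dB, y + x * dB     # Euler predictor
    dx = -0.5 * (y + yp) * dB           # Heun (Stratonovich) corrector
    dy = 0.5 * (x + xp) * dB
    x, y = x + dx, y + dy
print(x**2 + y**2 - r0)
```

A plain Euler-Maruyama step would instead approximate the Itô interpretation, whose radius grows in time; the midpoint averaging is what realises the Stratonovich (chain-rule) dynamics, a rigid rotation by the random angle $B_t$.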

Exercise 58 1. Let $(h_t)$ be a deterministic real valued process with $\int_0^\infty (h_s)^2\, ds < \infty$. Let $(B_t)$ be an $\mathcal{F}_t$-Brownian motion on $(\Omega, \mathcal{F}, \mathcal{F}_t, P)$. Prove that
\[
\exp\Big(\int_0^t h_s\, dB_s - \frac{1}{2}\int_0^t (h_s)^2\, ds\Big)
\]
is a martingale.

2. Let $(h_t)$ be a bounded, continuous and adapted real valued stochastic process. Prove that
\[
\exp\Big(\int_0^t h_s\, dB_s - \frac{1}{2}\int_0^t (h_s)^2\, ds\Big)
\]
is a martingale.

3. Let $Q$, $P$ be two equivalent probability measures on $\mathcal{F}_t$ with
\[
\frac{dQ}{dP} = \exp\Big(\int_0^t h_s\, dB_s - \frac{1}{2}\int_0^t (h_s)^2\, ds\Big) .
\]
Define $\tilde{B}_t = B_t - \int_0^t h_s\, ds$ for $(h_s)$ given in part (1). Prove that $(\tilde{B}_t)$ is an $(\mathcal{F}_t)$-Brownian motion with respect to $Q$.
Solution: Let
\[
M_t := \exp\Big(\int_0^t h_s\, dB_s - \frac{1}{2}\int_0^t (h_s)^2\, ds\Big) .
\]
By Itô's formula, $M_t$ satisfies the equation $dx_t = h_t x_t\, dB_t$ with initial value $1$, and so $(M_t)$ is a local martingale. That it is a martingale follows from the following arguments.
First proof. We use Novikov's condition: if $E\exp\big(\frac{1}{2}\langle N, N\rangle_t\big)$ is finite then $\exp\big(N_t - \frac{1}{2}\langle N, N\rangle_t\big)$ is a martingale. In our case
\[
E\exp\Big(\frac{1}{2}\Big\langle \int_0^\cdot h_s\, dB_s \Big\rangle_t\Big) = E\exp\Big(\frac{1}{2}\int_0^t (h_s)^2\, ds\Big) < \infty
\]
in either case (1) or (2).
Second proof for part (1). Note that
\[
\langle M, M\rangle_t = \int_0^t (h_s)^2 (M_s)^2\, ds .
\]

Since $M_t - 1$ is a local martingale starting from $0$, we apply the Burkholder-Davis-Gundy inequality:
\[
E(M_t)^2 \le 2E(M_t - 1)^2 + 2 \le cE\langle M, M\rangle_t + 2 .
\]
Thus
\[
E\langle M, M\rangle_t \le 2\int_0^t (h_s)^2\, ds + c\int_0^t (h_s)^2\, E\langle M, M\rangle_s\, ds .
\]
By a version of Gronwall's inequality,
\[
E\langle M, M\rangle_t \le 2\Big(\int_0^t (h_s)^2\, ds\Big) \exp\Big(c\int_0^t (h_r)^2\, dr\Big) .
\]
This proves (1).
Second proof for part (2). From $M_t = 1 + \int_0^t h_s M_s\, dB_s$,
\[
E(M_t)^2 \le 2 + 2\int_0^t E(h_s M_s)^2\, ds .
\]
If $(h_s)$ is bounded, Gronwall's inequality gives $E(M_t)^2 \le 2e^{2|h|_\infty^2 t}$, and this proves that $(M_t)$ is an $L^2$-bounded martingale in case (2).
To prove question (3), let us define $N_t = \int_0^t h_s\, dB_s$. Then
\[
\frac{dQ}{dP} = \exp\Big(N_t - \frac{1}{2}\langle N\rangle_t\Big) ,
\]
and $\tilde{B}_t = B_t - \langle B, N\rangle_t$. First observe that the exponential martingale of $(N_t)$ is a martingale, c.f. part (1), and $Q$ is a probability measure, equivalent to $P$. By the Girsanov theorem, $(\tilde{B}_t)$ is an $(\mathcal{F}_t)$-local martingale with respect to $Q$. Since $\langle \tilde{B}\rangle_t = \langle B\rangle_t = t$, it follows from the Lévy characterisation theorem that $(\tilde{B}_t)$ is a Brownian motion with respect to $Q$.
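The change of measure in part (3) can be illustrated by Monte Carlo reweighting. The sketch below (Python/NumPy; I take the simplest deterministic case $h_s \equiv h$ constant — the sample size and tolerances are my own assumptions) checks that under the weights $Z = dQ/dP$, the shifted variable $\tilde{B}_t = B_t - ht$ has mean $0$ and variance $t$, as a $Q$-Brownian motion must.

```python
import numpy as np

rng = np.random.default_rng(11)
h, t, paths = 1.0, 1.0, 1_000_000
B = rng.normal(0.0, np.sqrt(t), paths)       # samples of B_t under P
Z = np.exp(h * B - 0.5 * h**2 * t)           # dQ/dP for the constant h_s = h
Bt = B - h * t                               # B~_t = B_t - int_0^t h ds
m1 = np.mean(Z * Bt)                         # E_Q[B~_t], near 0
m2 = np.mean(Z * Bt**2)                      # E_Q[B~_t^2], near t
print(m1, m2)
```

This is exactly the importance-sampling identity $E_Q[f(\tilde{B}_t)] = E_P[Z f(\tilde{B}_t)]$ that makes the Girsanov theorem useful in simulation.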