Chapter 1. Preliminaries
1. Let $S \le T$ be stopping times. For $A \in \mathcal{F}_S$ and $t \ge 0$, $A \cap \{T \le t\} = (A \cap \{S \le t\}) \cap \{T \le t\}$, since $\{T \le t\} \subseteq \{S \le t\}$. Since $A \cap \{S \le t\} \in \mathcal{F}_t$ and $\{T \le t\} \in \mathcal{F}_t$, $A \cap \{T \le t\} \in \mathcal{F}_t$. Thus $\mathcal{F}_S \subseteq \mathcal{F}_T$.
2. Let $\Omega = \mathbb{N}$ and $\mathcal{F} = \mathcal{P}(\mathbb{N})$ be the power set of the natural numbers. Let $\mathcal{F}_n = \sigma(\{2\}, \{3\}, \dots, \{n+1\})$ for each $n$. Then $(\mathcal{F}_n)_{n \ge 1}$ is a filtration. Let $S = 3 \cdot 1_{\{3\}} + 4 \cdot 1_{\{3\}^c}$ and $T \equiv 4$. Then $S \le T$ and
$$\{\omega : S(\omega) = n\} = \begin{cases} \{3\} & n = 3 \\ \{3\}^c & n = 4 \\ \emptyset & \text{otherwise} \end{cases} \qquad \{\omega : T(\omega) = n\} = \begin{cases} \Omega & n = 4 \\ \emptyset & \text{otherwise} \end{cases}$$
Hence $\{S = n\} \in \mathcal{F}_n$ and $\{T = n\} \in \mathcal{F}_n$ for all $n$, so $S$ and $T$ are both stopping times. However $\{\omega : T(\omega) - S(\omega) = 1\} = \{\omega : 1_{\{3\}}(\omega) = 1\} = \{3\} \notin \mathcal{F}_1$. Therefore $T - S$ is not a stopping time.
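The measurability claims above can be checked mechanically on a truncated model. The sketch below (the truncation of $\mathbb{N}$ to a finite $\Omega$ and all helper names are illustrative, not from the text) represents each $\mathcal{F}_n$ by its atoms and tests membership of the relevant events.

```python
# Finite stand-in for exercise 2: Omega = {1,...,10} replaces N (illustrative).
Omega = frozenset(range(1, 11))

def atoms(n):
    """Atoms of F_n = sigma({2}, {3}, ..., {n+1}) restricted to Omega."""
    singles = [frozenset({k}) for k in range(2, n + 2) if k in Omega]
    rest = Omega - frozenset(range(2, n + 2))
    return singles + ([rest] if rest else [])

def in_Fn(event, n):
    """An event lies in F_n iff it is a union of atoms of F_n."""
    ev = frozenset(event)
    return all(a <= ev or not (a & ev) for a in atoms(n))

S = {w: 3 if w == 3 else 4 for w in Omega}  # S = 3*1_{3} + 4*1_{3^c}
T = {w: 4 for w in Omega}                   # T == 4

# S and T are stopping times: {S = n} and {T = n} belong to F_n.
for n in range(1, 10):
    assert in_Fn({w for w in Omega if S[w] == n}, n)
    assert in_Fn({w for w in Omega if T[w] == n}, n)

# But {T - S = 1} = {3} is not in F_1 = sigma({2}).
assert not in_Fn({w for w in Omega if T[w] - S[w] == 1}, 1)
print("ok")
```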
3. Observe that $\{T_n \le t\} \in \mathcal{F}_t$ and $\{T_n < t\} \in \mathcal{F}_t$ for all $n \in \mathbb{N}$, $t \in \mathbb{R}_+$, since each $T_n$ is a stopping time and we assume the usual hypotheses. Then
(1) $\sup_n T_n$ is a stopping time since for $t \ge 0$, $\{\sup_n T_n \le t\} = \bigcap_n \{T_n \le t\} \in \mathcal{F}_t$.
(2) $\inf_n T_n$ is a stopping time since $\{\inf_n T_n < t\} = \bigcup_n \{T_n < t\} \in \mathcal{F}_t$.
(3) $\limsup_n T_n$ is a stopping time since $\limsup_n T_n = \inf_m \sup_{n \ge m} T_n$ (apply (1) and (2)).
(4) $\liminf_n T_n$ is a stopping time since $\liminf_n T_n = \sup_m \inf_{n \ge m} T_n$ (apply (1) and (2)).
4. $T$ is clearly a stopping time by exercise 3, since $T = \liminf_n T_n$. We have $\mathcal{F}_T \subseteq \mathcal{F}_{T_n}$ for all $n$ since $T \le T_n$, hence $\mathcal{F}_T \subseteq \bigcap_n \mathcal{F}_{T_n}$ by exercise 1. Conversely, pick a set $A \in \bigcap_n \mathcal{F}_{T_n}$. For all $n \ge 1$, $A \in \mathcal{F}_{T_n}$ and $A \cap \{T_n < t\} \in \mathcal{F}_t$. Since $T_n \downarrow T$,
$$A \cap \{T < t\} = A \cap \Big( \bigcup_n \{T_n < t\} \Big) = \bigcup_n \big( A \cap \{T_n < t\} \big) \in \mathcal{F}_t,$$
and under the usual hypotheses (right continuity of the filtration) this gives $A \in \mathcal{F}_T$. This implies $\bigcap_n \mathcal{F}_{T_n} \subseteq \mathcal{F}_T$ and completes the proof.
5. (a) By completeness of the $L^p$ space, $X \in L^p$. By Jensen's inequality, for $p > 1$,
$$E|M_t|^p = E|E(X \mid \mathcal{F}_t)|^p \le E\big[ E(|X|^p \mid \mathcal{F}_t) \big] = E|X|^p < \infty.$$
(b) By (a), $M_t \in L^p \subseteq L^1$. For $t \ge s \ge 0$, $E(M_t \mid \mathcal{F}_s) = E(E(X \mid \mathcal{F}_t) \mid \mathcal{F}_s) = E(X \mid \mathcal{F}_s) = M_s$ a.s., so $\{M_t\}$ is a martingale. Next, we show that $\{M_t\}$ is continuous. By Jensen's inequality, for $p > 1$,
$$E|M^n_t - M_t|^p = E|E(M^n_\infty - X \mid \mathcal{F}_t)|^p \le E|M^n_\infty - X|^p, \qquad t \ge 0. \tag{1}$$
By Doob's maximal inequality,
$$P\Big( \sup_t |M^n_t - M_t| > \epsilon \Big) \le \frac{1}{\epsilon^p}\, E\Big( \sup_t |M^n_t - M_t|^p \Big) \le \frac{1}{\epsilon^p} \Big( \frac{p}{p-1} \Big)^p \sup_t E|M^n_t - M_t|^p \to 0. \tag{2}$$
Therefore $M^n$ converges to $M$ uniformly in probability. There exists a subsequence $\{n_k\}$ such that $M^{n_k}$ converges uniformly to $M$ with probability 1. Then $M$ is continuous since, for almost all $\omega$, it is the limit of uniformly convergent continuous paths.
6. Let $p(n)$ denote the probability mass function of the Poisson distribution with parameter $\lambda t$, and assume $\lambda t$ is an integer as given. Since $E(N_t - \lambda t) = 0$,
$$E|N_t - \lambda t| = 2 \sum_{n=0}^{\lambda t} (\lambda t - n)\, p(n) = 2 e^{-\lambda t} \lambda t \Bigg( \sum_{n=0}^{\lambda t} \frac{(\lambda t)^n}{n!} - \sum_{n=0}^{\lambda t - 1} \frac{(\lambda t)^n}{n!} \Bigg) = 2 e^{-\lambda t} \frac{(\lambda t)^{\lambda t + 1}}{(\lambda t)!} = 2 e^{-\lambda t} \frac{(\lambda t)^{\lambda t}}{(\lambda t - 1)!}. \tag{3}$$
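The closed form in (3) is the classical mean absolute deviation of a Poisson variable with integer mean; it can be sanity-checked by summing the pmf directly (helper names below are illustrative, not from the text).

```python
import math

def poisson_mad(m):
    """E|X - m| for X ~ Poisson(m), by truncated pmf summation."""
    total, pmf = 0.0, math.exp(-m)  # pmf = P(X = 0)
    for n in range(m + 120):        # tail beyond m + 120 is negligible here
        total += abs(n - m) * pmf
        pmf *= m / (n + 1)          # advance to P(X = n + 1)
    return total

def closed_form(m):
    """2 e^{-m} m^m / (m - 1)!, valid for integer m >= 1."""
    return 2.0 * math.exp(-m) * m**m / math.factorial(m - 1)

for m in [1, 2, 5, 10]:
    assert abs(poisson_mad(m) - closed_form(m)) < 1e-9
print("ok")
```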
7. Since $N$ has stationary increments, for $t \ge s \ge 0$,
$$E(N_t - N_s)^2 = E N_{t-s}^2 = \mathrm{Var}(N_{t-s}) + (E N_{t-s})^2 = \lambda(t-s)\big[ 1 + \lambda(t-s) \big]. \tag{4}$$
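Equation (4) reduces to the Poisson second moment $E N_m^2 = m + m^2$ with $m = \lambda(t-s)$; a direct numerical check (parameter values illustrative):

```python
import math

def poisson_second_moment(m, terms=400):
    """E[X^2] for X ~ Poisson(m), by truncated pmf summation."""
    total, pmf = 0.0, math.exp(-m)  # pmf = P(X = 0)
    for n in range(terms):
        total += n * n * pmf
        pmf *= m / (n + 1)          # advance to P(X = n + 1)
    return total

# E(N_t - N_s)^2 = E N_{t-s}^2 = m(1 + m) with m = lam * (t - s)
for lam, t, s in [(2.0, 3.0, 1.0), (0.5, 2.0, 0.25)]:
    m = lam * (t - s)
    assert abs(poisson_second_moment(m) - m * (1 + m)) < 1e-9
print("ok")
```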
9. (a) Let $S^n_t = \sum_{i=1}^n \frac{1}{i}(N^i_t - t)$. For $m < n$,
$$E(S^n_t - S^m_t)^2 = \sum_{i=m+1}^n \frac{1}{i^2}\, \mathrm{Var}(N^i_t) = t \sum_{i=m+1}^n \frac{1}{i^2}.$$
Since $\sum_{i \ge 1} 1/i^2 < \infty$, $E(S^n_t - S^m_t)^2 \to 0$ as $n, m \to \infty$. Thus $\{S^n_t\}_n$ is Cauchy in $L^2$ and its limit $M_t$ is well defined for all $t \ge 0$.
(b) First, recall Kolmogorov's convergence criterion: suppose $\{X_i\}_{i \ge 1}$ is a sequence of independent random variables. If $\sum_i \mathrm{Var}(X_i) < \infty$, then $\sum_i (X_i - EX_i)$ converges a.s.
For all $i$ and $t$, $\Delta M^i_t = \frac{1}{i} \Delta N^i_t \ge 0$. By Fubini's theorem and the monotone convergence theorem,
$$\sum_{s \le t} \Delta M_s = \sum_{s \le t} \sum_{i=1}^\infty \Delta M^i_s = \sum_{i=1}^\infty \sum_{s \le t} \Delta M^i_s = \sum_{i=1}^\infty \sum_{s \le t} \frac{1}{i} \Delta N^i_s = \sum_{i=1}^\infty \frac{N^i_t}{i}. \tag{6}$$
Let $X_i = N^i_t/i$. Then $\mathrm{Var}(X_i) = t/i^2$ and $\sum_{i=1}^\infty \mathrm{Var}(X_i) < \infty$. Therefore Kolmogorov's criterion implies that $\sum_{i=1}^\infty (X_i - t/i)$ converges almost surely. On the other hand, $\sum_{i=1}^\infty EX_i = \sum_{i=1}^\infty t/i = \infty$. Therefore $\sum_{s \le t} \Delta M_s = \sum_{i=1}^\infty N^i_t / i = \infty$.
10. (a) Let $N_t = \sum_i \frac{1}{i}(N^i_t - t)$ and $L_t = \sum_i \frac{1}{i}(L^i_t - t)$. As we showed in exercise 9(a), $N$ and $L$ are well defined in the $L^2$ sense. Then by linearity of the $L^2$ space, $M$ is also well defined in the $L^2$ sense, since
$$M_t = \sum_i \frac{1}{i}(N^i_t - t) - \sum_i \frac{1}{i}(L^i_t - t) = N_t - L_t. \tag{7}$$
Both terms on the right side are martingales which change only by jumps, as shown in exercise 9(b). Hence $M_t$ is a martingale which changes only by jumps.
(b) First we show that, given two independent Poisson processes $N$ and $L$, $\sum_{s > 0} \Delta N_s \Delta L_s = 0$ a.s., i.e. $N$ and $L$ almost surely do not jump simultaneously. Let $\{T_n\}_{n \ge 1}$ be the sequence of jump times of $L$. Then $\sum_{s > 0} \Delta N_s \Delta L_s = \sum_n \Delta N_{T_n}$. We want to show that $\sum_n \Delta N_{T_n} = 0$ a.s. Since $\Delta N_{T_n} \ge 0$, it is enough to show that $E \Delta N_{T_n} = 0$ for each $n \ge 1$.
Fix $n \ge 1$ and let $\mu_{T_n}$ be the law of $T_n$ on $\mathbb{R}_+$. By conditioning,
$$E(\Delta N_{T_n}) = E\big[ E(\Delta N_{T_n} \mid T_n) \big] = \int_0^\infty E(\Delta N_{T_n} \mid T_n = t)\, \mu_{T_n}(dt) = \int_0^\infty E(\Delta N_t)\, \mu_{T_n}(dt), \tag{8}$$
where the last equality is by independence of $N$ and $T_n$. Since $\Delta N_t \in L^1$ and $P(\Delta N_t = 0) = 1$ for each fixed $t$ by problem 25, $E \Delta N_t = 0$, hence $E \Delta N_{T_n} = 0$.
Next we show that the previous claim holds even when there are countably many Poisson processes. Assume that there exist countably many independent Poisson processes $\{N^i\}_{i \ge 1}$. Let $A$ be the set on which at least two processes of $\{N^i\}_{i \ge 1}$ jump simultaneously, and let $\Omega_{ij}$ denote the set on which $N^i$ and $N^j$ do not jump simultaneously. Then $P(\Omega_{ij}) = 1$ for $i \ne j$ by the previous assertion. Since $A \subseteq \bigcup_{i > j} \Omega_{ij}^c$, $P(A) \le \sum_{i > j} P(\Omega_{ij}^c) = 0$. Therefore simultaneous jumps do not happen, almost surely.
Going back to the main proof: by (a) and the fact that the processes involved do not jump simultaneously, for $t > 0$,
$$\sum_{s \le t} |\Delta M_s| = \sum_{s \le t} |\Delta N_s| + \sum_{s \le t} |\Delta L_s| = \infty \quad \text{a.s.} \tag{9}$$
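The no-simultaneous-jump claim is easy to observe in simulation: two independent Poisson processes built from exponential interarrival times never share an arrival time. The helper below is an illustrative sketch (rates, horizon, and seed are arbitrary choices).

```python
import random

def arrival_times(lam, horizon, rng):
    """Jump times of a rate-lam Poisson process on [0, horizon]."""
    t, times = 0.0, []
    while True:
        t += rng.expovariate(lam)
        if t > horizon:
            return times
        times.append(t)

rng = random.Random(0)
N = arrival_times(1.0, 500.0, rng)
L = arrival_times(2.0, 500.0, rng)
# Independent jump-time sets are (almost surely) disjoint.
assert len(N) > 0 and len(L) > 0
assert not set(N) & set(L)
print("ok")
```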
11. Continuity: We use the notation adopted in Example 2 of section 4 (p. 33). Assume $E|U_1| < \infty$. By independence of the $U_i$, an elementary inequality, Markov's inequality, and the moments of the Poisson process (exercise 7),
$$\lim_{s \to t} P(|Z_t - Z_s| > \epsilon) = \lim_{s \to t} \sum_k P(|Z_t - Z_s| > \epsilon \mid N_t - N_s = k)\, P(N_t - N_s = k)$$
$$\le \lim_{s \to t} \sum_k k\, P(|U_1| > \epsilon/k)\, P(N_t - N_s = k) \le \frac{E|U_1|}{\epsilon} \lim_{s \to t} \sum_k k^2 P(N_t - N_s = k) = \frac{E|U_1|}{\epsilon} \lim_{s \to t} \big\{ \lambda(t-s)[1 + \lambda(t-s)] \big\} = 0. \tag{10}$$
Independent increments:
$$E\big[ e^{iu(Z_t - Z_s) + ivZ_s} \big] = E\big[ e^{iu \sum_{k=N_s+1}^{N_t} U_k + iv \sum_{k=1}^{N_s} U_k} \big] = E\Big[ E\big( e^{iu \sum_{k=N_s+1}^{N_t} U_k + iv \sum_{k=1}^{N_s} U_k} \mid \mathcal{F}_s \big) \Big]$$
$$= E\Big[ e^{iv \sum_{k=1}^{N_s} U_k}\, E\big( e^{iu \sum_{k=N_s+1}^{N_t} U_k} \mid \mathcal{F}_s \big) \Big] = E\big[ e^{iv \sum_{k=1}^{N_s} U_k} \big]\, E\big[ e^{iu \sum_{k=N_s+1}^{N_t} U_k} \big] = E\big[ e^{ivZ_s} \big] E\big[ e^{iu(Z_t - Z_s)} \big]. \tag{11}$$
This shows that $Z$ has independent increments.
Stationary increments: Since the $\{U_k\}_k$ are i.i.d. and independent of $N_t$,
$$E\big[ e^{iuZ_t} \big] = E\Big[ E\big( e^{iuZ_t} \mid N_t \big) \Big] = \sum_{n \ge 0} \Big( \int e^{iux} F(dx) \Big)^n \frac{(\lambda t)^n e^{-\lambda t}}{n!} = \exp\Big( -\lambda t \int \big[ 1 - e^{iux} \big] F(dx) \Big). \tag{12}$$
By (11) and (12), setting $v = -u$,
$$E\big[ e^{iu(Z_t - Z_s)} \big] = E\big[ e^{iuZ_t} \big] \big/ E\big[ e^{iuZ_s} \big] = \exp\Big( -\lambda(t-s) \int \big[ 1 - e^{iux} \big] F(dx) \Big) = E\big[ e^{iuZ_{t-s}} \big]. \tag{13}$$
Martingale property: First,
$$E|Z_t - \lambda t\, EU_1| \le E\big( E(|Z_t| \mid N_t) \big) + \lambda t\, E|U_1| \le \sum_{n=1}^\infty E\Big( \sum_{i=1}^n |U_i| \,\Big|\, N_t = n \Big) P(N_t = n) + \lambda t\, E|U_1| = E|U_1|\, EN_t + \lambda t\, E|U_1| = 2\lambda t\, E|U_1| < \infty. \tag{14}$$
For $t \ge s$, $E(Z_t \mid \mathcal{F}_s) = E(Z_t - Z_s + Z_s \mid \mathcal{F}_s) = Z_s + E(Z_{t-s})$. Since
$$EZ_t = E\big( E(Z_t \mid N_t) \big) = \sum_{n=1}^\infty E\Big( \sum_{i=1}^n U_i \,\Big|\, N_t = n \Big) P(N_t = n) = \lambda t\, EU_1, \tag{15}$$
we get $E(Z_t - \lambda t\, EU_1 \mid \mathcal{F}_s) = Z_s - \lambda s\, EU_1$ a.s., and $\{Z_t - \lambda t\, EU_1\}_{t \ge 0}$ is a martingale.
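A Monte Carlo check of the centering $EZ_t = \lambda t\, EU_1$ from (15). The exponential jump law and all parameter values below are illustrative choices, not from the text.

```python
import random

def compound_poisson_mean(lam, t, n_paths, seed):
    """Monte Carlo estimate of E Z_t for Z_t = sum_{k<=N_t} U_k, U_k ~ Exp(1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        s = rng.expovariate(lam)          # first arrival time
        z = 0.0
        while s <= t:
            z += rng.expovariate(1.0)     # jump size U ~ Exp(1), so E U_1 = 1
            s += rng.expovariate(lam)     # next interarrival
        total += z
    return total / n_paths

lam, t = 2.0, 3.0
est = compound_poisson_mean(lam, t, 200_000, seed=1)
# E Z_t = lam * t * E U_1 = 6; the estimate should be close
assert abs(est - lam * t * 1.0) < 0.1
print(round(est, 2))
```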
13. By the Lévy decomposition theorem and the hypothesis, $Z_t$ can be decomposed as
$$Z_t = \int_{\mathbb{R}} x\, N_t(\cdot, dx) + t \Big( \beta - \int_{|x| \le 1} x\, \nu(dx) \Big) = Z^0_t + t \beta', \tag{16}$$
where $Z^0_t = \int_{\mathbb{R}} x\, N_t(\cdot, dx)$ and $\beta' = \beta - \int_{|x| \le 1} x\, \nu(dx)$. By theorem 43, $E(e^{iuZ^0_t}) = \exp\big( -t \int_{\mathbb{R}} (1 - e^{iux})\, \nu(dx) \big)$, so $Z^0_t$ is a compound Poisson process (see problem 11). Its arrival rate (intensity) is $\lambda = \nu(\mathbb{R})$, since $E\big( \int_{\mathbb{R}} N_t(\cdot, dx) \big) = t \int_{\mathbb{R}} \nu(dx) = \lambda t$. Since $EZ^0_t = E\big( \int_{\mathbb{R}} x\, N_t(\cdot, dx) \big) = t \int_{\mathbb{R}} x\, \nu(dx)$ and $Z_t$ is a martingale with $EZ_t = 0$, we get $\beta' = -\int_{\mathbb{R}} x\, \nu(dx)$. It follows that $Z_t = Z^0_t - t \int_{\mathbb{R}} x\, \nu(dx)$ is a compensated compound Poisson process. Then problem 12 shows that $EU_1 = \frac{1}{\lambda} \int_{\mathbb{R}} x\, \nu(dx)$, and the distribution of the jumps is $\mu = \nu / \lambda$.
14. Suppose $EN_t < \infty$. First we show that $Z_t \in L^1$:
$$E|Z_t| \le E\Big( \sum_{i=1}^{N_t} |U_i| \Big) = \sum_{n=0}^\infty E\Big( \sum_{i=1}^n |U_i| \Big) P(N_t = n) = \sum_{n=0}^\infty \sum_{i=1}^n E|U_i|\, P(N_t = n) \le \sup_i E|U_i| \sum_{n=0}^\infty n\, P(N_t = n) = \sup_i E|U_i|\, EN_t < \infty. \tag{17}$$
Then
$$E(Z_t \mid \mathcal{F}_s) = E\Big( Z_s + \sum_{i=1}^\infty U_i 1_{\{s < T_i \le t\}} \,\Big|\, \mathcal{F}_s \Big) = Z_s + E\Big( \sum_{i=1}^\infty U_i 1_{\{s < T_i \le t\}} \,\Big|\, \mathcal{F}_s \Big) = Z_s \quad \text{a.s.},$$
since $E(U_i \mid \mathcal{F}_s) 1_{\{T_i > s\}} = 0$ a.s. Note that if we drop the assumption $EN_t < \infty$, $Z_t$ is not a martingale in general (though it is a local martingale).
15. By the Lévy decomposition theorem (theorem 42),
$$Z_t = \int_{|x| < 1} x \big( N_t(\cdot, dx) - t\nu(dx) \big) + \beta t + \sum_{0 < s \le t} \Delta X_s 1_{\{|\Delta X_s| \ge 1\}}. \tag{19}$$
The first term on the right side is a martingale (theorem 41). Then $\beta = -\int_{|x| \ge 1} x\, \nu(dx)$ since $Z_t$ is a martingale; note that $\beta$ is well defined by the condition $\sum_k \alpha_k^2 \lambda_k < \infty$. Hence $Z_t = \int_{\mathbb{R}} x \big( N_t(\cdot, dx) - t\nu(dx) \big)$. Let $A = \mathbb{R} \setminus \{\alpha_k\}_k$. Then $EN_1(\cdot, A) = \int_A \nu(dx) = 0$ by theorem 38, so the mass of $N(\cdot, dx)$ is concentrated on the countable set $\{\alpha_k\}_k$. Fix $k \ge 1$ and let $N^k_t$ denote $N_t(\cdot, \{\alpha_k\})$. Then $N^k_t$ is a Poisson process, and its rate is $\lambda_k$ since, by theorem 38, $EN^k_1 = \int_{\{\alpha_k\}} \nu(dx) = \lambda_k$. Therefore,
$$Z_t = \sum_k \alpha_k \big( N^k_t - \lambda_k t \big). \tag{20}$$
To verify that $Z_t \in L^2$, observe that
$$E\Big( \sum_{k=m}^n \alpha_k (N^k_t - \lambda_k t) \Big)^2 = t \sum_{k=m}^n \alpha_k^2 \lambda_k \to 0 \quad \text{as } n, m \to \infty,$$
since $\sum_{k=1}^\infty \alpha_k^2 \lambda_k < \infty$. Therefore $Z_t$ is a Cauchy limit in $L^2$; since the $L^2$ space is complete, $Z_t \in L^2$. \tag{21}
This shows $W_t$ is Gaussian. $W$ has stationary increments because $W_t - W_s = B_{1-s} - B_{1-t} \stackrel{d}{=} B_{t-s}$. Let $\mathcal{G}_t$ be the natural filtration of $W$. Then $\mathcal{G}_t = \sigma\big( (B_1 - B_{1-s}) : 0 \le s \le t \big)$. By the independent increments property of $B$, $\mathcal{G}_t$ is independent of $\mathcal{F}_{1-t}$. For $s > t$, $W_s - W_t = B_{1-t} - B_{1-s} \in \mathcal{F}_{1-t}$ and hence is independent of $\mathcal{G}_t$.
17. a) Fix $\epsilon > 0$ and $\omega$ such that $X_\cdot(\omega)$ has a sample path which is right continuous with left limits. Suppose there exist infinitely many jumps larger than $\epsilon$, at times $\{s_n\}_{n \ge 1} \subseteq [0, t]$. (If there are uncountably many such jumps, we can arbitrarily choose countably many of them.) Since $[0, t]$ is compact, there exists a subsequence $\{s_{n_k}\}_{k \ge 1}$ converging to a cluster point $s^* \in [0, t]$. Clearly we can take a further subsequence converging to $s^*$ monotonically, either from above or from below. To simplify notation, suppose $s_n \uparrow s^*$. (The other case, $s_n \downarrow s^*$, is similar.) By the existence of the left limit $X_{s^*-}$, there exists $\delta > 0$ such that $s \in (s^* - \delta, s^*)$ implies $|X_s - X_{s^*-}| < \epsilon/3$ and $|X_{s-} - X_{s^*-}| < \epsilon/3$. However, for $s_n \in (s^* - \delta, s^*)$,
$$|X_{s_n} - X_{s^*-}| = |X_{s_n-} - X_{s^*-} + \Delta X_{s_n}| \ge |\Delta X_{s_n}| - |X_{s_n-} - X_{s^*-}| > \frac{2\epsilon}{3}, \tag{22}$$
a contradiction.
R1
21. a) let a = (1 t)1 ( t Y (s)ds) and Let Mt = Y ()1(0,t) () + a1[t,1) (). For arbitrary
B B([0, 1]), { : Mt B} = ((0, t) {Y B}) ({a B} (t, 1)). (0, t) {Y B} (0, t) and
hence in Ft . {a B} (t, 1) is either (t, 1) or depending on B and in either case in Ft . Therefore
Mt is adapted.
Pick A Ft . Suppose A (0, t). Then clearly E(Mt : A) = E(Y : A). Suppose A (t, 1). Then
Z 1
1
E(Mt : A) = E(Y : A (0, t)) + E
Y (s)ds : (t, 1)
1t t
(24)
Z 1
=E(Y : A (0, t)) +
Y (s)ds = E(Y : A (0, t)) + E(Y : (t, 1)) = E(Y : A).
t
c) From b), $M_t(\omega) = Y(\omega) 1_{(0,t)}(\omega) + \frac{1}{1-\alpha} Y(t) 1_{[t,1)}(\omega)$. Fix $\omega \in (0,1)$. Since $0 < \alpha < 1/2$, $Y/(1-\alpha) > Y$ and $Y$ is increasing on $(0,1)$. Therefore,
$$\sup_{0 < t < 1} M_t(\omega) = \sup_{0 < t \le \omega} \frac{Y(t)}{1-\alpha} \vee \sup_{\omega < t < 1} Y(\omega) = \frac{Y(\omega)}{1-\alpha}. \tag{25}$$
For each $\omega \in (0,1)$, $M_t(\omega) = Y(\omega)$ for all $t > \omega$, and in particular $M_\infty(\omega) = Y(\omega)$. Therefore
$$\Big\| \sup_t M_t \Big\|_{L^2} = \frac{1}{1-\alpha} \| Y \|_{L^2} = \frac{1}{1-\alpha} \| M_\infty \|_{L^2}.$$
22. a) By a simple computation, with $Y(s) = s^{-1/2}$,
$$\frac{1}{1-t} \int_t^1 s^{-1/2}\, ds = \frac{2(1 - \sqrt{t})}{1-t} = \frac{2}{1 + \sqrt{t}} \le 2, \tag{26}$$
so $M_t$ is bounded by 2.
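The identity (26) and the bound by 2 can be confirmed numerically with a midpoint-rule integration (helper names are illustrative):

```python
import math

def averaged_integral(t, n=100_000):
    """(1/(1-t)) * integral_t^1 s^{-1/2} ds via the midpoint rule."""
    h = (1.0 - t) / n
    total = sum((t + (i + 0.5) * h) ** -0.5 for i in range(n)) * h
    return total / (1.0 - t)

for t in [0.01, 0.25, 0.5, 0.9]:
    assert abs(averaged_integral(t) - 2.0 / (1.0 + math.sqrt(t))) < 1e-6
    assert averaged_integral(t) <= 2.0   # the boundedness claim
print("ok")
```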
b) Since $T$ is a stopping time, $\{T > \epsilon\} \in \mathcal{F}_\epsilon$, and hence $\{T > \epsilon\} \subseteq (0, \epsilon]$ or $\{T > \epsilon\} \supseteq (\epsilon, 1)$, for any $\epsilon > 0$. Assume $\{T > \epsilon\} \subseteq (0, \epsilon]$ for all $\epsilon > 0$. Then $T(\omega) \le \epsilon$ on $(\epsilon, 1)$ for all $\epsilon > 0$, and it follows that $T \equiv 0$, a contradiction. Therefore there exists $\epsilon_0$ such that $\{T > \epsilon_0\} \supseteq (\epsilon_0, 1)$. Fix $\epsilon \in (0, \epsilon_0)$. Then $\{T > \epsilon\} \supseteq \{T > \epsilon_0\} \supseteq (\epsilon_0, 1)$, so $\{T > \epsilon\} \not\subseteq (0, \epsilon]$; since $T$ is a stopping time, this forces $\{T > \epsilon\} \supseteq (\epsilon, 1)$. Combining these observations, $\{T > \epsilon\} \supseteq (\epsilon, 1)$ for all $\epsilon \in (0, \epsilon_0)$: for each $\omega \in (0, \epsilon_0)$ and each $\epsilon < \omega$ we get $T(\omega) > \epsilon$, and letting $\epsilon \uparrow \omega$ we conclude $T(\omega) \ge \omega$ on $(0, \epsilon_0)$.
c) Using the notation in b), $T(\omega) \ge \omega$ on $(0, \epsilon_0)$, and hence $M_T(\omega) = \omega^{-1/2}$ on $(0, \epsilon_0)$. Therefore,
$$EM_T^2 \ge E\big( 1/\omega : (0, \epsilon_0) \big) = \int_0^{\epsilon_0} \frac{1}{\omega}\, d\omega = \infty.$$
Since this is true for all stopping times not identically equal to 0, $M$ cannot be a locally $L^2$ martingale.
d) By a), $|M_t(\omega)| \le 2$, so $M$ has bounded paths for each $\omega$. If $M_T 1_{\{T > 0\}}$ were a bounded random variable, then $M_T$ would be bounded as well, since $M_0 = 2$. However, from c), $M_T \notin L^2$ unless $T \equiv 0$ a.s., and hence $M_T 1_{\{T > 0\}}$ is unbounded.
23. Let $M$ be a positive local martingale and $\{T_n\}$ its fundamental sequence. Then for $t \ge s \ge 0$, $E(M_{t \wedge T_n} 1_{\{T_n > 0\}} \mid \mathcal{F}_s) = M_{s \wedge T_n} 1_{\{T_n > 0\}}$. Applying Fatou's lemma as $n \to \infty$, $E(M_t \mid \mathcal{F}_s) \le M_s$ a.s. Therefore $M$ is a supermartingale.
Let
$$\mathcal{C} = \Big\{ \bigcap_{j=1}^n \{Z_{t_j} \in B_j\} : n \in \mathbb{N},\ t_j \in [0, \infty),\ B_j \in \mathcal{B}(\mathbb{R}) \Big\}. \tag{30}$$
Then $\mathcal{C}$ is a $\pi$-system such that $\sigma(\mathcal{C}) \vee \mathcal{N} = \mathcal{G}_\infty$. Therefore, by Dynkin's theorem, provided that $\mathcal{C} \subseteq \mathcal{L}$, we get $\sigma(\mathcal{C}) \subseteq \mathcal{L}$ and thus $\mathcal{G}_\infty \subseteq \mathcal{L}$. For arbitrary $A \in \mathcal{G}_T \subseteq \mathcal{G}_\infty$, $1_A = E[1_A \mid \mathcal{G}_T] = E[1_A \mid \mathcal{H}] \in \mathcal{H}$ and hence $A \in \mathcal{H}$.
It remains to show that $\mathcal{C} \subseteq \mathcal{L}$. Fix $n \in \mathbb{N}$. Since $\mathcal{H} \supseteq \mathcal{G}_{T-}$, it suffices to show that
$$E\Big[ \prod_{j=1}^n 1_{\{Z_{t_j} \in B_j\}} \,\Big|\, \mathcal{G}_T \Big] \in \mathcal{H}. \tag{31}$$
$\mathcal{P}$ is called the predictable $\sigma$-algebra. Its definition and properties will be discussed in chapter 3.
With $t_0 = 0$ and $t_{n+1} = \infty$,
$$E\Big[ \prod_{j=1}^n 1_{\{Z_{t_j} \in B_j\}} \,\Big|\, \mathcal{G}_T \Big] = \sum_{k=1}^{n+1} 1_{\{T \in [t_{k-1}, t_k)\}} \prod_{j < k} 1_{\{Z_{t_j} \in B_j\}}\, E\Big[ \prod_{j \ge k} 1_{\{Z_{t_j} \in B_j\}} \,\Big|\, \mathcal{G}_T \Big], \tag{32}$$
and by the strong Markov property each conditional expectation on the right is a Borel function evaluated at $Z_T$, i.e. of the form $E^{Z_T}[\,\cdot\,]$.
This verifies (31) and completes the proof. (This solution is due to Jason Swanson.)
c) Since $\mathcal{G}_{T-} \subseteq \mathcal{G}_T$ and $Z_T 1_{\{T < \infty\}} \in \mathcal{G}_T$, we have $\sigma(\mathcal{G}_{T-}, Z_T) \subseteq \mathcal{G}_T$ if $T < \infty$ a.s. To show the converse, observe that $Z_{t \wedge T} = Z_{(t \wedge T)-} + \Delta Z_T 1_{\{t \ge T\}}$. Since $Z_{(t \wedge T)-}$, $Z_T$, and $1_{\{t \ge T\}}$ are $\sigma(\mathcal{G}_{T-}, Z_T)$-measurable for all $t \ge 0$, so is $Z_{t \wedge T}$ for all $t \ge 0$. Therefore $\mathcal{G}_T = \sigma(\mathcal{G}_{T-}, Z_T)$.
d) Since $\mathcal{N} \subseteq \mathcal{G}_{T-}$, $Z_{T-} = Z_T$ a.s. on $\{T < \infty\}$ implies $\mathcal{G}_T = \sigma(\mathcal{G}_{T-}, Z_T) = \sigma(\mathcal{G}_{T-}, Z_{T-}) = \mathcal{G}_{T-}$.
25. Let $Z$ be a Lévy process. By definition, a Lévy process is continuous in probability, i.e. for all $t$, $\lim_n P(|Z_t - Z_{t-1/n}| > \epsilon) = 0$. Since the paths are càdlàg, $Z_t - Z_{t-1/n} \to \Delta Z_t$ pointwise, so for all $\epsilon > 0$, $t > 0$, $\{|\Delta Z_t| > \epsilon\} \subseteq \bigcup_n \bigcap_{m \ge n} \{|Z_t - Z_{t-1/m}| > \epsilon/2\}$. Therefore,
$$P(|\Delta Z_t| > \epsilon) \le \liminf_n P(|Z_t - Z_{t-1/n}| > \epsilon/2) = 0. \tag{33}$$
29. Let $M$ be a Lévy process and a local martingale. $M$ has a representation of the form
$$M_t = \sigma B_t + \int_{\{|x| \le 1\}} x \big[ N_t(\cdot, dx) - t\nu(dx) \big] + \beta t + \int_{\{|x| > 1\}} x\, N_t(\cdot, dx). \tag{34}$$
The first two terms are martingales. Therefore, without loss of generality, we can assume that
$$M_t = \beta t + \int_{\{|x| > 1\}} x\, N_t(\cdot, dx). \tag{35}$$
$M$ has only finitely many jumps on each interval $[0, t]$ by exercise 17(a). Let $\{T_n\}_{n \ge 1}$ be the sequence of jump times of $M$. Then $P(T_n < \infty) = 1$ for all $n$ and $T_n \uparrow \infty$. We can express $M$ as the sum of a compound Poisson process and a drift term (see the Example on p. 33):
$$M_t = \sum_{i \ge 1} U_i 1_{\{t \ge T_i\}} + \beta t. \tag{36}$$
Stopping at the first jump time $T_1$,
$$M^{T_1}_t = M_t 1_{\{t < T_1\}} + M_{T_1} 1_{\{t \ge T_1\}} = \beta(t \wedge T_1) + U_1 1_{\{t \ge T_1\}}, \tag{37}$$
and $M^{T_1}$ is again a local martingale with localizing sequence $\{S_n\}$, say. Therefore,
$$E|M_{S_n \wedge T_1}| = E\big| U_1 1_{\{S_n \ge T_1\}} + \beta(S_n \wedge T_1) \big| \ge E\big| U_1 1_{\{S_n \ge T_1\}} \big| - |\beta|\, E(S_n \wedge T_1)$$
$$= E\big[ E\big( |U_1| 1_{\{S_n \ge T_1\}} \mid \sigma(T_1) \vee \mathcal{N} \big) \big] - |\beta|\, E(S_n \wedge T_1) = E\big[ 1_{\{S_n \ge T_1\}} E\big( |U_1| \mid \sigma(T_1) \vee \mathcal{N} \big) \big] - |\beta|\, E(S_n \wedge T_1). \tag{39}$$
So that
$$Y_R \circ \theta_R(\omega) = \begin{cases} 1 & R \le t,\ Z_t < z - y \\ 0 & \text{otherwise} \end{cases} \qquad Y'_R \circ \theta_R(\omega) = \begin{cases} 1 & R \le t,\ Z_t > z + y \\ 0 & \text{otherwise} \end{cases} \tag{41, 42}$$
and
$$E^a Y_s \le E^a Y'_s, \qquad a \ge z,\ s < t. \tag{43}$$
By taking expectations,
$$P^0(T_z \le t,\ Z_t < z - y) = E\big( E^0(Y_R \circ \theta_R \mid \mathcal{F}_R) : R < \infty \big) = E\big( E^{Z_R} Y_R : R < \infty \big)$$
$$\le E\big( E^{Z_R} Y'_R : R < \infty \big) = E\big( E^0(Y'_R \circ \theta_R \mid \mathcal{F}_R) : R < \infty \big) = P^0(T_z \le t,\ Z_t > z + y) = P(Z_t > z + y). \tag{44}$$
31. We define $T_z$, $R$, $\theta(t)$ as in exercise 30. We let
$$Y_s(\omega) = \begin{cases} 1 & s \le t,\ Z_{t-s}(\omega) \ge z \\ 0 & \text{otherwise} \end{cases} \tag{45}$$
so that
$$E^a Y_s \ge \frac{1}{2}, \qquad a \ge z,\ s < t. \tag{46}$$
$$\lim_n E_Q\big[ |X_n - X| \wedge 1 \big] = \lim_n E_P\Big[ \big( |X_n - X| \wedge 1 \big) \frac{dQ}{dP} \Big] = 0,$$
and $|X_n - X| \wedge 1 \to 0$ in $L^1(Q)$ implies that $X_n \to X$ in $Q$-probability.
3. $B_t$ is a continuous local martingale by construction, and by independence of $X$ and $Y$,
$$[B, B]_t = \rho^2 [X, X]_t + (1 - \rho^2)[Y, Y]_t = t.$$
So by Lévy's theorem, $B$ is a standard one-dimensional Brownian motion. Moreover,
$$[X, B]_t = \rho t, \qquad [Y, B]_t = \sqrt{1 - \rho^2}\, t.$$
4. Assume that $f(0) = 0$. Let $M_t = B_{f(t)}$ and $\mathcal{G}_t = \mathcal{F}_{f(t)}$, where $B$ is a one-dimensional standard Brownian motion and $\{\mathcal{F}_t\}$ its corresponding filtration. Then
6. Pick an arbitrary $t_0 > 1$. Let $X^n_t = 1_{(t_0 - \frac{1}{n}, \infty)}(t)$ for $n \ge 1$, $X_t = 1_{[t_0, \infty)}(t)$, $Y_t = 1_{[t_0, \infty)}(t)$. Then $X^n$, $X$, $Y$ are finite variation processes and semimartingales, and $\lim_n X^n_t = X_t$ almost surely. But
$$\lim_n [X^n, Y]_{t_0} = 0 \ne 1 = [X, Y]_{t_0},$$
since the jump times of $X^n$ and $Y$ never coincide.
7. Observe that
$$[X^n, Z] = [H^n \cdot Y, Z] = H^n \cdot [Y, Z].$$
Note that $[[X, X], Y] \equiv 0$ since $X$ and $Y$ are continuous semimartingales and, in particular, $[X, X]$ is a finite variation process. As $n \to \infty$, $(f_n(X_0) - f(X_0)) Y_0 \to 0$ for arbitrary $\omega$. Since $[X, Y]$ has paths of finite variation on compacts, we can treat $f_n(X) \cdot [X, Y]$ and $f(X) \cdot [X, Y]$ as Lebesgue–Stieltjes integrals computed path by path. So fix $\omega$. Then as $n \to \infty$,
$$\Big| \int_0^t \big( f_n(X_s) - f(X_s) \big)\, d[X, Y]_s \Big| \le \sup_{0 \le s \le t} |f_n(B_s) - f(B_s)| \int_0^t d\big| [X, Y] \big|_s \to 0,$$
since
$$\sup_{0 \le s \le t} |f_n(B_s) - f(B_s)| = \sup_{\inf_{s \le t} B_s \le x \le \sup_{s \le t} B_s} |f_n(x) - f(x)| \to 0$$
by uniform convergence of $f_n$ to $f$ on compacts.
Let $\{T^n_i\}_i$ be a sequence of random partitions of $[0, t]$ tending to the identity. Then
$$[M, A] = M_0 A_0 + \lim_n \sum_i \big( M_{T^n_{i+1}} - M_{T^n_i} \big) \big( A_{T^n_{i+1}} - A_{T^n_i} \big) \le 0 + \lim_n \sup_i \big| M_{T^n_{i+1}} - M_{T^n_i} \big| \sum_i \big| A_{T^n_{i+1}} - A_{T^n_i} \big| = 0,$$
since $M$ is a continuous process, $A_0 = 0$, and $\sum_i |A_{T^n_{i+1}} - A_{T^n_i}| < \infty$ by hypothesis. Similarly,
$$[A, A] = A_0^2 + \lim_n \sum_i \big( A_{T^n_{i+1}} - A_{T^n_i} \big)^2 \le 0 + \lim_n \sup_i \big| A_{T^n_{i+1}} - A_{T^n_i} \big| \sum_i \big| A_{T^n_{i+1}} - A_{T^n_i} \big| = 0.$$
Therefore,
$$[X, X] = [M, M] + 2[M, A] + [A, A] = [M, M].$$
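A discretized illustration of the vanishing cross bracket: a simulated Brownian path (standing in for the continuous martingale part) and a smooth finite-variation path have Riemann-sum brackets $[B, A]$ and $[A, A]$ that are tiny on a fine grid. All concrete choices below (the sine path, grid size, seed) are illustrative.

```python
import math
import random

def brackets(n, seed):
    """Riemann-sum approximations of [B, A]_1 and [A, A]_1 on an n-step grid,
    where B is a simulated Brownian path and A_t = sin(2*pi*t) (smooth, FV)."""
    rng = random.Random(seed)
    dt = 1.0 / n
    cross = quad_a = 0.0
    t = 0.0
    for _ in range(n):
        dB = rng.gauss(0.0, math.sqrt(dt))
        dA = math.sin(2.0 * math.pi * (t + dt)) - math.sin(2.0 * math.pi * t)
        cross += dB * dA
        quad_a += dA * dA
        t += dt
    return cross, quad_a

cross, quad_a = brackets(200_000, seed=4)
assert abs(cross) < 0.01 and quad_a < 0.01
print("ok")
```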
Since this is true for all $t \in \mathbb{R}_+$ and almost all $\omega$, the claim is shown.
13. By definition, the stochastic integral is a continuous linear mapping $J_X : \mathbb{S}_{ucp} \to \mathbb{D}_{ucp}$ (Section 4). By continuity, $H^n \to H$ under the ucp topology implies $H^n \cdot X \to H \cdot X$ under the ucp topology.
14. Let $\bar{A} = 1 + A$ to simplify notation. Fix $\omega$ and let $Z = \bar{A}^{-1} \cdot X$, so that $Z_\infty(\omega) < \infty$ by hypothesis and $\bar{A} \cdot Z = X$. Applying integration by parts to $X$ and then dividing both sides by $\bar{A}$ yields
$$\frac{X_t}{\bar{A}_t} = Z_t - \frac{1}{\bar{A}_t} \int_0^t Z_s\, d\bar{A}_s.$$
Since $Z_\infty = \lim_t Z_t$ exists and is finite, for any $\epsilon > 0$ there exists $\tau$ such that $|Z_t - Z_\infty| < \epsilon$ for all $t \ge \tau$. Then for $t > \tau$,
$$\frac{1}{\bar{A}_t} \int_0^t Z_s\, d\bar{A}_s = \frac{1}{\bar{A}_t} \int_0^\tau Z_s\, d\bar{A}_s + \frac{1}{\bar{A}_t} \int_\tau^t (Z_s - Z_\infty)\, d\bar{A}_s + Z_\infty \frac{\bar{A}_t - \bar{A}_\tau}{\bar{A}_t}.$$
Let us evaluate the right hand side. As $t \to \infty$, the first term goes to 0 while the last term converges to $Z_\infty$ by hypothesis ($A_\infty = \infty$). By the construction of $\tau$, the second term is smaller than $\epsilon (\bar{A}_t - \bar{A}_\tau)/\bar{A}_t \le \epsilon$. Since this is true for all $\epsilon > 0$,
$$\lim_{t \to \infty} \frac{1}{\bar{A}_t} \int_0^t Z_s\, d\bar{A}_s = Z_\infty.$$
Thus $\lim_{t \to \infty} X_t / \bar{A}_t = 0$, and since $\bar{A}_t / A_t \to 1$, $\lim_{t \to \infty} X_t / A_t = 0$.
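The limit argument above is a continuous-time Kronecker lemma. Its discrete analogue is easy to verify numerically; the sequences below are illustrative choices, not from the text.

```python
import math

def kronecker_ratio(n):
    """(1/b_n) * sum_{k<=n} x_k with b_k = k and x_k = b_k * (-1)^k / sqrt(k),
    so that sum_k x_k / b_k is a convergent alternating series; by the
    Kronecker-type argument the ratio should tend to 0."""
    partial = 0.0
    for k in range(1, n + 1):
        partial += (-1) ** k * math.sqrt(k)
    return partial / n

assert abs(kronecker_ratio(200_000)) < 0.01
assert abs(kronecker_ratio(200_000)) < abs(kronecker_ratio(2_000))
print("ok")
```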
15. If $M_0 = 0$, then by theorem 42 there exists a Brownian motion $B$ such that $M_t = B_{[M,M]_t}$. By the law of the iterated logarithm, on $\{[M, M]_\infty = \infty\}$,
$$\lim_{t \to \infty} \frac{M_t}{[M, M]_t} = \lim_{t \to \infty} \frac{B_{[M,M]_t}}{[M, M]_t} = \lim_{s \to \infty} \frac{B_s}{s} = 0.$$
$$|X^n_t - X^n_s| = \Big| n \int_{t - 1/n}^t Y_u\, du - n \int_{s - 1/n}^s Y_u\, du \Big| \le 2n(t - s) \sup_{s - \frac{1}{n} \le u \le t} |Y_u|.$$
Since $X^n$ is constant on $[0, t]$ and $[1, \infty)$, let $\pi$ be a partition of $[t, 1]$ and $M = \sup_{s - \frac{1}{n} \le u \le 1} |Y_u| < \infty$. Then for each $n$, the total variation of $X^n$ is finite, since
$$\sum_i \big| X^n_{t_{i+1}} - X^n_{t_i} \big| \le 2nM \sum_i (t_{i+1} - t_i) = 2nM(1 - t).$$
Hence
$$Z^n_t = \exp\Big( X^n_t - \frac{1}{2} [X^n, X^n]_t \Big) = \exp(X^n_t),$$
since $X^n$ is a continuous finite variation process. On the other hand, $Z_t = \exp(B_t - t/2)$ and
$$\lim_n Z^n_t = \exp(B_t) \ne Z_t.$$
19. $A^n$ is clearly càdlàg and adapted. By the periodicity and symmetry of the sine function, the total variation of $A^n$ satisfies
$$\int \big| dA^n_s \big| = \frac{1}{n} \int \big| d\sin(ns) \big| = \int \big| d\sin(s) \big|,$$
which is finite and does not depend on $n$. Since the total variation is finite, $A^n$ is a semimartingale. On the other hand,
$$\lim_n \sup_{0 \le t} |A^n_t| = \lim_n \frac{1}{n} = 0.$$
By Itô's formula, away from the origin,
$$u(B_t) = u(B_0) + \sum_{i=1}^3 \int_0^t \partial_i u(B_s)\, dB^i_s + \frac{1}{2} \int_0^t \Delta u(B_s)\, ds = u(B_0) + \sum_{i=1}^3 \int_0^t \partial_i u(B_s)\, dB^i_s,$$
since $u(x) = 1/\|x\|$ is harmonic on $\mathbb{R}^3 \setminus \{0\}$, and $M_t = u(B_t)$ is a local martingale. This solves (a). Fix $1 \le \gamma \le 3$ and observe that $E^x(u(B_0)^\gamma) < \infty$. Let $p$, $q$ be positive numbers such that $1/p + 1/q = 1$, and fix $\epsilon \in (0, \|x\|)$. Then
$$E^x\big( u(B_t)^\gamma \big) = \int_{\mathbb{R}^3} \frac{1}{\|y\|^\gamma} \frac{1}{(2\pi t)^{3/2}} e^{-\frac{\|x-y\|^2}{2t}}\, dy$$
$$= \int_{B(0;1) \cap B^c(x;\epsilon)} \frac{1}{\|y\|^\gamma} \frac{e^{-\|x-y\|^2/2t}}{(2\pi t)^{3/2}}\, dy + \int_{\mathbb{R}^3 \setminus \{B(0;1) \cap B^c(x;\epsilon)\}} \frac{1}{\|y\|^\gamma} \frac{e^{-\|x-y\|^2/2t}}{(2\pi t)^{3/2}}\, dy$$
$$\le \sup_{y \in B(0;1) \cap B^c(x;\epsilon)} \frac{e^{-\|x-y\|^2/2t}}{(2\pi t)^{3/2}} \int_{B(0;1)} \frac{dy}{\|y\|^\gamma} + \Bigg( \int_{\mathbb{R}^3 \setminus \{B(0;1) \cap B^c(x;\epsilon)\}} \frac{dy}{\|y\|^{\gamma p}} \Bigg)^{1/p} \Bigg( \int_{\mathbb{R}^3} \bigg( \frac{e^{-\|x-y\|^2/2t}}{(2\pi t)^{3/2}} \bigg)^q dy \Bigg)^{1/q}.$$
Pick $p > 3/\gamma$ (so $\gamma p > 3$ and $q > 1$). Then the first term goes to 0 as $t \to \infty$; in particular, it is finite for all $t \ge 0$. The first factor in the second term is finite, while the second factor goes to 0 as $t \to \infty$, since
$$\Bigg( \int_{\mathbb{R}^3} \frac{e^{-q\|x-y\|^2/2t}}{(2\pi t)^{3q/2}}\, dy \Bigg)^{1/q} = \Bigg( \frac{(2\pi t/q)^{3/2}}{(2\pi t)^{3q/2}} \Bigg)^{1/q} = \frac{1}{(2\pi t)^{3(q-1)/(2q)}\, q^{3/(2q)}} \to 0,$$
and this factor is finite for all $t > 0$. (b) is shown with $\gamma = 1$. (c) is shown with $\gamma = 2$. It also shows that
$$\lim_{t \to \infty} E^x\big( u(B_t)^2 \big) = 0.$$
The claim is
$$k! \int_s^\infty \int_{s_1}^\infty \cdots \int_{s_{k-1}}^\infty dA_{s_k} \cdots dA_{s_2}\, dA_{s_1} = (A_\infty - A_s)^k. \tag{48}$$
For $k = 1$, equation (48) clearly holds. Assume that it holds for $k$. Then
$$(k+1)! \int_s^\infty \int_{s_1}^\infty \cdots \int_{s_k}^\infty dA_{s_{k+1}} \cdots dA_{s_2}\, dA_{s_1} = (k+1) \int_s^\infty (A_\infty - A_{s_1})^k\, dA_{s_1} = (A_\infty - A_s)^{k+1}, \tag{49}$$
since $A$ is a non-decreasing continuous process, so $(A_\infty - A)^k \cdot A$ is indistinguishable from the Lebesgue–Stieltjes integral calculated path by path. Thus equation (48) holds for $k + 1$ and hence for all integers $k \ge 1$. Setting $s = 0$ yields the desired result, since $A_0 = 0$.
24. a) By integration by parts, using $\big[ \int_0^\cdot (1-u)^{-1}\, dB_u,\ (1-\cdot)^{-1} \big]_t = 0$ (the second process has finite variation),
$$\int_0^t \frac{1}{1-s}\, dB_s - \int_0^t \frac{B_1 - B_s}{(1-s)^2}\, ds = \int_0^t \frac{1}{1-s}\, dB_s - B_1 \int_0^t \frac{ds}{(1-s)^2} + \int_0^t \frac{B_s}{(1-s)^2}\, ds$$
$$= \int_0^t \frac{1}{1-s}\, dB_s - B_1 \int_0^t \frac{ds}{(1-s)^2} + \int_0^t B_s\, d\Big( \frac{1}{1-s} \Big) = \frac{B_t}{1-t} - B_1 \Big( \frac{1}{1-t} - 1 \Big) = \frac{B_t - B_1}{1-t} + B_1,$$
where we used $\int_0^t B_s\, d\big( \frac{1}{1-s} \big) = \frac{B_t}{1-t} - \int_0^t \frac{1}{1-s}\, dB_s$.
b) Let $X_t = (1-t) \int_0^t \frac{1}{1-s}\, dB_s$. By integration by parts again,
$$X_t = \int_0^t (1-s) \frac{1}{1-s}\, dB_s + \int_0^t \Big( \int_0^s \frac{1}{1-u}\, dB_u \Big)(-1)\, ds = B_t - \int_0^t \frac{X_s}{1-s}\, ds,$$
so $X$ solves the stochastic differential equation $dX_t = dB_t - \frac{X_t}{1-t}\, dt$, $X_0 = 0$, on $[0, 1)$.
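A pathwise consistency check: driving both the explicit formula $X_t = (1-t)\int_0^t (1-s)^{-1}\, dB_s$ and an Euler scheme for $dX = dB - X/(1-t)\, dt$ with the *same* Gaussian increments should produce nearly identical values (step count, horizon, and seed are illustrative choices).

```python
import math
import random

def bridge_paths(n, seed, T=0.99):
    """Return (Euler solution, discretized explicit formula) at time T,
    both driven by the same simulated Brownian increments."""
    rng = random.Random(seed)
    dt = T / n
    t, x_euler, integral = 0.0, 0.0, 0.0
    for _ in range(n):
        dB = rng.gauss(0.0, math.sqrt(dt))
        x_euler += dB - x_euler / (1.0 - t) * dt   # dX = dB - X/(1-t) dt
        integral += dB / (1.0 - t)                 # int_0^t dB_s / (1-s)
        t += dt
    return x_euler, (1.0 - t) * integral

x_euler, x_formula = bridge_paths(20_000, seed=3)
assert abs(x_euler - x_formula) < 0.05
print("ok")
```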
$$X_t = e^{-t} X_0 + \int_0^t e^{-(t-s)}\, dB_s.$$
28. By the law of the iterated logarithm, $\limsup_{t \to \infty} B_t / t = 0$ a.s. In particular, for any $\epsilon \in (0, 1/2)$ and almost all $\omega$, there exists $t_0(\omega)$ such that $t > t_0(\omega)$ implies $B_t(\omega)/t < 1/2 - \epsilon$. Then
$$\lim_{t \to \infty} \mathcal{E}(B)_t = \lim_{t \to \infty} \exp\Big[ t \Big( \frac{B_t}{t} - \frac{1}{2} \Big) \Big] \le \lim_{t \to \infty} e^{-\epsilon t} = 0 \quad \text{a.s.}$$
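The conclusion $\exp(B_t - t/2) \to 0$ is visible already from the terminal marginal $B_T \sim N(0, T)$: for large $T$ the exponential martingale is astronomically small with overwhelming probability (simulation sketch; parameters illustrative).

```python
import math
import random

def max_exp_martingale(T, draws, seed):
    """Max of exp(B_T - T/2) over independent draws of B_T ~ N(0, T)."""
    rng = random.Random(seed)
    return max(math.exp(rng.gauss(0.0, math.sqrt(T)) - T / 2.0)
               for _ in range(draws))

# At T = 400, exp(B_T - 200) exceeds 1e-10 only if B_T > 177: a ~8.8-sigma event.
assert max_exp_martingale(400.0, 1000, seed=2) < 1e-10
print("ok")
```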
29. $\mathcal{E}(X)^{-1} = \mathcal{E}(-X + [X, X])$ by the corollary of theorem 38. This implies that $\mathcal{E}(X)^{-1}$ is the solution of the stochastic differential equation
$$\mathcal{E}(X)^{-1}_t = 1 + \int_0^t \mathcal{E}(X)^{-1}_{s-}\, d\big( -X_s + [X, X]_s \big),$$
which is the desired result. Note that the continuity assumed in the corollary is not necessary if we instead assume $\Delta X_s \ne -1$, so that $\mathcal{E}(X)^{-1}$ is well defined.
30. a) By Itô's formula and continuity of $M$, $M_t = 1 + \int_0^t M_s\, dB_s$. $B$ is a locally square integrable local martingale and $M \in \mathbb{L}$, so by theorem 20, $M$ is also a locally square integrable local martingale.
b)
$$[M, M]_t = \int_0^t M_s^2\, ds = \int_0^t e^{2B_s - s}\, ds, \qquad E[M, M]_t = \int_0^t E\big[ e^{2B_s} \big] e^{-s}\, ds = \int_0^t e^s\, ds < \infty \quad \text{a.s.}$$
So $\widetilde{M}$ is a $\mathcal{G}$-martingale.
For each Borel set $B$,
$$\{X^n \in B\} = \bigcup_{k \in \mathbb{N}_+} \Big\{ \omega : X_{k/2^n}(\omega) \in B \Big\} \times \Big[ \frac{k-1}{2^n}, \frac{k}{2^n} \Big) \in \mathcal{F}_t \otimes \mathcal{B}([0, t]). \tag{50}$$
8. Let $\{S_n\}$ be an increasing sequence of stopping times with $S_n \uparrow T$. Then $\mathcal{F}_{S_n} \subseteq \mathcal{F}_T$ for every $n$, and $\bigvee_n \mathcal{F}_{S_n} \subseteq \mathcal{F}_T$. By the same argument as in exercise 7, $\mathcal{F}_{T-} \subseteq \bigvee_n \mathcal{F}_{S_n}$. Since $\mathcal{F}_{T-} = \mathcal{F}_T$ by hypothesis, we have the desired result. (Note: $\{S_n\}$ is not necessarily an announcing sequence, since $S_n = T$ is possible. Therefore $\mathcal{F}_{S_n} \ne \mathcal{F}_{T-}$ in general.)
9. Let $X$ be a Lévy process, $\mathbb{G}$ its natural filtration, and $T$ a stopping time. Then by exercise 24(c) in chapter 1, $\mathcal{G}_T = \sigma(\mathcal{G}_{T-}, X_T)$. Since $X$ jumps only at totally inaccessible times (a consequence of theorem 4), $X_{T-} = X_T$ for every predictable stopping time $T$. Therefore, if $T$ is a predictable time, $\mathcal{G}_T = \sigma(\mathcal{G}_{T-}, X_T) = \sigma(\mathcal{G}_{T-}, X_{T-}) = \mathcal{G}_{T-}$, since $X_{T-} \in \mathcal{G}_{T-}$. Therefore the completed natural filtration of a Lévy process is quasi-left-continuous.
10. As given in the hint, $[M, A]$ is a local martingale. Let $\{T_n\}$ be its localizing sequence, that is, $[M, A]^{T_n}$ is a martingale for all $n$. Then $E([X, X]^{T_n}_t) = E([M, M]^{T_n}_t) + E([A, A]^{T_n}_t)$. Since quadratic variation is a non-decreasing, non-negative process, first letting $n \to \infty$ and then $t \to \infty$, we obtain the desired result by the monotone convergence theorem.
11. By the Kunita–Watanabe inequality for square bracket processes, $([X+Y, X+Y])^{1/2} \le ([X, X])^{1/2} + ([Y, Y])^{1/2}$. It follows that $[X+Y, X+Y] \le 2([X, X] + [Y, Y])$. This implies that $[X+Y, X+Y]$ is locally integrable and that the sharp bracket process $\langle X+Y, X+Y \rangle$ exists. Since the sharp bracket process is the dual predictable projection (compensator) of the square bracket process, we obtain the polarization identity for sharp bracket processes from the one for square bracket processes. Namely,
$$\langle X, Y \rangle = \frac{1}{2} \big( \langle X+Y, X+Y \rangle - \langle X, X \rangle - \langle Y, Y \rangle \big). \tag{51}$$
The rest of the proof is exactly the same as that of theorem 25, chapter 2, with square bracket processes replaced by sharp bracket processes.
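The polarization identity (51) holds exactly for the discrete increment sums that approximate square brackets, which gives a quick sanity check (random increments, illustrative):

```python
import random

def bracket_polarization_gap(n, seed):
    """|(1/2)([X+Y] - [X] - [Y]) - [X, Y]| for discrete increment sequences."""
    rng = random.Random(seed)
    dX = [rng.gauss(0.0, 1.0) for _ in range(n)]
    dY = [rng.gauss(0.0, 1.0) for _ in range(n)]
    bXY = sum(a * b for a, b in zip(dX, dY))          # [X, Y]
    bXpY = sum((a + b) ** 2 for a, b in zip(dX, dY))  # [X+Y, X+Y]
    bX = sum(a * a for a in dX)                       # [X, X]
    bY = sum(b * b for b in dY)                       # [Y, Y]
    return abs(0.5 * (bXpY - bX - bY) - bXY)

assert bracket_polarization_gap(1000, seed=5) < 1e-6
print("ok")
```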
12. Since a continuous adapted process with finite variation always has locally integrable variation, without loss of generality we can assume that the value of $A$ changes only by jumps. Thus $A$ can be
Lemma. Let $X$ be a càdlàg adapted process which is predictable. Then there exists a sequence of strictly positive predictable times $\{T_n\}$ such that $[\Delta X \ne 0] \subseteq \bigcup_n [T_n]$.
Proof. Let $S^{1/k}_{n+1} = \inf\{ t > S^{1/k}_n : |X_{S^{1/k}_n} - X_t| > 1/k \text{ or } |X_{S^{1/k}_n} - X_{t-}| > 1/k \}$. Since $X$ is predictable, we can show by induction that each $S^{1/k}_n$ is a predictable stopping time. In addition, $[\Delta X \ne 0] \subseteq \bigcup_{n, k \ge 1} [S^{1/k}_n]$. Then by the previous lemma, there exists a sequence of predictable stopping times $\{T_n\}$ such that $\bigcup_{n, k \ge 1} [S^{1/k}_n] \subseteq \bigcup_n [T_n]$. Then $[\Delta X \ne 0] \subseteq \bigcup_n [T_n]$.
Proof of the main claim: Combining the two lemmas, we see that $\{\Delta A \ne 0\}$ is contained in the union of a sequence of disjoint graphs of predictable stopping times. Since $A_t = \sum_{s \le t} \Delta A_s$ is absolutely convergent for each $\omega$, it is invariant with respect to changes of the order of summation. Therefore $A_t = \sum_n \Delta A_{S_n} 1_{\{S_n \le t\}}$, where each $S_n$ is a predictable time. $\Delta A_{S_n} 1_{\{S_n \le t\}}$ is a predictable process, since $S_n$ is predictable and $\Delta A_{S_n} \in \mathcal{F}_{S_n-}$. This clearly implies that $|\Delta A_{S_n}| 1_{\{S_n \le t\}}$ is predictable. Then $C$ is predictable as well.
Note: As for the second lemma, the following more general claim holds.
Lemma. Let $X$ be a càdlàg adapted process. Then $X$ is predictable if and only if $X$ satisfies the following conditions: (1) there exists a sequence of strictly positive predictable times $\{T_n\}$ such that $[\Delta X \ne 0] \subseteq \bigcup_n [T_n]$; (2) for each predictable time $T$, $X_T 1_{\{T < \infty\}} \in \mathcal{F}_{T-}$.
13. Let $\{T_i\}_i$ be the jump times of a counting process and define $S_i = T_i - T_{i-1}$. Then by the corollary to theorem 23, the compensator $A$ is given by
$$A_t = \sum_{i \ge 1} \Big( \sum_{j=1}^i \phi_j(S_j) + \phi_{i+1}(t - T_i) \Big) 1_{\{T_i \le t < T_{i+1}\}}, \qquad \phi_i(s) = \int_0^s \frac{dF_i(u)}{1 - F_i(u-)}, \tag{52}$$
where $F_i(u) = P(S_i \le u)$. For each $\omega$, it is clear that if $F_i$ admits a density $f_i$, i.e. $dF_i(u) = f_i(u)\, du$, then $A_t$ is absolutely continuous. Conversely, if $F_k$ does not admit a density, then on $[T_{k-1}, T_k)$, $A_t$ is not absolutely continuous.
14. $N_t - 1_{\{t \ge T\}} = 0$ is a trivial martingale. Since the Doob–Meyer decomposition is unique, it suffices to show that $1_{\{t \ge T\}}$ is a predictable process; $1_{\{t \ge T\}}$ is predictable if and only if $T$ is a predictable time. Let $S_n = T(1 - 1/n)$. Then $T \in \mathcal{F}_0$ implies $S_n \in \mathcal{F}_0$. In particular, $\{S_n \le t\} \in \mathcal{F}_0 \subseteq \mathcal{F}_t$, and $S_n$ is a stopping time such that $S_n < T$ and $S_n \uparrow T$. This makes $\{S_n\}$ an announcing sequence for $T$. Thus $T$ is predictable.
15. By the uniqueness of the Doob–Meyer decomposition, it suffices to show that $N_t - \lambda t\, EU_1$ is a martingale, since $\lambda t\, EU_1$ is clearly a predictable process with finite variation. Let $C_t$ be the Poisson counting process associated with $N_t$. Then by the independent and stationary increments of the compound Poisson process,
$$E[N_t - N_s \mid \mathcal{F}_s] = E\Big[ \sum_{i=1}^{C_{t-s}} U_i \Big] = \sum_{k=1}^\infty E\Big( \sum_{i=1}^{C_{t-s}} U_i \,\Big|\, C_{t-s} = k \Big) P(C_{t-s} = k) = EU_1 \sum_{k=1}^\infty k\, P(C_{t-s} = k) = \lambda(t-s)\, EU_1. \tag{53}$$
By direct computation,
$$\lambda(t) = \lim_{h \downarrow 0} \frac{1}{h} P(t \le T < t + h \mid T \ge t) = \lim_{h \downarrow 0} \frac{1 - e^{-\lambda h}}{h} = \lambda. \tag{54}$$
# (t) = lim
(56)
18. There exists a disjoint sequence of predictable times $\{T_n\}$ such that $A = \{t > 0 : \Delta \langle M, M \rangle_t \ne 0\} \subseteq \bigcup_n [T_n]$. (See the discussion in the solution of exercise 12 for details.) In particular, all the jump times of $\langle M, M \rangle$ are predictable. Let $T$ be a predictable time such that $\Delta \langle M, M \rangle_T \ne 0$, and let $N = [M, M] - \langle M, M \rangle$. Then $N$ is a martingale with finite variation, since $\langle M, M \rangle$ is the compensator of $[M, M]$. Since $T$ is predictable, $E[N_T \mid \mathcal{F}_{T-}] = N_{T-}$. On the other hand, since $\{\mathcal{F}_t\}$ is a quasi-left-continuous filtration, $N_{T-} = E[N_T \mid \mathcal{F}_{T-}] = E[N_T \mid \mathcal{F}_T] = N_T$. This implies $\Delta \langle M, M \rangle_T = \Delta [M, M]_T = (\Delta M_T)^2$. Recall that $M$ itself is a martingale, so $M_{T-} = E[M_T \mid \mathcal{F}_{T-}] = E[M_T \mid \mathcal{F}_T] = M_T$ and $\Delta M_T = 0$. Therefore $\Delta \langle M, M \rangle_T = 0$ and $\langle M, M \rangle$ is continuous.
19. By theorem 36, $X$ is a special semimartingale. Then by theorem 34, it has a unique decomposition $X = M + A$ such that $M$ is a local martingale and $A$ is a predictable finite variation process. Let $X = N + C$ be an arbitrary decomposition of $X$. Then $M - N = C - A$, which implies that $A$ is the compensator of $C$. It suffices to show that a local martingale with finite variation has locally integrable variation. Set $Y = M - N$ and $Z_t = \int_0^t |dY_s|$. Let $\{S_n\}$ be a fundamental sequence for $Y$ and set $T_n = S_n \wedge n \wedge \inf\{t : Z_t > n\}$. Then $Y_{T_n} \in L^1$ (see the proof of theorem 38) and $Z_{T_n} \le n + |\Delta Y_{T_n}| \in L^1$. Thus $Y = M - N$ has locally integrable variation. Then $C$ is a sum of two processes with locally integrable variation, and the claim holds.
20. Let $\{T_n\}$ be an increasing sequence of stopping times such that $X^{T_n}$ is a special semimartingale, as in the statement. Then by theorem 37, $(X^{T_n})^*_t = \sup_{s \le t} |X^{T_n}_s|$ is locally integrable. Namely, there exists an increasing sequence of stopping times $\{R_n\}$ such that $((X^{T_n})^*_t)^{R_n}$ is integrable. Let $S_n = T_n \wedge R_n$. Then $\{S_n\}$ is an increasing sequence of stopping times such that $(X^*_t)^{S_n}$ is integrable. Then $X^*_t$ is locally integrable and, by theorem 37, $X$ is a special semimartingale.
21. Since $Q \sim P$, $dQ/dP > 0$ and $Z > 0$. Clearly $M \in L^1(Q)$ if and only if $MZ \in L^1(P)$. By the generalized Bayes formula, for $t \ge s$,
$$E_Q[M_t \mid \mathcal{F}_s] = \frac{E_P[M_t Z_t \mid \mathcal{F}_s]}{E_P[Z_t \mid \mathcal{F}_s]} = \frac{E_P[M_t Z_t \mid \mathcal{F}_s]}{Z_s}. \tag{57}$$
22. $Y_t = E[X_t \mid \mathcal{F}_t] = E[M_t \mid \mathcal{F}_t] + E[A_t \mid \mathcal{F}_t]$. Since $E[E[M_t \mid \mathcal{F}_t] \mid \mathcal{F}_s] = E[M_t \mid \mathcal{F}_s] = E[E[M_t \mid \mathcal{G}_s] \mid \mathcal{F}_s] = E[M_s \mid \mathcal{F}_s]$, the process $E[M_t \mid \mathcal{F}_t]$ is an $\{\mathcal{F}_t\}$-martingale. Therefore, for a partition $\{t_i\}$,
$$\mathrm{Var}(Y) = E\Big[ \sum_{i=1}^n \big| E\big[ E[A_{t_i} \mid \mathcal{F}_{t_i}] - E[A_{t_{i+1}} \mid \mathcal{F}_{t_{i+1}}] \,\big|\, \mathcal{F}_{t_i} \big] \big| \Big] = \sum_{i=1}^n E\big[ \big| E[A_{t_i} - A_{t_{i+1}} \mid \mathcal{F}_{t_i}] \big| \big]. \tag{59}$$
By conditional Jensen's inequality, for every partition and every $t_i$, $E[|E[A_{t_i} - A_{t_{i+1}} \mid \mathcal{F}_{t_i}]|] \le E[|E[A_{t_i} - A_{t_{i+1}} \mid \mathcal{G}_{t_i}]|]$. Therefore $\mathrm{Var}(X) < \infty$ (w.r.t. $\{\mathcal{G}_t\}$) implies $\mathrm{Var}(Y) < \infty$ (w.r.t. $\{\mathcal{F}_t\}$), and $Y$ is an $(\{\mathcal{F}_t\}, P)$-quasimartingale.
23. Lemma. Let $N$ be a local martingale. If $E([N, N]^{1/2}_\infty) < \infty$, or alternatively $N \in \mathcal{H}^1$, then $N$ is a uniformly integrable martingale.
This is a direct consequence of Fefferman's inequality (theorem 52 in chapter 4; see chapter 4, section 4, for the definition of $\mathcal{H}^1$ and related topics). We also take the liberty of assuming the Burkholder–Davis–Gundy inequality (theorem 48 in chapter 4) in the following discussion.
Once we accept this lemma, it suffices to show that the local martingale $[A, M]$ is in $\mathcal{H}^1$. By the Kunita–Watanabe inequality, $|[A, M]|^{1/2} \le [A, A]^{1/4} [M, M]^{1/4}$. Then by Hölder's inequality,
$$E\big( |[A, M]|^{1/2}_\infty \big) \le E\big( [A, A]^{1/2}_\infty \big)^{1/2}\, E\big( [M, M]^{1/2}_\infty \big)^{1/2}. \tag{61}$$
By hypothesis, $E([A, A]^{1/2}_\infty) < \infty$. By the BDG inequality and the fact that $M$ is a bounded martingale, $E([M, M]^{1/2}_\infty) \le c_1 E[M^*_\infty] < \infty$ for some positive constant $c_1$. This completes the proof.
24. Since we assume that the usual hypotheses hold throughout this book (see page 3), let $\mathcal{F}^0_t = \sigma\{T \wedge s : s \le t\}$ and redefine $\mathcal{F}_t$ by $\mathcal{F}_t = \mathcal{F}^0_{t+} \vee \mathcal{N}$. Since $\{T < t\} = \{T \wedge t < t\} \in \mathcal{F}^0_t$, $T$ is an $\mathbb{F}$-stopping time. Let $\mathbb{G} = \{\mathcal{G}_t\}$ be the smallest filtration such that $T$ is a stopping time, that is, the natural filtration of the process $X_t = 1_{\{T \le t\}}$. Then $\mathbb{G} \subseteq \mathbb{F}$.
For the converse, observe that $\{T \wedge s \in B\} = (\{T \le s\} \cap \{T \in B\}) \cup (\{T > s\} \cap \{s \in B\})$. Since $\{s \in B\}$ is $\Omega$ or $\emptyset$, $\{T \le s\} \cap \{T \in B\} \in \mathcal{G}_s$ and $\{T > s\} \in \mathcal{G}_s$, it follows that $T \wedge s$ (for $s \le t$) is $\mathcal{G}_t$-measurable. Hence $\mathcal{F}^0_t \subseteq \mathcal{G}_t$, which shows $\mathbb{F} \subseteq \mathbb{G}$. (Note: we assume that $\mathbb{G}$ satisfies the usual hypotheses as well.)
25. Recall that $\{(\omega, t) : \Delta A_t(\omega) \ne 0\}$ is contained in a union of graphs of disjoint predictable times, and in particular we can take each jump time of the predictable process $A$ to be a predictable time. (See the discussion in the solution of exercise 12.) For any predictable time $T$, since $E[\Delta Z_T \mid \mathcal{F}_{T-}] = 0$,
$$\Delta A_T = E[\Delta A_T \mid \mathcal{F}_{T-}] = E[\Delta Z_T - \Delta M_T \mid \mathcal{F}_{T-}] = 0 \quad \text{a.s.} \tag{62}$$
26. Without loss of generality, we can assume that $A_0 = 0$, since $E[\Delta A_0 \mid \mathcal{F}_{0-}] = E[\Delta A_0 \mid \mathcal{F}_0] = \Delta A_0 = 0$. For any finite valued stopping time $S$, $E[A_S] \le E[A_\infty]$ since $A$ is an increasing process, and $A_\infty \in L^1$ because $A$ is a process with integrable variation. Therefore $\{A_S\}_S$ is uniformly integrable and $A$ is of class (D). Applying theorem 11 (the Doob–Meyer decomposition) to $A$, we see that $M = A - \widetilde{A}$ is a uniformly integrable martingale, where $\widetilde{A}$ is the compensator of $A$. If $A$ is continuous, then for every predictable time $T$,
$$0 = E[\Delta M_T \mid \mathcal{F}_{T-}] = E[\Delta A_T \mid \mathcal{F}_{T-}] - E[\Delta \widetilde{A}_T \mid \mathcal{F}_{T-}] = 0 - E[\Delta \widetilde{A}_T \mid \mathcal{F}_{T-}] \quad \text{a.s.} \tag{63}$$
Therefore $Z$ is regular.
Conversely, suppose that $Z$ is regular and assume that $A$ is not continuous at time $T$. Since $A$ is predictable, so are $A_-$ and $\Delta A$; in particular, $T$ may be taken to be a predictable time. Then there exists an announcing sequence $T_n \uparrow T$. Since $Z$ is regular,
$$0 = \lim_n E[Z_T - Z_{T_n}] = \lim_n E[M_T - M_{T_n}] - \lim_n E[A_T - A_{T_n}] = -E[\Delta A_T]. \tag{65}$$
Since $A$ is an increasing process, $\Delta A_T \ge 0$; therefore $\Delta A_T = 0$ a.s. This is a contradiction. Thus $A$ is continuous.
31. Let $T$ be an arbitrary $\mathbb{F}$-stopping time and $\Lambda = \{\omega : X_T(\omega) \ne X_{T-}(\omega)\}$. Then by Meyer's theorem, $T = T_\Lambda \wedge T_{\Lambda^c}$, where $T_\Lambda$ is a totally inaccessible time and $T_{\Lambda^c}$ is a predictable time. By continuity of $X$, $\Lambda = \emptyset$ and $T_\Lambda = \infty$. Therefore $T = T_{\Lambda^c}$. It follows that all stopping times are predictable and there is no totally inaccessible stopping time.
32. By exercise 31, the standard Brownian space supports only predictable times, since Brownian motion is clearly a strong Markov Feller process with continuous paths. Since $\mathcal{O} = \sigma([S, T[\,:\, S, T \text{ stopping times},\ S \le T)$ and $\mathcal{P} = \sigma([S, T[\,:\, S, T \text{ predictable times},\ S \le T)$, if all stopping times are predictable then $\mathcal{O} = \mathcal{P}$.
35. $\mathcal{E}(M)_t \ge 0$ and $\mathcal{E}(M)_t = \exp\big[ B_{t \wedge \tau} - \frac{1}{2}(t \wedge \tau) \big] \le e$, so it is a bounded local martingale and hence a martingale. If $\mathcal{E}(M)$ were a uniformly integrable martingale, there would exist $\mathcal{E}(M)_\infty$ such that $E[\mathcal{E}(M)_\infty] = 1$. By the law of the iterated logarithm, $\exp\big( B_\tau - \frac{1}{2}\tau \big) 1_{\{\tau = \infty\}} = 0$ a.s. Then
$$E[\mathcal{E}(M)_\infty] = E\big[ \exp\big( 1 - \tfrac{1}{2}\tau \big) 1_{\{\tau < \infty\}} \big] < 1, \tag{66}$$
a contradiction.