
Poisson Process, Spike Train and All That

Zoran Nenadic
Division of Engineering and Applied Science
California Institute of Technology
Pasadena, CA 91125
[Figure 1: Counting process. The number of customers (vertical axis) is plotted against time (horizontal axis); starting from $t_0$, the counter jumps by one at each arrival time $t_1, t_2, t_3, t_4, \ldots$]
1 Counting Process
Let Fig. 1 be a graph of the customers who are entering a bank. Every time a customer comes, the counter is increased by one. The time of arrival of the $i$-th customer is $t_i$. Since the customers are coming at random, the sequence $\{t_1, t_2, \ldots, t_m\}$, denoted shortly by $\{t_i\}$, is a random sequence. Also, the number of customers who came in the interval $(t_0, t]$ is a random variable (process). Such a process is right continuous, as indicated by the graph in Fig. 1.
As is often the case in the theory of stochastic processes, we assume that the index set, i.e. the set from which $\{t_i\}$ takes its values, is $T = [0, \infty)$. Therefore, we have a sequence of non-negative random variables
$$0 \le t_0 < t_1 < t_2 < \cdots < t_m \qquad \text{as } m \to \infty.$$
Without loss of generality (WLOG), let $t_0 = 0$ and $N_0 = 0$; then
$$N_t = \max\{n : t_n \le t\}, \qquad T = [0, \infty),$$
is called a point process (counting process), and is denoted shortly by $\{N_t,\; t \ge 0\}$.
Let $T_n \triangleq t_n - t_{n-1}$ be the inter-arrival time; then the sequence of inter-arrival times $\{T_n,\; n \ge 1\}$ is another stochastic process.
A special case arises when $\{T_n,\; n \ge 1\}$ is a sequence of i.i.d. (independent, identically distributed) random variables; then the sequence $\{t_n\}$ is called a renewal process, and $\{N_t,\; t \ge 0\}$ is the associated renewal point process, sometimes also called a renewal process. Also, keep in mind that $t_n = T_1 + T_2 + \cdots + T_n$.
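As an illustration (not part of the original notes), the following Python sketch builds a counting process from i.i.d. exponential inter-arrival times; the rate value, the horizon and the function names are my own choices.

```python
import numpy as np

def sample_arrival_times(lam=30.0, t_max=1.0, rng=None):
    """Arrival times t_1 < t_2 < ... on (0, t_max] of a renewal process
    with i.i.d. exponential(lam) inter-arrival times T_n."""
    rng = np.random.default_rng() if rng is None else rng
    arrivals, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam)      # t_n = T_1 + T_2 + ... + T_n
        if t > t_max:
            break
        arrivals.append(t)
    return np.array(arrivals)

def N(t, arrivals):
    """Counting process N_t = max{n : t_n <= t} (right continuous)."""
    return np.searchsorted(arrivals, t, side="right")

arrivals = sample_arrival_times()
print("N_0.5 =", N(0.5, arrivals), ", N_1.0 =", N(1.0, arrivals))
```

With exponential inter-arrival times this renewal process is exactly the Poisson process defined next.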
Definition (Poisson process) A point process $\{N_t,\; t \ge 0\}$ is called a Poisson process if $N_0 = 0$ and $\{N_t\}$ satisfies the following conditions:
1. its increments are stationary and its non-overlapping increments are independent
2. $P(N_{t+\Delta t} - N_t = 1) = \lambda\,\Delta t + o(\Delta t)$
3. $P(N_{t+\Delta t} - N_t \ge 2) = o(\Delta t)$
Remarks
- $\{N_t,\; t \in T\}$; $t, s \in T$; $t > s$: $N_t - N_s$ is the increment of the stochastic process $N_t$.
- $N_{t+\Delta t} - N_t$ = the number of new arrivals during $(t, t + \Delta t]$.
- $\lambda = \text{const} > 0$, and $o(\Delta t)$ is understood as $o(\Delta t)/\Delta t \to 0$ when $\Delta t \to 0$.
- The Poisson process defined above is also known as a homogeneous Poisson process. In general $\lambda$ can be a time-dependent function $\lambda(t)$, in which case we are dealing with an inhomogeneous Poisson process. Finally, $\lambda$ itself can be a realization of a stochastic process $\lambda(t, \omega)$, in which case we have a so-called doubly stochastic Poisson process.
- In any case, the parameter $\lambda$ of a Poisson process is called the rate, and sometimes the intensity, of the process. Its dimension is [events]/[time] (e.g. spikes/sec in neuroscience).
Theorem Let $\{N_t,\; t \ge 0\}$ be a Poisson process; then
$$P(N_t = k) = \frac{(\lambda t)^k}{k!}\, e^{-\lambda t}, \qquad k = 0, 1, \ldots \qquad (1)$$
The expression on the left-hand side of (1) represents the probability of $k$ arrivals in the interval $(0, t]$.
Proof A generating function of a discrete random variable $X$ is defined via the following z-transform (recall that the moment generating function of a continuous random variable is defined through the Laplace transform):
$$G_X(z) = E[z^X] = \sum_{i=0}^{\infty} z^i p_i,$$
where $p_i = P(X = i)$. Let us assume that $X$ is a Poisson random variable with parameter $\lambda$; then
$$P(X = i) = \frac{\lambda^i}{i!}\, e^{-\lambda}, \qquad i = 0, 1, 2, \ldots$$
and
$$G_X(z) = \sum_{i=0}^{\infty} z^i\, \frac{\lambda^i}{i!}\, e^{-\lambda} = e^{\lambda(z-1)}. \qquad (2)$$
Going back to the Poisson process, define the generating function as
$$G_t(z) \triangleq E[z^{N_t}].$$
Then we can write
$$G_{t+\Delta t}(z) = E[z^{N_{t+\Delta t}}] = E[z^{N_t + N_{t+\Delta t} - N_t}] = E[z^{N_t}]\; E[z^{N_{t+\Delta t} - N_t}]$$
$$= G_t(z)\,\big[(1 - \lambda\,\Delta t + o(\Delta t))\, z^0 + (\lambda\,\Delta t + o(\Delta t))\, z^1 + o(\Delta t)\,(z^2 + \cdots)\big]$$
Furthermore
$$\frac{G_{t+\Delta t}(z) - G_t(z)}{\Delta t} = G_t(z)\,\Big[-\lambda + \frac{o(\Delta t)}{\Delta t} + \Big(\lambda + \frac{o(\Delta t)}{\Delta t}\Big) z + \frac{o(\Delta t)}{\Delta t}\,(z^2 + \cdots)\Big]$$
$$\lim_{\Delta t \to 0} \frac{G_{t+\Delta t}(z) - G_t(z)}{\Delta t} = G_t(z)\,[-\lambda + \lambda z] \quad \Longrightarrow \quad \frac{dG_t(z)}{dt} = G_t(z)\,\lambda\,(z - 1)$$
$$\log G_t(z) - \underbrace{\log G_0(z)}_{0} = \lambda t\,(z - 1) \quad \Longrightarrow \quad G_t(z) = e^{\lambda t\,(z-1)}$$
[Figure 2: Event description. Time axis marked at $0$, $s - \varepsilon$, $s + \varepsilon$ and $s + t - \varepsilon$: no arrivals on $(0, s - \varepsilon]$, one arrival on $(s - \varepsilon, s + \varepsilon]$, no arrivals on $(s + \varepsilon, s + t - \varepsilon]$.]
Comparing this result to (2), we conclude that $N_t$ is a Poisson random variable with parameter $\lambda t$.
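A quick numerical sanity check of (1), added here as an illustration (the rate, horizon and sample size are arbitrary choices of mine): simulate many paths by summing exponential gaps and compare the empirical distribution of $N_t$ with the Poisson pmf.

```python
import numpy as np
from scipy.stats import poisson

lam, t, n_paths = 30.0, 0.5, 100_000
rng = np.random.default_rng(0)

# For each path, count the arrivals in (0, t] by accumulating exponential gaps.
counts = np.empty(n_paths, dtype=int)
for p in range(n_paths):
    s, k = 0.0, 0
    while True:
        s += rng.exponential(1.0 / lam)
        if s > t:
            break
        k += 1
    counts[p] = k

for k in range(5):
    print(f"k={k}: empirical {np.mean(counts == k):.4f}   Poisson pmf {poisson.pmf(k, lam * t):.4f}")
```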
Theorem If $\{N_t,\; t \ge 0\}$ is a Poisson process and $T_n$ is the inter-arrival time between the $n$-th and $(n-1)$-th events, then $\{T_n,\; n \ge 1\}$ is a sequence of i.i.d. random variables with exponential distribution, with parameter $\lambda$.
Proof
$$P(T_1 > t) = P(N_t = 0) = e^{-\lambda t} \quad \Longrightarrow \quad T_1 \sim \text{exponential}(\lambda)$$
Need to show that $T_1$ and $T_2$ are independent and $T_2$ is also exponential.
$$P(T_2 > t \mid T_1 \in (s - \varepsilon, s + \varepsilon]) = \frac{P(T_2 > t,\; T_1 \in (s - \varepsilon, s + \varepsilon])}{P(T_1 \in (s - \varepsilon, s + \varepsilon])} \qquad (3)$$
The event $\{T_2 > t,\; T_1 \in (s - \varepsilon, s + \varepsilon]\}$ is a subset of the event described by Fig. 2, i.e.
$$P(T_2 > t,\; T_1 \in (s - \varepsilon, s + \varepsilon]) \le P(\underbrace{N_{s-\varepsilon} = 0}_{\text{no arrivals}},\; \underbrace{N_{s+\varepsilon} - N_{s-\varepsilon} = 1}_{\text{one arrival}},\; \underbrace{N_{s+t-\varepsilon} - N_{s+\varepsilon} = 0}_{\text{no arrivals}})$$
$$P(T_2 > t,\; T_1 \in (s - \varepsilon, s + \varepsilon]) \le P(T_1 \in (s - \varepsilon, s + \varepsilon])\; P(N_{s+t-\varepsilon} - N_{s+\varepsilon} = 0)$$
$$P(T_2 > t,\; T_1 \in (s - \varepsilon, s + \varepsilon]) \le P(T_1 \in (s - \varepsilon, s + \varepsilon])\; e^{-\lambda(t - 2\varepsilon)}$$
From (3),
$$P(T_2 > t \mid T_1 \in (s - \varepsilon, s + \varepsilon]) \le e^{-\lambda(t - 2\varepsilon)} \qquad (4)$$
[Figure 3: Event description. Time axis marked at $0$, $s - \varepsilon$, $s + \varepsilon$ and $s + t + \varepsilon$: no arrivals on $(0, s - \varepsilon]$, one arrival on $(s - \varepsilon, s + \varepsilon]$, no arrivals on $(s + \varepsilon, s + t + \varepsilon]$.]
Similarly, the event described by Fig. 3 is a subset of the event $\{T_2 > t,\; T_1 \in (s - \varepsilon, s + \varepsilon]\}$, therefore
$$P(N_{s-\varepsilon} = 0,\; N_{s+\varepsilon} - N_{s-\varepsilon} = 1,\; N_{s+t+\varepsilon} - N_{s+\varepsilon} = 0) \le P(T_2 > t,\; T_1 \in (s - \varepsilon, s + \varepsilon])$$
$$P(T_1 \in (s - \varepsilon, s + \varepsilon])\; P(N_{s+t+\varepsilon} - N_{s+\varepsilon} = 0) \le P(T_2 > t,\; T_1 \in (s - \varepsilon, s + \varepsilon])$$
$$P(T_1 \in (s - \varepsilon, s + \varepsilon])\; e^{-\lambda t} \le P(T_2 > t,\; T_1 \in (s - \varepsilon, s + \varepsilon])$$
From (3),
$$P(T_2 > t \mid T_1 \in (s - \varepsilon, s + \varepsilon]) \ge e^{-\lambda t} \qquad (5)$$
From (4) and (5), using the squeeze theorem ($\varepsilon \to 0$), it follows that
$$P(T_2 > t \mid T_1 = s) = e^{-\lambda t} \quad \Longrightarrow \quad f_{T_2 \mid T_1 = s}(t \mid s) = \lambda\, e^{-\lambda t}$$
Therefore, $T_2$ is independent of $T_1$, and $T_2$ is an exponentially distributed random variable.
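This can also be checked empirically. The sketch below (my own code, with arbitrary parameter values) generates arrivals by the infinitesimal-increment description of the definition (a Bernoulli($\lambda\Delta t$) trial per bin) and verifies that consecutive inter-arrival times have mean and standard deviation close to $1/\lambda$ and are nearly uncorrelated, as an i.i.d. exponential sequence requires.

```python
import numpy as np

lam, dt, t_max = 30.0, 1e-4, 500.0         # small bins approximate the definition
rng = np.random.default_rng(1)

# At most one arrival per bin, with probability lam*dt (o(dt) terms ignored).
hits = rng.random(int(t_max / dt)) < lam * dt
arrival_times = np.nonzero(hits)[0] * dt
gaps = np.diff(arrival_times)              # inter-arrival times T_n

print("mean(T_n):", gaps.mean(), " expected 1/lam =", 1.0 / lam)
print("std(T_n) :", gaps.std(),  " exponential law gives 1/lam as well")
print("corr(T_n, T_n+1):", np.corrcoef(gaps[:-1], gaps[1:])[0, 1], " expected ~0")
```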
Theorem
1. $E[N_t] = \lambda t$
2. $Var[N_t] = \lambda t$

Proof Recall that $G_t(z) = E[z^{N_t}]$; then
$$\left[\frac{dG_t(z)}{dz}\right]_{z=1} = E\big[N_t\, z^{N_t - 1}\big]_{z=1} = E[N_t]$$
$$E[N_t] = \big[\lambda t\, e^{\lambda t (z-1)}\big]_{z=1} = \lambda t$$
[Figure 4: Uniform bins. The interval $(0, T]$ is partitioned into $M$ bins of width $\Delta t$.]
Likewise
$$\left[\frac{d^2 G_t(z)}{dz^2}\right]_{z=1} = E[N_t (N_t - 1)]$$
$$E[N_t^2] = \big[(\lambda t)^2\, e^{\lambda t (z-1)}\big]_{z=1} + E[N_t] = (\lambda t)^2 + \lambda t$$
$$Var[N_t] = (\lambda t)^2 + \lambda t - (\lambda t)^2 = \lambda t$$
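The moment calculations above can be reproduced symbolically; here is a minimal SymPy sketch (my own addition, not part of the notes):

```python
import sympy as sp

z, lam, t = sp.symbols('z lambda t', positive=True)
G = sp.exp(lam * t * (z - 1))                 # G_t(z) = e^{lambda t (z - 1)}

EN  = sp.diff(G, z).subs(z, 1)                # E[N_t]
EN2 = sp.diff(G, z, 2).subs(z, 1) + EN        # E[N_t^2] = E[N_t(N_t - 1)] + E[N_t]
Var = sp.simplify(EN2 - EN**2)

print(EN)   # lambda*t
print(Var)  # lambda*t
```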
Theorem (Conditioning on the number of arrivals) Given that in the interval $(0, T]$ the number of arrivals is $N_T = n$, the $n$ arrival times are independent and uniformly distributed on $[0, T]$.
Proof Independence of the arrival times $t_1$, $t_2$, etc. follows directly from the independence of non-overlapping increments. In particular, let $t_1$ and $t_2$ be the arrival times of the first and second events; then
$$P(t_1 \in (0, s],\; t_2 \in (s, t]) = P(N_s = 1,\; N_t - N_s = 1) = P(N_s = 1)\; P(N_t - N_s = 1 \mid N_s = 1) = P(t_1 \in (0, s])\; P(t_2 \in (s, t])$$
Suppose that we know exactly one event happened in the interval $(0, T]$, and suppose the interval is partitioned into $M$ segments of length $\Delta t$, as shown in Fig. 4. Let $p_i$ be the probability of the event happening in the $i$-th bin; then $\sum_{i=1}^{M} p_i = 1$. From the definition of the Poisson process it follows that $p_i \propto \Delta t$, say $p_i = C(\lambda\,\Delta t + o(\Delta t))$. The constant $C$ is determined from
$$\sum_{i=1}^{M} C(\lambda\,\Delta t + o(\Delta t)) = 1 \quad \Longrightarrow \quad C = \frac{1}{M\lambda\,\Delta t + M\, o(\Delta t)} = \frac{1}{T\big(\lambda + \frac{o(\Delta t)}{\Delta t}\big)}$$
Let $t_1$ be the random variable corresponding to the time of arrival; then the probability density function (pdf) of $t_1$ can be defined as
$$f_{t_1}(t) = \lim_{\Delta t \to 0} \frac{p_i}{\Delta t} = \frac{1}{T}, \qquad i = 1, 2, \ldots, M, \quad \text{where } t = i\,\Delta t.$$
Therefore, $t_1$ is uniformly distributed on $[0, T]$.
Let $t_1$ and $t_2$ be the arrival times of two events, and suppose we know that exactly two events happened on $(0, T]$. Also assume that $t_1$ and $t_2$ represent mere labels of the events, not necessarily their order. Given that $t_1$ happened in the $j$-th bin, the probability of $t_2$ occurring in any bin of size $\Delta t$ is proportional to the size of that bin, i.e. $p_i \propto \Delta t$, except for the $j$-th bin, where $p_j \propto o(\Delta t)$. By rendering the bin size infinitesimal, we notice that the probability $p_i$ remains constant over all but one bin, the bin in which $t_1$ occurred, where $p_j = 0$. But this is a set of measure zero, so the cumulative sum over $p_i$ again gives rise to a uniform distribution on $(0, T]$.
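This conditioning property is also a practical simulation recipe (it is the method used for Fig. 6 later in these notes): draw the number of arrivals from a Poisson($\lambda T$) law and then scatter that many points uniformly on $(0, T]$. A minimal sketch, with parameter values of my own choosing:

```python
import numpy as np

def poisson_process_by_conditioning(lam, T, rng):
    """Arrival times on (0, T], sampled by conditioning on the number of arrivals."""
    n = rng.poisson(lam * T)                       # N_T ~ Poisson(lam * T)
    return np.sort(rng.uniform(0.0, T, size=n))    # given N_T = n, i.i.d. Uniform(0, T)

rng = np.random.default_rng(2)
spikes = poisson_process_by_conditioning(lam=30.0, T=1.0, rng=rng)
print(len(spikes), "arrivals; first few:", spikes[:5])
```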
Question What is the probability of observing $n$ events at instances $\tau_1, \tau_2, \ldots, \tau_n$ on the interval $[0, T]$?
Since the arrival times $t_1, t_2, \ldots, t_n$ are continuous random variables, the answer is 0. However, we can calculate the associated pdf as
$$f_{t_1 t_2 \cdots t_n}(\tau_1, \tau_2, \ldots, \tau_n) = \lim_{dt \to 0} \frac{P(t_1 \in (\tau_1, \tau_1 + dt], \ldots, t_n \in (\tau_n, \tau_n + dt],\; N_T = n)}{dt^n}$$
where
$$P(t_1 \in (\tau_1, \tau_1 + dt], \ldots, t_n \in (\tau_n, \tau_n + dt],\; N_T = n) =$$
$$= P(t_1 \in (\tau_1, \tau_1 + dt], \ldots, t_n \in (\tau_n, \tau_n + dt] \mid N_T = n)\; P(N_T = n)$$
$$= \left(\frac{dt}{T}\right)^{\!n} \frac{(\lambda T)^n}{n!}\, e^{-\lambda T} = \frac{\lambda^n\, dt^n}{n!}\, e^{-\lambda T}$$
$$\Longrightarrow \quad f_{t_1 t_2 \cdots t_n}(\tau_1, \tau_2, \ldots, \tau_n) = \frac{\lambda^n}{n!}\, e^{-\lambda T}$$
Question What is the power spectrum of a Poisson process?
It does not make sense to talk about the power spectrum of a Poisson process, since it is not a stationary process. In particular, the mean of a Poisson process is
$$E[N_t] = \lambda t$$
and its autocorrelation function is
$$R(t, s) \triangleq E[N_t N_s]$$
$$R(t, s)\big|_{t>s} = E[(N_t - N_s + N_s)\, N_s] = E[(N_t - N_s)\, N_s + N_s^2]$$
$$= E[N_t - N_s]\, E[N_s] + E[N_s^2] = \lambda(t - s)\,\lambda s + \lambda^2 s^2 + \lambda s = \lambda^2 t s + \lambda s$$
$$R(t, s)\big|_{t<s} = E[(N_s - N_t + N_t)\, N_t] = E[(N_s - N_t)\, N_t + N_t^2]$$
$$= E[N_s - N_t]\, E[N_t] + E[N_t^2] = \lambda(s - t)\,\lambda t + \lambda^2 t^2 + \lambda t = \lambda^2 t s + \lambda t$$
Since $R(t, s) \ne R(t - s)$, we conclude that $\{N_t,\; t \ge 0\}$ is not stationary (in the weak sense); therefore it does not make sense to talk about its power spectrum. Let us define the following stochastic process (Fig. 5):
$$S_t = \frac{dN_t}{dt} = \sum_i \delta(t - t_i) \qquad \text{(spike train)} \qquad (6)$$
[Figure 5: Spike train. A sample path of the counting process $N_t$ and the corresponding spike train $S_t$, with an impulse at each jump time of $N_t$.]
The fundamental lemma says that if $Y(t) = L\{X(t)\}$, where $L$ is a linear operator, then
$$E[Y(t)] = L\{E[X(t)]\}$$
Since differentiation is a linear operator, we have
$$E[S_t] = \frac{d(\lambda t)}{dt} = \lambda$$
Also, it can be shown using the theory of linear operators that
$$R_{SS}(t, s) = \frac{\partial}{\partial t}\left[\frac{\partial R_{NN}(t, s)}{\partial s}\right] = \begin{cases} \dfrac{\partial}{\partial t}\,[\lambda^2 t + \lambda] & t > s \\[6pt] \dfrac{\partial}{\partial t}\,[\lambda^2 t] & t < s \end{cases} = \frac{\partial}{\partial t}\Big[\lambda^2 t + \lambda \underbrace{U(t - s)}_{\text{Heaviside function}}\Big] = \lambda^2 + \lambda\,\delta(t - s)$$
Thus, $S_t$ is a WSS stochastic process (wide-sense stationary: $E[X(t)] = \text{const}$ and $R_{XX}(t, s) = R_{XX}(t - s)$), and it makes sense to define the power spectrum of such a process as the Fourier transform of its autocorrelation function, i.e.
$$P_S(\omega) = \mathcal{F}\{R_{SS}(\tau)\} = \int_{-\infty}^{\infty} R_{SS}(\tau)\, e^{-j\omega\tau}\, d\tau = \lambda + 2\pi\lambda^2\,\delta(\omega)$$
Therefore, the spike train $S_t = \sum_i \delta(t - t_i)$ of independent times $t_i$ behaves almost as white noise, since its power spectrum is flat over all frequencies, except for the spike at $\omega = 0$. The process $S_t$ defined by (6) is a simple version of what is known in the engineering literature as shot noise.
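To see the "almost white" spectrum numerically, one can bin a long simulated spike train and estimate the power spectrum of the binned signal with a periodogram; away from $\omega = 0$ it should hover around $\lambda$. The sketch below is my own (the bin width, record length and normalization are arbitrary choices, and the discrete estimate only approximates the continuous-time result):

```python
import numpy as np

lam, dt, t_max = 30.0, 1e-3, 200.0
rng = np.random.default_rng(3)

# Binned spike train: independent Poisson(lam*dt) counts per bin, divided by dt.
counts = rng.poisson(lam * dt, size=int(t_max / dt))
s = counts / dt

# Periodogram of the mean-removed signal; the flat level should be about lam.
s0 = s - s.mean()
psd = (np.abs(np.fft.rfft(s0)) ** 2) * dt / len(s)
freqs = np.fft.rfftfreq(len(s), d=dt)
print("average PSD away from 0 Hz:", psd[1:].mean(), " compare with lam =", lam)
```

The impulse at $\omega = 0$ in the analytical result corresponds to the nonzero mean $\lambda$, which is why the mean is removed before computing the periodogram.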
Definition (Inhomogeneous Poisson process) A Poisson process with a non-constant rate $\lambda = \lambda(t)$ is called an inhomogeneous Poisson process. In this case we have:
1. non-overlapping increments are independent (the stationarity of the increments is lost, though)
2. $P(N_{t+\Delta t} - N_t = 1) = \lambda(t)\,\Delta t + o(\Delta t)$
3. $P(N_{t+\Delta t} - N_t \ge 2) = o(\Delta t)$
Theorem If $\{N_t,\; t > 0\}$ is a Poisson process with rate $\lambda(t)$, then $N_t$ is a Poisson random variable with parameter $\int_0^t \lambda(\tau)\, d\tau$, i.e.
$$P(N_t = k) = \frac{\big(\int_0^t \lambda(\tau)\, d\tau\big)^k}{k!}\; e^{-\int_0^t \lambda(\tau)\, d\tau} \qquad (7)$$
Proof The proof of this theorem is identical to that of the homogeneous case, except that $\lambda$ is replaced by $\lambda(t)$. In particular, one can easily get
$$G_t(z) = e^{(z-1)\int_0^t \lambda(\tau)\, d\tau}, \qquad (8)$$
from which (7) readily follows.
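As a concrete illustration of (7) (my own example, with an arbitrarily chosen rate function $\lambda(t) = 20 + 15\sin(2\pi t)$ spikes/sec), the parameter of the Poisson law is obtained by numerically integrating $\lambda$:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import poisson

lam = lambda tau: 20.0 + 15.0 * np.sin(2 * np.pi * tau)   # example rate, spikes/sec

t = 0.7
mu, _ = quad(lam, 0.0, t)                  # integral_0^t lambda(tau) dtau
for k in range(4):
    print(f"P(N_{t} = {k}) = {poisson.pmf(k, mu):.5f}")
```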
Theorem Let $\{N_t,\; t > 0\}$ be an inhomogeneous Poisson process with rate $\lambda(t)$ and let $t > s \ge 0$; then
$$P(N_t - N_s = k) = \frac{\big(\int_s^t \lambda(\tau)\, d\tau\big)^k}{k!}\; e^{-\int_s^t \lambda(\tau)\, d\tau} \qquad (9)$$
The usefulness of this theorem stems from the fact that we cannot use $P(N_t - N_s = k) = P(N_{t-s} = k)$, since the increments are no longer stationary.
Proof
$$G_t(z) = E[z^{N_t}] = E[z^{N_t - N_s + N_s}] = E[z^{N_t - N_s}]\; E[z^{N_s}] = E[z^{N_t - N_s}]\; G_s(z)$$
$$E[z^{N_t - N_s}] = \frac{G_t(z)}{G_s(z)} \overset{\text{by (8)}}{=} \frac{e^{(z-1)\int_0^t \lambda(\tau)\, d\tau}}{e^{(z-1)\int_0^s \lambda(\tau)\, d\tau}} = e^{(z-1)\int_s^t \lambda(\tau)\, d\tau}$$
Thus, $N_t - N_s$ is a Poisson random variable with parameter $\int_s^t \lambda(\tau)\, d\tau$, and (9) easily follows.
Theorem
1. $E[N_t] = \int_0^t \lambda(\tau)\, d\tau$
2. $Var[N_t] = \int_0^t \lambda(\tau)\, d\tau$

Proof Recall that
$$E[N_t] = \left[\frac{dG_t(z)}{dz}\right]_{z=1} \qquad \text{and} \qquad E[N_t^2] = \left[\frac{d^2 G_t(z)}{dz^2}\right]_{z=1} + E[N_t]$$
From (8) we have $G_t(z) = e^{(z-1)\int_0^t \lambda(\tau)\, d\tau}$, and the two results follow after immediate calculations.
Theorem (Conditioning on the number of arrivals) Given that in the interval $(0, T]$ the number of arrivals is $N_T = n$, the $n$ arrival times are independently distributed on $[0, T]$ with the pdf $\lambda(t) / \int_0^T \lambda(\tau)\, d\tau$.
Proof The proof of this theorem is analogous to that of the homogeneous case. The probability of a single event happening in any of the $M$ bins (Fig. 4) is given by $p_i = C\big(\lambda(i\,\Delta t)\,\Delta t + o(\Delta t)\big)$, where $i$ is the bin index. Given that exactly one event occurred in the interval $(0, T]$, we have
$$\sum_{i=1}^{M} p_i = 1 \quad \Longrightarrow \quad C = \frac{1}{\sum_{i=1}^{M} \lambda(i\,\Delta t)\,\Delta t + T\,\frac{o(\Delta t)}{\Delta t}}$$
$$f_{t_1}(t) = \lim_{\Delta t \to 0} \frac{p_i}{\Delta t} = \frac{\lambda(t)}{\int_0^T \lambda(\tau)\, d\tau}, \qquad \text{where } t = i\,\Delta t.$$
The argument for independence of two or more arrival times is identical to that of the homogeneous case.
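A sketch of the corresponding simulation recipe (my own code, with a hypothetical rate function): draw $N_T$ from a Poisson law with parameter $\int_0^T \lambda(\tau)\, d\tau$, then draw the arrival times from the density $\lambda(t)/\int_0^T \lambda(\tau)\, d\tau$ by inverse-transform sampling on a grid.

```python
import numpy as np

def inhom_poisson_by_conditioning(lam_fn, T, rng, n_grid=10_000):
    """Arrival times on (0, T] for rate lam_fn(t), via conditioning on N_T."""
    grid = np.linspace(0.0, T, n_grid)
    dt = grid[1] - grid[0]
    cdf = np.cumsum(lam_fn(grid)) * dt            # approx. integral_0^t lambda(tau) dtau
    n = rng.poisson(cdf[-1])                      # N_T ~ Poisson(integral_0^T lambda)
    u = rng.uniform(0.0, cdf[-1], size=n)
    times = np.interp(u, cdf, grid)               # invert the cdf: density lambda(t)/integral
    return np.sort(times)

rng = np.random.default_rng(4)
rate = lambda t: 20.0 + 15.0 * np.sin(2 * np.pi * t)      # same example rate as above
spikes = inhom_poisson_by_conditioning(rate, T=1.0, rng=rng)
print(len(spikes), "arrivals; about 20 expected on average")
```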
Question What is the probability of observing $n$ events at instances $\tau_1, \tau_2, \ldots, \tau_n$ on the interval $[0, T]$?
Since the arrival times $t_1, t_2, \ldots, t_n$ are continuous random variables, the answer is 0. However, we can calculate the associated pdf as
$$f_{t_1 t_2 \cdots t_n}(\tau_1, \tau_2, \ldots, \tau_n) = \lim_{dt \to 0} \frac{P(t_1 \in (\tau_1, \tau_1 + dt], \ldots, t_n \in (\tau_n, \tau_n + dt],\; N_T = n)}{dt^n}$$
where
$$P(t_1 \in (\tau_1, \tau_1 + dt], \ldots, t_n \in (\tau_n, \tau_n + dt],\; N_T = n) =$$
$$= P(t_1 \in (\tau_1, \tau_1 + dt], \ldots, t_n \in (\tau_n, \tau_n + dt] \mid N_T = n)\; P(N_T = n)$$
$$= \left[\prod_{i=1}^{n} \frac{\int_{\tau_i}^{\tau_i + dt} \lambda(\tau)\, d\tau}{\int_0^T \lambda(\tau)\, d\tau}\right] \frac{\big(\int_0^T \lambda(\tau)\, d\tau\big)^n}{n!}\; e^{-\int_0^T \lambda(\tau)\, d\tau} \;\overset{dt \to 0}{\approx}\; \left[\prod_{i=1}^{n} \lambda(\tau_i)\right] \frac{dt^n}{n!}\; e^{-\int_0^T \lambda(\tau)\, d\tau}$$
$$\Longrightarrow \quad f_{t_1 t_2 \cdots t_n}(\tau_1, \tau_2, \ldots, \tau_n) = \frac{\prod_{i=1}^{n} \lambda(\tau_i)}{n!}\; e^{-\int_0^T \lambda(\tau)\, d\tau}$$
[Figure 6: Realization of a point process using conditioning on the number of arrivals. (Top) Ten different sample paths of the same point process over 1 s, shown as raster plots. (Bottom) The histogram of inter-arrival times, showing the exponential trend. Sample mean: 0.032045 s; standard deviation: 0.032456 s; sample size: 29111.]
[Figure 7: Realization of a point process using the method of infinitesimal increments. (Top) Ten different sample paths of the same point process over 1 s, shown as raster plots. (Bottom) The histogram of inter-arrival times, showing the exponential trend. Sample mean: 0.03275 s; standard deviation: 0.032424 s; sample size: 28502.]
[Figure 8: Realization of a point process using the method of independent inter-arrival times. (Top) Ten different sample paths of the same point process over 1 s, shown as raster plots. (Bottom) The histogram of inter-arrival times, showing the exponential trend. Sample mean: 0.031979 s; standard deviation: 0.031894 s; sample size: 29214.]
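For completeness, here is a sketch (my own code, with arbitrary parameter values) of the other two simulation methods named in the figure captions: the method of infinitesimal increments (Fig. 7), which applies condition 2 of the definition directly with a small bin width, and the method of independent inter-arrival times (Fig. 8), which accumulates i.i.d. exponential gaps.

```python
import numpy as np

def by_infinitesimal_increments(lam, T, dt, rng):
    """Bernoulli(lam*dt) arrival in each bin of width dt (o(dt) terms ignored)."""
    n_bins = int(T / dt)
    hits = rng.random(n_bins) < lam * dt
    return (np.nonzero(hits)[0] + 1) * dt

def by_interarrival_times(lam, T, rng):
    """Cumulative sums of i.i.d. exponential(lam) gaps, truncated at T."""
    gaps = rng.exponential(1.0 / lam, size=int(3 * lam * T) + 20)   # generous over-draw
    times = np.cumsum(gaps)
    return times[times <= T]

rng = np.random.default_rng(5)
print(len(by_infinitesimal_increments(30.0, 1.0, 1e-4, rng)), "arrivals (increments)")
print(len(by_interarrival_times(30.0, 1.0, rng)), "arrivals (inter-arrival times)")
```

In all three approaches the histogram of inter-arrival times should show the exponential trend reported in Figs. 6-8, with sample mean approximately equal to the sample standard deviation.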