Poisson Processes
Fig. 15.1: the interval (0, T) subdivided into subintervals of duration ∆ and 2∆.
PILLAI
It follows that (refer to Fig. 15.1)
P{k arrivals occur in an interval of duration 2∆} = e^{−2λ} (2λ)^k / k!,  k = 0, 1, 2, …, (15-3)
since in that case
np₁ = μT · (2∆/T) = 2μ∆ = 2λ. (15-4)
From (15-1)-(15-4), Poisson arrivals over an interval form a Poisson
random variable whose parameter depends on the duration
of that interval. Moreover, because of the Bernoulli nature of the
underlying basic random arrivals, events over nonoverlapping
intervals are independent. We shall use these two key observations
to define a Poisson process formally. (Refer to Example 9-5, Text.)
Definition: X(t) = n(0, t) represents a Poisson process if
(i) the number of arrivals n(t1, t2) in an interval (t1, t2) of length
t = t2 – t1 is a Poisson random variable with parameter λt.
Thus
P{n(t₁, t₂) = k} = e^{−λt} (λt)^k / k!,  k = 0, 1, 2, …,  t = t₂ − t₁ (15-5)
and
(ii) If the intervals (t1, t2) and (t3, t4) are nonoverlapping, then the
random variables n(t1, t2) and n(t3, t4) are independent.
Since n(0, t) ~ P(λt), we have
E[X(t)] = E[n(0, t)] = λt (15-6)
and
E[X²(t)] = E[n²(0, t)] = λt + λ²t². (15-7)
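The two moments in (15-6)-(15-7) are easy to verify numerically. The following Python sketch (illustrative only; the sampler and the parameter values are our own choices, not from the text) draws Poisson counts n(0, t) and compares the sample moments with λt and λt + λ²t²:

```python
import math
import random

def poisson_sample(mu, rng):
    """Draw one Poisson(mu) variate by multiplying uniforms (Knuth's method)."""
    limit = math.exp(-mu)
    k, prod = 0, rng.random()
    while prod > limit:
        k += 1
        prod *= rng.random()
    return k

rng = random.Random(0)
lam, t, trials = 2.0, 3.0, 200_000
samples = [poisson_sample(lam * t, rng) for _ in range(trials)]

mean = sum(samples) / trials                    # compare with lam*t, Eq. (15-6)
second = sum(x * x for x in samples) / trials   # compare with lam*t + (lam*t)**2, Eq. (15-7)
print(round(mean, 2), round(second, 1))
```

With λ = 2 and t = 3, the sample mean settles near 6 and the sample second moment near 42, in agreement with (15-6)-(15-7).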
R_XX′(t₁, t₂) = ∂R_XX(t₁, t₂)/∂t₂ = { λ²t₁ + λ,  t₁ > t₂;  λ²t₁,  t₁ < t₂ }
= λ²t₁ + λ U(t₁ − t₂) (15-15)
and
R_X′X′(t₁, t₂) = ∂R_XX′(t₁, t₂)/∂t₁ = λ² + λ δ(t₁ − t₂). (15-16)
From (15-14) and (15-16) it follows that X ′(t ) is a wide sense
stationary process. Thus nonstationary inputs to linear systems can
lead to wide sense stationary outputs, an interesting observation.
• Sum of Poisson Processes:
If X1(t) and X2(t) represent two independent Poisson processes,
then their sum X1(t) + X2(t) is also a Poisson process with
parameter (λ₁ + λ₂)t. (This follows from (6-86), Text, and the definition
of the Poisson process in (i) and (ii).)
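As a quick sanity check (an illustrative sketch, not from the text), we can sample the two component counts independently and confirm that their sum has matching mean and variance (λ₁ + λ₂)t, as a Poisson random variable requires:

```python
import math
import random

rng = random.Random(1)

def poisson_sample(mu):
    """Inverse-transform sampling of a Poisson(mu) variate."""
    u, k, p = rng.random(), 0, math.exp(-mu)
    cdf = p
    while u > cdf:
        k += 1
        p *= mu / k
        cdf += p
    return k

lam1, lam2, t, trials = 1.5, 2.5, 2.0, 200_000
sums = [poisson_sample(lam1 * t) + poisson_sample(lam2 * t) for _ in range(trials)]

m = sum(sums) / trials
v = sum((x - m) ** 2 for x in sums) / trials
# for a Poisson variable mean == variance; both should be near (lam1 + lam2)*t = 8
print(round(m, 2), round(v, 2))
```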
• Random selection of Poisson Points:
Let t1 , t2 ,", ti , " represent random arrival points associated
with a Poisson process X(t) with parameter λt ,
and associated with
each arrival point, " "
t
define an independent 1 2
t t i
t
we claim that both Y(t) and Z(t) are independent Poisson processes
with parameters λpt and λqt respectively.
Proof:
P{Y(t) = k} = Σ_{n=k}^∞ P{Y(t) = k | X(t) = n} P{X(t) = n}. (15-19)
But given X(t) = n, we have Y(t) = Σ_{i=1}^n Nᵢ ~ B(n, p), so that
P{Y(t) = k | X(t) = n} = [n!/((n − k)! k!)] p^k q^{n−k},  0 ≤ k ≤ n, (15-20)
and
P{X(t) = n} = e^{−λt} (λt)^n / n!. (15-21)
Substituting (15-20)-(15-21) into (15-19) we get
P{Y(t) = k} = e^{−λt} Σ_{n=k}^∞ [n!/((n − k)! k!)] p^k q^{n−k} (λt)^n / n!
= e^{−λt} [(λpt)^k / k!] Σ_{n=k}^∞ (qλt)^{n−k} / (n − k)!
= e^{−λt} e^{qλt} (λpt)^k / k! = e^{−(1−q)λt} (λpt)^k / k!
= e^{−λpt} (λpt)^k / k!,  k = 0, 1, 2, …
~ P(λpt). (15-22)
More generally,
P{Y(t) = k, Z(t) = m} = P{Y(t) = k, X(t) − Y(t) = m}
= P{Y(t) = k, X(t) = k + m}
= P{Y(t) = k | X(t) = k + m} P{X(t) = k + m}
= [(k + m)!/(k! m!)] p^k q^m · e^{−λt} (λt)^{k+m} / (k + m)!
= {e^{−λpt} (λpt)^k / k!} {e^{−λqt} (λqt)^m / m!}
= P{Y(t) = k} P{Z(t) = m},
so that Y(t) and Z(t) are independent random variables.
Notice that Y(t) and Z(t) are generated as a result of random Bernoulli
selections from the original Poisson process X(t) (Fig. 15.5),
where each arrival gets tossed over to either Y(t) with
probability p or to Z(t) with probability q. Each such
sub-arrival stream is also a Poisson process. Thus
random selection of Poisson points preserves the Poisson
nature of the resulting processes. However, as we
shall see, deterministic selection from a Poisson
process destroys the Poisson property for the resulting processes.

Fig. 15.5: random selection of the arrivals of X(t) ~ P(λt) into the streams Y(t) ~ P(λpt) (with probability p) and Z(t) ~ P(λqt) (with probability q).
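The thinning claim (15-22) can be checked by simulation. The sketch below (an illustrative example with parameters of our own choosing) generates arrivals of X(t), tosses each one to Y or Z, and verifies the means λpt and λqt together with the near-zero covariance expected from independence:

```python
import random

rng = random.Random(2)
lam, p, t, trials = 4.0, 0.3, 1.0, 100_000

y_counts, z_counts = [], []
for _ in range(trials):
    # arrivals of X(t) in (0, t): count exponential(lam) gaps until they pass t
    n, s = 0, rng.expovariate(lam)
    while s < t:
        n += 1
        s += rng.expovariate(lam)
    # Bernoulli selection: each arrival goes to Y with prob p, to Z with prob q = 1-p
    y = sum(1 for _ in range(n) if rng.random() < p)
    y_counts.append(y)
    z_counts.append(n - y)

my = sum(y_counts) / trials
mz = sum(z_counts) / trials
cov = sum((a - my) * (b - mz) for a, b in zip(y_counts, z_counts)) / trials
# expect my ~ lam*p*t = 1.2, mz ~ lam*q*t = 2.8, cov ~ 0 (independence)
print(round(my, 2), round(mz, 2), round(cov, 3))
```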
Inter-arrival Distribution for Poisson Processes
Let τ₁ denote the time interval (delay) to the first arrival from any
fixed point, and let tₙ denote the instant of the nth arrival, with
distribution function F_{tₙ}(x) (Fig. 15.6 marks the first, second, …,
nth arrival instants); hence
f_{tₙ}(x) = dF_{tₙ}(x)/dx = −Σ_{k=1}^{n−1} [λ(λx)^{k−1}/(k − 1)!] e^{−λx} + Σ_{k=0}^{n−1} [λ(λx)^k/k!] e^{−λx}
= λⁿ x^{n−1} e^{−λx} / (n − 1)!,  x ≥ 0, (15-27)
which represents a gamma density function; i.e., the waiting time to
the nth Poisson arrival instant has a gamma distribution.
Moreover
tₙ = Σ_{i=1}^n τᵢ
where τᵢ is the random inter-arrival duration between the (i − 1)th
and ith events. Notice that the τᵢ are independent, identically distributed
random variables. Hence, using their characteristic functions,
it follows that all inter-arrival durations of a Poisson process are
independent exponential random variables with common parameter λ;
i.e.,
f_{τᵢ}(t) = λ e^{−λt},  t ≥ 0. (15-28)
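One way to see (15-28) without assuming it: approximate the Poisson stream by Bernoulli trials on a fine grid (success probability λ∆ per slot, as in the limiting construction behind (15-1)-(15-4)) and check that the gaps between successes behave like exponentials with parameter λ. A hypothetical sketch:

```python
import random

rng = random.Random(3)
lam, dt, horizon = 2.0, 1e-3, 2_000.0

# Bernoulli approximation of the Poisson stream: success with prob lam*dt per slot
gaps, last = [], 0.0
for i in range(1, int(horizon / dt) + 1):
    if rng.random() < lam * dt:
        gaps.append(i * dt - last)
        last = i * dt

mean_gap = sum(gaps) / len(gaps)                       # near 1/lam = 0.5
frac_below = sum(g <= 0.5 for g in gaps) / len(gaps)   # near 1 - exp(-1) = 0.632
print(round(mean_gap, 3), round(frac_below, 3))
```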
Axiomatic Development of Poisson Processes:
The defining properties of a Poisson process are that in any “small”
interval ∆t one event can occur with probability proportional
to ∆t; the probability that two or more events occur in that
interval is o(∆t) (i.e., involves higher powers of ∆t); and events
over nonoverlapping intervals are independent of each other. This
gives rise to the following axioms.
Axioms:
(i) P{n(t, t + ∆t) = 1} = λ∆t + o(∆t)
(ii) P{n(t, t + ∆t) = 0} = 1 − λ∆t + o(∆t) (15-29)
(iii) P{n(t, t + ∆t) ≥ 2} = o(∆t)
and
(iv) n(t, t + ∆t) is independent of n(0, t).
Notice that axiom (iii) specifies that the events occur singly, and axiom
(iv) specifies the randomness of the entire series. Axiom (ii) follows
from (i) and (iii) together with the axiom of total probability.
We shall use these axioms to rederive (15-25) first.
Let t0 be any fixed point (see Fig. 15.6) and let t0 + τ 1 represent the
time of the first arrival after t0 . Notice that the random variable τ 1
is independent of the occurrences prior to the instant t0 (Axiom (iv)).
With F_{τ₁}(t) = P{τ₁ ≤ t} representing the distribution function of τ₁,
as in (15-24) define Q(t) ≜ 1 − F_{τ₁}(t) = P{τ₁ > t}. Then for ∆t > 0,
Q (t + ∆t ) = P{τ 1 > t + ∆t}
= P{τ 1 > t , and no event occurs in (t0 + t , t0 + t + ∆t )}
= P{τ 1 > t , n(t0 + t , t0 + t + ∆t ) = 0}
= P{n(t0 + t , t0 + t + ∆t ) = 0 | τ 1 > t}P{τ 1 > t}.
From axiom (iv), the conditional probability in the above expression
is not affected by the event {τ 1 > t} which refers to {n(t0, t0 + t) = 0},
i.e., to events before t0 + t, and hence the unconditional probability
in axiom (ii) can be used there. Thus
Q(t + ∆t) = [1 − λ∆t + o(∆t)] Q(t)
or
lim_{∆t→0} [Q(t + ∆t) − Q(t)]/∆t = Q′(t) = −λQ(t)  ⇒  Q(t) = c e^{−λt}.
But c = Q(0) = P{τ₁ > 0} = 1, so that
Q(t) = 1 − F_{τ₁}(t) = e^{−λt}
or
F_{τ₁}(t) = 1 − e^{−λt},  t ≥ 0,
which gives
f_{τ₁}(t) = dF_{τ₁}(t)/dt = λ e^{−λt},  t ≥ 0 (15-30)
to be the p.d.f. of τ₁ as in (15-25).
or with
lim_{∆t→0} [p_k(t + ∆t) − p_k(t)]/∆t = p′_k(t)
we get the differential equation
p′_k(t) = −λ p_k(t) + λ p_{k−1}(t),  k = 0, 1, 2, …,
whose solution gives (15-5). Here p_{−1}(t) ≡ 0. [The solution to the
above differential equation is worked out in (16-36)-(16-41), Text.]
This completes the axiomatic development for Poisson processes.
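The forward equations above can also be integrated numerically and compared against the pmf in (15-5). The sketch below (an illustrative Euler integration, not from the text) does this for the first few values of k:

```python
import math

lam, T, dt, K = 1.5, 2.0, 1e-4, 8
p = [1.0] + [0.0] * K               # initial condition: p_k(0) = 1 for k = 0, else 0
for _ in range(int(round(T / dt))):
    # Euler step of p_k'(t) = -lam*p_k(t) + lam*p_{k-1}(t), with p_{-1}(t) = 0
    p = [p[k] + dt * (-lam * p[k] + lam * (p[k - 1] if k > 0 else 0.0))
         for k in range(K + 1)]

# compare with the Poisson pmf e^{-lam*T} (lam*T)^k / k! of (15-5)
exact = [math.exp(-lam * T) * (lam * T) ** k / math.factorial(k) for k in range(K + 1)]
err = max(abs(a - b) for a, b in zip(p, exact))
print(round(err, 5))   # Euler error is O(dt), so the two agree to several decimals
```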
Poisson Departures between Exponential Inter-arrivals
Let X(t) ~ P(λt) and Y(t) ~ P(μt) represent two independent
Poisson processes, called the arrival and departure processes respectively.

Fig. 15.7: the arrivals of X(t) with inter-arrival intervals τ₁, τ₂, …, τₖ, and the departure instants t₁, t₂, …, tᵢ, tᵢ₊₁ of Y(t).
Let Z represent the random interval between any two successive
arrivals of X(t). From (15-28), Z has an exponential distribution with
parameter λ . Let N represent the number of “departures” of Y(t)
between any two successive arrivals of X(t). Then from the Poisson
nature of the departures we have
P{N = k | Z = t} = e^{−μt} (μt)^k / k!.
Thus
P{N = k} = ∫₀^∞ P{N = k | Z = t} f_Z(t) dt
= ∫₀^∞ e^{−μt} [(μt)^k / k!] λ e^{−λt} dt
= (λ/k!) ∫₀^∞ (μt)^k e^{−(λ+μ)t} dt
= [λ/(λ + μ)] [μ/(λ + μ)]^k (1/k!) ∫₀^∞ x^k e^{−x} dx
= [λ/(λ + μ)] [μ/(λ + μ)]^k,  k = 0, 1, 2, …, (15-31)
i.e., the random variable N has a geometric distribution. Thus if
customers come in and get out according to two independent
Poisson processes at a counter, then the number of arrivals between
any two departures has a geometric distribution. Similarly the
number of departures between any two arrivals also has a
geometric distribution.
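The geometric law (15-31) is easy to confirm by simulation (an illustrative sketch with arbitrarily chosen rates): draw an exponential gap Z between arrivals, count the departures that fall inside it, and compare with λ/(λ + μ) and the geometric mean μ/λ:

```python
import random

rng = random.Random(4)
lam, mu, trials = 1.0, 2.0, 100_000

counts = []
for _ in range(trials):
    z = rng.expovariate(lam)         # gap Z between two successive X(t) arrivals
    n, s = 0, rng.expovariate(mu)    # count the Y(t) departures inside the gap
    while s < z:
        n += 1
        s += rng.expovariate(mu)
    counts.append(n)

p0 = counts.count(0) / trials        # near P{N = 0} = lam/(lam + mu) = 1/3
mean_n = sum(counts) / trials        # geometric mean: mu/lam = 2
print(round(p0, 3), round(mean_n, 2))
```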
Stopping Times, Coupon Collecting, and Birthday Problems
Suppose a cereal manufacturer inserts a sample of one type
of coupon randomly into each cereal box, and suppose there are n such
distinct types of coupons. One interesting question is how many
boxes of cereal one should buy on the average in order to collect
at least one coupon of each kind.
Here 1/λ represents the average inter-arrival duration between any two
arrivals of Xᵢ(t), i = 1, 2, …, n, whereas 1/μ represents the average
inter-arrival time for the combined sum process X(t) in (15-32).
(Fig. 15.8 marks the first arrivals t₁₁, t₂₁, …, tₙ₁ of the component
processes along the combined arrival stream.)
Define the stopping time T to be that random time instant by which
at least one arrival of X 1 (t ), X 2 (t ),", X n (t ) has occurred.
Clearly, we have
T = max (t11 , t 21 , " , ti1 ," , t n1 ). (15-34)
But from (15-25), ti1 , i = 1, 2, " , n are independent exponential
random variables with common parameter λ . This gives
F_T(t) = P{T ≤ t} = P{max(t₁₁, t₂₁, …, tₙ₁) ≤ t}
= P{t₁₁ ≤ t, t₂₁ ≤ t, …, tₙ₁ ≤ t}
= P{t₁₁ ≤ t} P{t₂₁ ≤ t} ⋯ P{tₙ₁ ≤ t} = [F_{tᵢ}(t)]ⁿ.
Thus
FT (t ) = (1 − e − λ t ) n , t ≥0 (15-35)
represents the probability distribution function of the stopping time
random variable T in (15-34). To compute its mean, we can make
use of Eqs. (5-52)-(5-53), Text, which are valid for nonnegative random
variables. From (15-35) we get
P(T > t ) = 1 − FT (t ) = 1 − (1 − e − λ t ) n , t ≥0
so that
∞ ∞
E{T } = ∫ 0 P(T > t )dt = ∫ 0 {1 − (1 − e − λ t ) n }dt. (15-36)
Let 1 − e^{−λt} = x, so that λe^{−λt} dt = dx, or dt = dx/[λ(1 − x)],
and
E{T} = (1/λ) ∫₀¹ (1 − xⁿ)/(1 − x) dx
= (1/λ) ∫₀¹ (1 + x + x² + ⋯ + x^{n−1}) dx = (1/λ) Σ_{k=1}^n 1/k.
E{T} = (1/λ)(1 + 1/2 + 1/3 + ⋯ + 1/n) = (n/μ)(1 + 1/2 + 1/3 + ⋯ + 1/n)
≈ (1/μ) n(ln n + γ) (15-37)
where γ ≈ 0.5772157… is Euler's constant¹.
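With μ normalized to 1, (15-37) approximates the exact mean n·Hₙ (Hₙ the nth harmonic number). For the cereal example with n = 365 coupon types the two differ by less than one box, as a quick check shows (an illustrative computation):

```python
import math

n = 365                                   # number of coupon types
gamma = 0.5772156649                      # Euler's constant
harmonic = sum(1.0 / k for k in range(1, n + 1))
exact = n * harmonic                      # exact mean n * H_n (units of 1/mu)
approx = n * (math.log(n) + gamma)        # the asymptotic n(ln n + gamma) of (15-37)
print(round(exact, 2), round(approx, 2))
```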
Let the random variable N denote the total number of all arrivals
up to the stopping time T, and Yk , k = 1, 2, ", N the inter-arrival
random variables associated with the sum process X(t) (see Fig 15.8).
¹Euler's constant: with uₙ ≜ 1/n − ln((n + 1)/n) we have
0 < uₙ = ∫₀^{1/n} x/(1 + x) dx ≤ ∫₀^{1/n} x dx ≤ 1/n²,
so that Σ_{n=1}^∞ uₙ < Σ_{n=1}^∞ 1/n² = π²/6 < ∞. Thus the series Σ uₙ converges
to some number γ > 0. From the definition of uₙ we also obtain
Σ_{k=1}^n u_k = Σ_{k=1}^n 1/k − ln(n + 1), so that
lim_{n→∞} {1 + 1/2 + 1/3 + ⋯ + 1/n − ln n} = lim_{n→∞} {Σ_{k=1}^n u_k + ln((n + 1)/n)}
= Σ_{k=1}^∞ u_k = γ = 0.5772157….
Then we obtain the key relation
T = Σ_{i=1}^N Yᵢ (15-38)
so that
E{T | N = n} = E{Σ_{i=1}^n Yᵢ | N = n} = E{Σ_{i=1}^n Yᵢ} = n E{Yᵢ} (15-39)
= ∫₀^∞ {1 − (1 − e^{−t/2n})ⁿ} 2 dt.
Poisson Quotas
Let
X(t) = Σ_{i=1}^n Xᵢ(t) ~ P(μt) (15-50)
where the Xᵢ(t) are independent Poisson processes with parameters λᵢt,
so that μ = λ₁ + λ₂ + ⋯ + λₙ.
Suppose integers m1 , m2 ," , mn represent the preassigned number of
arrivals (quotas) required for processes X 1 (t ), X 2 (t )," , X n (t ) in
the sense that when mi arrivals of the process Xi(t) have occurred,
the process Xi(t) satisfies its “quota” requirement.
The stopping time T in this case is that random time
instant at which any r processes have met their quota requirement
where r ≤ n is given. The problem is to determine the probability
density function of the stopping time random variable T, and
determine the mean and variance of the total number of random
arrivals N up to the stopping time T.
Solution: As before let tᵢ₁, tᵢ₂, … represent the first, second, …
arrivals of the ith process Xᵢ(t), and define
Yᵢⱼ = tᵢⱼ − tᵢ,ⱼ₋₁. (15-51)
Notice that the inter-arrival times Yᵢⱼ are independent, exponential
random variables with parameter λᵢ, and hence
tᵢ,ₘᵢ = Σ_{j=1}^{mᵢ} Yᵢⱼ ~ Gamma(mᵢ, λᵢ).
Define Tᵢ to be the stopping time for the ith process; i.e., the occurrence
of the mᵢth arrival equals Tᵢ. Thus
Tᵢ = tᵢ,ₘᵢ ~ Gamma(mᵢ, λᵢ),  i = 1, 2, …, n (15-52)
or
f_{Tᵢ}(t) = λᵢ^{mᵢ} t^{mᵢ−1} e^{−λᵢt} / (mᵢ − 1)!,  t ≥ 0. (15-53)
Since the n processes in (15-49) are independent, the associated
stopping times Tᵢ, i = 1, 2, …, n defined in (15-53) are also
independent random variables, a key observation.
Given the independent gamma random variables T₁, T₂, …, Tₙ
in (15-52)-(15-53), we form their order statistics; this gives
T₍₁₎ < T₍₂₎ < ⋯ < T₍ᵣ₎ < ⋯ < T₍ₙ₎. (15-54)
Note that the two extremes in (15-54) represent
T₍₁₎ = min(T₁, T₂, …, Tₙ) (15-55)
and
T₍ₙ₎ = max(T₁, T₂, …, Tₙ). (15-56)
The desired stopping time T when r processes have satisfied their
quota requirement is given by the rth order statistics T(r). Thus
T = T(r ) (15-57)
where T(r) is as in (15-54). We can use (7-14), Text to compute the
probability density function of T. From there, the probability density
function of the stopping time random variable T in (15-57) is given by
f_T(t) = [n! / ((r − 1)!(n − r)!)] F_{Tᵢ}^{r−1}(t) [1 − F_{Tᵢ}(t)]^{n−r} f_{Tᵢ}(t) (15-58)
where F_{Tᵢ}(t) is the distribution function of the i.i.d. random variables Tᵢ
and f_{Tᵢ}(t) is their density function given in (15-53). Integrating (15-53) by
parts as in (4-37)-(4-38), Text, we obtain
F_{Tᵢ}(t) = 1 − Σ_{k=0}^{mᵢ−1} [(λᵢt)^k / k!] e^{−λᵢt},  t ≥ 0. (15-59)
Together with (15-53) and (15-59), Eq. (15-58) completely specifies
the density function of the stopping time random variable T,
where r types of arrival quota requirements have been satisfied.
If N represents the total number of all random arrivals up to T,
then arguing as in (15-38)-(15-40) we get
T = Σ_{i=1}^N Yᵢ (15-60)
where the Yᵢ are the inter-arrival intervals for the process X(t), and hence
E{T} = E{N} E{Yᵢ} = (1/μ) E{N}. (15-61)
With the normalized mean value μ = 1 for the sum process X(t), we get
E{N} = E{T}. (15-62)
To relate the higher order moments of N and T we can use their
characteristic functions. From (15-60),
E{e^{jωT}} = E{e^{jω Σ_{i=1}^N Yᵢ}} = E{E[e^{jω Σ_{i=1}^n Yᵢ} | N = n]} = E{[E{e^{jωYᵢ}}]^N}
so that, with E{e^{jωYᵢ}} = 1/(1 − jω) for the unit-parameter exponential Yᵢ,
E{e^{jωT}} = Σ_{n=0}^∞ [E{e^{jωYᵢ}}]ⁿ P(N = n) = E{(1/(1 − jω))^N} = E{(1 − jω)^{−N}}
or, on expanding both sides in powers of jω,
E{T^k} = E{N(N + 1) ⋯ (N + k − 1)}, (15-64)
a key identity. From (15-62) and (15-64), we get
var{N} = var{T} − E{T}. (15-65)
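The identity (15-65) can be tested by simulation. The sketch below takes the special case where N is drawn independently of the Yᵢ (which is all the conditioning step above needs), with N ~ Poisson(3) chosen purely for illustration and Yᵢ ~ Exp(1):

```python
import math
import random

rng = random.Random(5)
trials = 200_000
Ns, Ts = [], []
for _ in range(trials):
    # draw N ~ Poisson(3) by Knuth's method, then T = sum of N unit-mean exponentials
    limit, n, prod = math.exp(-3.0), 0, rng.random()
    while prod > limit:
        n += 1
        prod *= rng.random()
    Ns.append(n)
    Ts.append(sum(rng.expovariate(1.0) for _ in range(n)))

mN = sum(Ns) / trials
mT = sum(Ts) / trials
vN = sum((x - mN) ** 2 for x in Ns) / trials
vT = sum((x - mT) ** 2 for x in Ts) / trials
# (15-65): var{N} = var{T} - E{T}; here both sides should be near 3
print(round(vN, 2), round(vT - mT, 2))
```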
As an application of the Poisson quota problem, we can reexamine
the birthday pairing problem discussed in Example 2-20, Text.
“Birthday Pairing” as Poisson Processes:
In the birthday pairing case (refer to Example 2-20, Text), we
may assume that the n = 365 possible birthdays in a year correspond to
n independent identically distributed Poisson processes, each with
parameter 1/n, and in that context each individual is an “arrival”
corresponding to his/her particular “birthday process”. It follows that
the birthday pairing problem (i.e., two people have the same birth date)
corresponds to the first occurrence of the 2nd return for any one of
the 365 processes. Hence
m₁ = m₂ = ⋯ = mₙ = 2,  r = 1, (15-66)
so that from (15-52), for each process,
Tᵢ ~ Gamma(2, 1/n), (15-67)
so that
(1 + t/n)ⁿ e^{−t} = e^{n ln(1 + t/n)} e^{−t} ≈ e^{−t²/2n} (1 + t³/3n²) (15-71)
and substituting (15-71) into (15-70) we get the mean number of
people in a crowd for a two-person birthday coincidence to be
E{N} ≈ ∫₀^∞ e^{−t²/2n} dt + (1/3n²) ∫₀^∞ t³ e^{−t²/2n} dt
= (1/2)√(2πn) + (2n²/3n²) ∫₀^∞ x e^{−x} dx = √(πn/2) + 2/3
= 24.612. (15-72)
Similarly
E{T²} = ∫₀^∞ 2t (1 + t/n)ⁿ e^{−t} dt ≈ 2 ∫₀^∞ t e^{−t²/2n} dt + (2/3n²) ∫₀^∞ t⁴ e^{−t²/2n} dt
= 2n ∫₀^∞ e^{−x} dx + (2/3n²)(3√π/8)(2n)^{5/2} = 2n + √(2πn), (15-73)
which gives (use (15-65))
σ_T ≈ 13.12,  σ_N ≈ 12.146. (15-74)
The high values of the standard deviations indicate that in reality the
crowd size could vary considerably around the mean value.
Unlike Example 2-20 in Text, the method developed here
can be used to derive the distribution and average value for
a variety of “birthday coincidence” problems.
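The value 24.612 in (15-72) also agrees with the classical direct simulation: add uniformly random birthdays one at a time until two coincide and average the stopping count. An illustrative sketch:

```python
import random

rng = random.Random(6)
n, trials = 365, 50_000

sizes = []
for _ in range(trials):
    seen, count = set(), 0
    while True:
        count += 1
        day = rng.randrange(n)     # a uniformly random birthday
        if day in seen:            # first two-person coincidence: stop
            break
        seen.add(day)
    sizes.append(count)

mean_size = sum(sizes) / trials    # compare with 24.612 from (15-72)
print(round(mean_size, 2))
```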
Three-person birthday coincidence:
For example, if we are interested in the average crowd size where
three people have the same birthday, then arguing as above, we
obtain
m1 = m2 = " = mn = 3, r = 1, (15-75)
so that
Ti ~ Gamma (3, 1 / n ) (15-76)
and T is as in (15-68), which gives
F_T(t) = 1 − [1 − F_{Tᵢ}(t)]ⁿ = 1 − (1 + t/n + t²/2n²)ⁿ e^{−t},  t ≥ 0 (15-77)
(use (15-59) with mᵢ = 3, λᵢ = 1/n) to be the distribution of the
stopping time in this case. As before, the average crowd size for a three-
person birthday coincidence equals
E{N} = E{T} = ∫₀^∞ P(T > t) dt = ∫₀^∞ (1 + t/n + t²/2n²)ⁿ e^{−t} dt.
By Taylor series expansion,
ln(1 + t/n + t²/2n²) = (t/n + t²/2n²) − (1/2)(t/n + t²/2n²)² + (1/3)(t/n + t²/2n²)³ − ⋯
≈ t/n − t³/6n³,
so that
E{N} = ∫₀^∞ e^{−t³/6n²} dt = (6^{1/3} n^{2/3} / 3) ∫₀^∞ x^{(1/3)−1} e^{−x} dx
= 6^{1/3} Γ(4/3) n^{2/3} ≈ 82.85. (15-78)
Thus for a three-person birthday coincidence the average crowd size
should be around 82 (which corresponds to 0.44 probability).
Notice that other generalizations, such as “two distinct birthdays
each having a pair of coincidences” (mᵢ = 2, r = 2), can be easily
worked out in the same manner.
We conclude this discussion with the other extreme case,
where the crowd size needs to be determined so that “all days
in the year are birthdays” among the persons in a crowd.
All days are birthdays:
Once again from the above analysis in this case we have
m1 = m2 = " = mn = 1, r = n = 365 (15-79)
so that the stopping time statistics T satisfies
T = max (T1 , T2 , " , Tn ), (15-80)
where the Tᵢ are independent exponential random variables with common
parameter λ = 1/n. This situation is similar to the coupon collecting
problem discussed in (15-32)-(15-34) and from (15-35), the
distribution function of T in (15-80) is given by
FT (t ) = (1 − e − t / n ) n , t≥0 (15-81)
and the mean value for T and N are given by (see (15-37)-(15-41))
E{N} = E{T} ≈ n(ln n + γ) = 2,364.14. (15-82)
Thus for every day to be a birthday for someone in a crowd,
the average crowd size should be 2,364, in which case there is 0.57
probability that the event actually happens.
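Both numbers can be checked directly from (15-81) and (15-82): evaluating F_T at t = E{T} gives (1 − e^{−(ln n + γ)})ⁿ ≈ e^{−e^{−γ}} ≈ 0.57. A short computation (illustrative only):

```python
import math

n = 365
gamma = 0.5772156649
mean_T = n * (math.log(n) + gamma)                  # (15-82): about 2364.14
# F_T(t) from (15-81) evaluated at t = E{T}: (1 - e^{-(ln n + gamma)})^n
prob_at_mean = (1.0 - math.exp(-mean_T / n)) ** n
print(round(mean_T, 2), round(prob_at_mean, 2))
```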
Bulk Arrivals and Compound Poisson Processes