
HT 2004

Probability Generating Functions

It is important to realise that you cannot have intuition about p.g.f.s, because they do not correspond to anything which is directly observable. A p.g.f. is nothing more than a mathematician's trick. You should think of it in terms of the definition: the p.g.f. of a discrete random variable $X$ is defined by $g_X(s) = E(s^X)$.
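To make the definition concrete, here is a minimal sketch (not part of the original notes) that builds $E(s^X)$ term by term for a two-point distribution; the use of sympy and the parameter name p are illustrative choices.

```python
import sympy as sp

# A minimal sketch: build g_X(s) = E(s^X) straight from the definition
# for a variable taking the value 1 with probability p and 0 otherwise.
s, p = sp.symbols('s p')
pmf = {0: 1 - p, 1: p}                          # P(X = x) for each x
g = sum(s**x * prob for x, prob in pmf.items())
print(g)                                        # the p.g.f. q + p*s (with q = 1 - p)
```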

Why bother with p.g.f.s?

- They make calculations of expectations and of some probabilities very easy.
- The distribution of a random variable is easy to obtain from its p.g.f.
- They are easy to calculate and can almost always be found by using one of 3 standard tricks.
- They make sums of independent random variables easy to handle.

Using p.g.f.s

1. Calculating $E(X)$

$$g_X(s) = E(s^X), \qquad g_X'(s) = E\left(X s^{X-1}\right), \qquad \text{so } g_X'(1) = E(X).$$

2. Calculating $V(X)$

$$g_X''(s) = E\left(X(X-1) s^{X-2}\right), \qquad \text{so } g_X''(1) = E(X^2) - E(X) \quad\text{and}\quad g_X''(1) + g_X'(1) = E(X^2).$$

Thus

$$V(X) = E(X^2) - E(X)^2 = g_X''(1) + g_X'(1) - g_X'(1)^2.$$

3. Calculating $P(X = 0)$

Since

$$g_X(s) = E(s^X) = \sum_{x=0}^{\infty} s^x P(X = x) = P(X = 0) + sP(X = 1) + \cdots,$$

$$g_X(0) = P(X = 0).$$

Obviously other probabilities may be calculated by repeated differentiation and choosing $s = 0$, e.g. $g_X'(0) = P(X = 1)$.
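These recipes can be checked mechanically. The sketch below is illustrative only; it assumes sympy and borrows the p.g.f. $1/(2-s)$ from a later example.

```python
import sympy as sp

s = sp.symbols('s')
g = 1 / (2 - s)                                  # an example p.g.f. (used again below)

mean = g.diff(s).subs(s, 1)                      # g'(1) = E(X)
var = g.diff(s, 2).subs(s, 1) + mean - mean**2   # g''(1) + g'(1) - g'(1)^2 = V(X)
p0 = g.subs(s, 0)                                # g(0)  = P(X = 0)
p1 = g.diff(s).subs(s, 0)                        # g'(0) = P(X = 1)
p2 = g.diff(s, 2).subs(s, 0) / sp.factorial(2)   # g''(0)/2! = P(X = 2)
print(mean, var, p0, p1, p2)                     # 1 2 1/2 1/4 1/8
```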

4. Calculating $P(X \text{ is odd})$ and $P(X \text{ is even})$

$$g_X(1) = 1 = \sum_{x=0}^{\infty} P(X = x), \qquad g_X(-1) = \sum_{x=0}^{\infty} (-1)^x P(X = x),$$

so

$$\frac{1 + g_X(-1)}{2} = \sum_{m=0}^{\infty} P(X = 2m), \qquad \frac{1 - g_X(-1)}{2} = \sum_{m=0}^{\infty} P(X = 2m+1).$$
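As a quick numerical sanity check of these identities (illustrative, assuming sympy; the p.g.f. $1/(2-s)$ is the one used in the next example):

```python
import sympy as sp

s, m = sp.symbols('s m')
g = 1 / (2 - s)

p_even = (1 + g.subs(s, -1)) / 2                 # (1 + g(-1))/2
p_odd = (1 - g.subs(s, -1)) / 2                  # (1 - g(-1))/2

# Direct check against the p.m.f. p(x) = (1/2)^(x+1) of this p.g.f.
direct_even = sp.summation(sp.Rational(1, 2)**(2*m + 1), (m, 0, sp.oo))
print(sp.simplify(p_even), direct_even, sp.simplify(p_odd))   # 2/3 2/3 1/3
```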

Probabilities from p.g.f.s

It is straightforward to obtain any or all probabilities from a p.g.f.

Example

What is the probability mass function whose p.g.f. is $g(s) = \dfrac{1}{2-s}$?

$$g_X(s) = \frac{1}{2-s} = \frac{1}{2}\left(1 - \frac{s}{2}\right)^{-1} = \frac{1}{2}\left(1 + \frac{s}{2} + \left(\frac{s}{2}\right)^2 + \cdots\right) = \sum_{k=0}^{\infty} \left(\frac{1}{2}\right)^{k+1} s^k.$$

Thus the p.m.f. is

$$p(x) = \left(\frac{1}{2}\right)^{x+1}, \qquad x = 0, 1, 2, \ldots$$

Example

If the p.g.f. of $X$ is

$$g_X(s) = \frac{2+s}{(2-s^2)(4-s)},$$

what is $P(X = 3)$?

$$g_X(s) = \frac{1}{8}(2+s)\left(1 - \frac{s^2}{2}\right)^{-1}\left(1 - \frac{s}{4}\right)^{-1} = \left(\frac{1}{4} + \frac{s}{8}\right)\left(1 + \frac{s^2}{2} + \frac{s^4}{4} + \cdots\right)\left(1 + \frac{s}{4} + \frac{s^2}{16} + \frac{s^3}{64} + \cdots\right)$$

$$P(X = 3) = \text{coefficient of } s^3 = \frac{1}{8}\cdot\frac{1}{2} + \frac{1}{4}\cdot\frac{1}{2}\cdot\frac{1}{4} + \frac{1}{4}\cdot\frac{1}{64} + \frac{1}{8}\cdot\frac{1}{16} = \frac{27}{256}.$$
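Both expansions can be verified by Taylor-expanding the p.g.f.s about $s = 0$; the sketch below is an illustrative check using sympy, not part of the original notes.

```python
import sympy as sp

s = sp.symbols('s')

g1 = 1 / (2 - s)
g2 = (2 + s) / ((2 - s**2) * (4 - s))

# The coefficient of s^k in the Taylor expansion about 0 is P(X = k).
poly1 = g1.series(s, 0, 5).removeO()
poly2 = g2.series(s, 0, 4).removeO()
print([poly1.coeff(s, k) for k in range(5)])   # [1/2, 1/4, 1/8, 1/16, 1/32]
print(poly2.coeff(s, 3))                       # 27/256
```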

Calculating p.g.f.s

Trick 1

Use the fact that a probability mass function sums to 1 to motivate a re-arrangement.

Example

(deliberately chosen to be a hard one)

$$p(x) = \binom{r+x-1}{x} p^r q^x = \binom{r+x-1}{x} (1-q)^r q^x, \qquad p + q = 1, \quad x = 0, 1, \ldots$$

$$g(s) = \sum_{x=0}^{\infty} \binom{r+x-1}{x} p^r q^x s^x = (1-q)^r \sum_{x=0}^{\infty} \binom{r+x-1}{x} (qs)^x = \frac{(1-q)^r}{(1-qs)^r} \sum_{x=0}^{\infty} \binom{r+x-1}{x} (1-qs)^r (qs)^x = \left(\frac{1-q}{1-qs}\right)^r.$$
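A numerical check of this rearrangement, with arbitrary illustrative values $r = 3$, $q = 2/5$ (sympy assumed):

```python
import sympy as sp

s = sp.symbols('s')
r, q = 3, sp.Rational(2, 5)                    # arbitrary illustrative values

# Truncated version of the defining sum versus the closed form.
lhs = sum(sp.binomial(r + x - 1, x) * (1 - q)**r * q**x * s**x
          for x in range(20))
rhs = ((1 - q) / (1 - q*s))**r

rhs_poly = rhs.series(s, 0, 6).removeO()
print([lhs.coeff(s, k) - rhs_poly.coeff(s, k) for k in range(6)])   # [0, 0, 0, 0, 0, 0]
```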

Example

(deliberately chosen to be a very hard one you have never seen)

$$p(x) = \frac{\lambda^x e^{-\lambda}}{x!\,(1 - e^{-\lambda})}, \qquad x = 1, 2, \ldots$$

$$g(s) = \sum_{x=1}^{\infty} s^x\, \frac{\lambda^x e^{-\lambda}}{x!\,(1 - e^{-\lambda})} = \frac{e^{-\lambda}}{1 - e^{-\lambda}} \sum_{x=1}^{\infty} \frac{(\lambda s)^x}{x!} = \frac{e^{-\lambda}(1 - e^{-\lambda s})\,e^{\lambda s}}{1 - e^{-\lambda}} \sum_{x=1}^{\infty} \frac{(\lambda s)^x e^{-\lambda s}}{x!\,(1 - e^{-\lambda s})} = \frac{e^{-\lambda}\left(e^{\lambda s} - 1\right)}{1 - e^{-\lambda}}.$$
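Again the closed form can be checked against a truncated version of the defining sum; the sketch below uses sympy and an arbitrary illustrative value $\lambda = 3/2$.

```python
import sympy as sp

s = sp.symbols('s')
lam = sp.Rational(3, 2)              # arbitrary illustrative value of lambda

pgf_claimed = sp.exp(-lam) * (sp.exp(lam*s) - 1) / (1 - sp.exp(-lam))
pgf_truncated = sum(s**x * lam**x * sp.exp(-lam) / (sp.factorial(x) * (1 - sp.exp(-lam)))
                    for x in range(1, 40))      # the defining sum, truncated at 40 terms

# Evaluate both at a sample point, e.g. s = 7/10; the printed values agree
# (up to the tiny truncation error of the finite sum).
point = sp.Rational(7, 10)
print(sp.N(pgf_claimed.subs(s, point), 12))
print(sp.N(pgf_truncated.subs(s, point), 12))
```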

Trick 2

Use the partition theorem for expectation.

Example

A game consists of a number of independent turns at each of which three, and only three, mutually exclusive events can occur: A, the game terminates without addition to the score; B, the game continues without addition to the score; C, the game continues with the addition of one point to the score. The events are not equiprobable and the average score per game is $m$. Show that the probability of scoring $x$ points in one game is $\dfrac{m^x}{(1+m)^{x+1}}$.

Suppose the respective probabilities of A, B, C are $p$, $q$, $r$, and condition on the first turn.

$$E(s^X) = E(s^X \mid A)\,p + E(s^X \mid B)\,q + E(s^X \mid C)\,r$$

$$E(s^X \mid A) = 1, \qquad E(s^X \mid B) = E(s^X), \qquad E(s^X \mid C) = E(s^{X+1}) = sE(s^X).$$

$$g_X(s) = p + g_X(s)\,q + s\,g_X(s)\,r \quad\Longrightarrow\quad g_X(s) = \frac{p}{1 - q - rs} = \frac{p}{1-q}\cdot\frac{1}{1 - \frac{rs}{1-q}} = \frac{p}{1-q}\sum_{k=0}^{\infty}\left(\frac{rs}{1-q}\right)^k.$$

Thus

$$P(X = x) = \frac{p}{1-q}\left(\frac{r}{1-q}\right)^x, \qquad x = 0, 1, \ldots$$

The average score per game is

$$g_X'(1) = E(X) = \left.\frac{pr}{(1 - q - rs)^2}\right|_{s=1} = \frac{pr}{(1 - q - r)^2} = \frac{r}{p} = m.$$

[Note that $p + q + r = 1$.]

$$P(X = x) = \frac{p}{1-q}\left(\frac{r}{1-q}\right)^x = \frac{p}{p+r}\left(\frac{r}{p+r}\right)^x = \frac{1}{1+m}\left(\frac{m}{1+m}\right)^x = \frac{m^x}{(1+m)^{x+1}}.$$
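An illustrative check of this result (assuming sympy, with arbitrary probabilities $p = 1/5$, $q = 1/2$, $r = 3/10$, so $m = r/p = 3/2$):

```python
import sympy as sp

s = sp.symbols('s')
p, q, r = sp.Rational(1, 5), sp.Rational(1, 2), sp.Rational(3, 10)   # p + q + r = 1
m = r / p                                      # average score per game, here 3/2

g = p / (1 - q - r*s)                          # p.g.f. derived above
poly = g.series(s, 0, 5).removeO()
coeffs = [poly.coeff(s, x) for x in range(5)]
claimed = [m**x / (1 + m)**(x + 1) for x in range(5)]
print(coeffs == claimed)                       # True
print(g.diff(s).subs(s, 1))                    # 3/2, i.e. g'(1) = m
```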
Trick 3

Use the sum of a sequence of simpler independent random variables. Then, if

$$X = \sum_{i=1}^{n} Y_i, \qquad g_X(s) = \prod_{i=1}^{n} g_{Y_i}(s),$$

which is $\left(g_Y(s)\right)^n$ if the $Y_i$ are identically distributed.
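The product rule is easy to check numerically: convolving two p.m.f.s gives the same answer as multiplying their p.g.f.s. The sketch below is illustrative (sympy assumed; the two p.m.f.s and the small pgf helper are ad hoc choices).

```python
import sympy as sp

s = sp.symbols('s')

def pgf(pmf):
    """p.g.f. of a finitely supported p.m.f. given as {value: probability}."""
    return sum(prob * s**v for v, prob in pmf.items())

# Two independent variables (illustrative p.m.f.s chosen arbitrarily).
y1 = {0: sp.Rational(1, 2), 1: sp.Rational(1, 2)}
y2 = {0: sp.Rational(1, 4), 1: sp.Rational(1, 2), 2: sp.Rational(1, 4)}

# p.m.f. of Y1 + Y2 by direct convolution ...
conv = {}
for a, pa in y1.items():
    for b, pb in y2.items():
        conv[a + b] = conv.get(a + b, 0) + pa * pb

# ... and via the product of the p.g.f.s: the coefficients must match.
product = sp.expand(pgf(y1) * pgf(y2))
print(all(product.coeff(s, v) == prob for v, prob in conv.items()))   # True
```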

Example

Derive the p.g.f. of $X \sim B(n, p)$.

Let $Y \sim B(1, p)$, so that $g_Y(s) = qs^0 + ps^1 = q + ps$. Then $X = \sum_{i=1}^{n} Y_i$ and

$$g_X(s) = (q + ps)^n.$$
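A quick symbolic check that the coefficients of $(q+ps)^n$ are indeed the binomial probabilities (illustrative sketch with sympy and the arbitrary choice $n = 4$):

```python
import sympy as sp

s, p = sp.symbols('s p')
q = 1 - p
n = 4                                     # arbitrary illustrative value

g = sp.expand((q + p*s)**n)
# Coefficient of s^x should be the binomial probability C(n, x) p^x q^(n-x).
checks = [sp.expand(g.coeff(s, x) - sp.binomial(n, x) * p**x * q**(n - x))
          for x in range(n + 1)]
print(checks)                             # [0, 0, 0, 0, 0]
```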

Example

Consider a sequence of success-failure trials and let the probability of a success for an individual trial be $p$. Find the probability mass function for the number of failures $X$ before the $n$th success occurs.

Let $Y$ be the number of failures which precede the 1st success. Then

$$P(Y = k) = q^k p = (1-q)\,q^k, \qquad k = 0, 1, \ldots,$$

and

$$g_Y(s) = \sum_{k=0}^{\infty} (1-q)\,q^k s^k = \frac{1-q}{1-qs} \sum_{k=0}^{\infty} (1-qs)(qs)^k = \frac{1-q}{1-qs} = \frac{p}{1-qs}.$$

Now

$$X = \sum_{i=1}^{n} Y_i,$$

so

$$g_X(s) = \left[g_Y(s)\right]^n = \left(\frac{p}{1-qs}\right)^n = p^n\left(1 + \frac{n}{1!}qs + \frac{n(n+1)}{2!}q^2s^2 + \cdots\right) = \sum_{k=0}^{\infty} \frac{(n+k-1)!}{k!\,(n-1)!}\,p^n q^k s^k = \sum_{k=0}^{\infty} \binom{n+k-1}{k} p^n q^k s^k.$$

Thus

$$P(X = x) = \binom{n+x-1}{x} p^n q^x, \qquad x = 0, 1, \ldots$$
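As a final sanity check, the formula can be compared with a simple simulation; the sketch below is not part of the original notes and uses arbitrary illustrative parameters $n = 3$, $p = 0.4$.

```python
import random
from math import comb

# Monte Carlo check of P(X = x) = C(n+x-1, x) p^n q^x (illustrative parameters).
random.seed(1)
n, p, q = 3, 0.4, 0.6

def failures_before_nth_success():
    failures = successes = 0
    while successes < n:
        if random.random() < p:
            successes += 1
        else:
            failures += 1
    return failures

trials = 200_000
counts = {}
for _ in range(trials):
    x = failures_before_nth_success()
    counts[x] = counts.get(x, 0) + 1

for x in range(5):
    empirical = counts.get(x, 0) / trials
    exact = comb(n + x - 1, x) * p**n * q**x
    print(x, round(empirical, 4), round(exact, 4))   # the two columns should be close
```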
