
1  Moment generating functions - supplement to chap 1


The moment generating function (mgf) of a random variable X is

    M_X(t) = E[e^{tX}]    (1)

For most random variables this will exist at least for t in some interval containing the origin. The mgf is a computational tool: by taking derivatives and evaluating them at t = 0 you can compute moments:

    M'(0) = E[X],    M''(0) = E[X^2],    M^{(k)}(0) = E[X^k]    (2)
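As a quick numerical illustration of (2) (a hypothetical example, not from the text): for a fair six-sided die, M(t) = (1/6) Σ_{k=1}^{6} e^{tk}, and central finite differences at t = 0 recover E[X] = 3.5 and E[X^2] = 91/6.

```python
import math

# mgf of a fair six-sided die (hypothetical example): M(t) = (1/6) * sum_k e^{t k}
def mgf(t):
    return sum(math.exp(t * k) for k in range(1, 7)) / 6

h = 1e-4
# central differences approximate M'(0) and M''(0)
first = (mgf(h) - mgf(-h)) / (2 * h)               # should be close to E[X] = 3.5
second = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2    # should be close to E[X^2] = 91/6

print(first, second)
```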

If Y = aX + b for constants a and b, then

    M_Y(t) = e^{bt} M_X(at)    (3)

If X_1, X_2, ..., X_n are independent and Y = X_1 + ... + X_n, then

    M_Y(t) = M_{X_1}(t) M_{X_2}(t) ... M_{X_n}(t)    (4)
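Both properties can be checked directly for a small example (illustrative parameter values, not from the text), using a random variable taking values 0 and 1 with P(X = 1) = p:

```python
import math

p, a, b, t = 0.3, 2.0, 1.5, 0.4        # illustrative values

def mgf_bern(s):                        # mgf of this two-valued X: (1-p) + p e^s
    return (1 - p) + p * math.exp(s)

# Property (3): Y = aX + b  =>  M_Y(t) = e^{bt} M_X(at)
direct3 = (1 - p) * math.exp(t * b) + p * math.exp(t * (a + b))  # E[e^{tY}] over the two outcomes
rule3 = math.exp(b * t) * mgf_bern(a * t)

# Property (4): for Y = X1 + X2 with X1, X2 independent copies of X,
# E[e^{tY}] summed over the outcomes 0, 1, 2 equals M_X(t)^2
direct4 = (1 - p)**2 + 2 * p * (1 - p) * math.exp(t) + p**2 * math.exp(2 * t)
rule4 = mgf_bern(t) ** 2

print(direct3, rule3, direct4, rule4)
```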

1.1  Discrete distributions

Bernoulli distribution: This is a random variable X that only takes the values 0 and 1. The parameter p is P(X = 1).

    E[X] = p,    Var(X) = p(1 - p),    M(t) = (1 - p) + p e^t    (5)

Binomial distribution: Flip a coin n times; X is the number of heads, p is the probability of heads.

    f(x|n, p) = (n choose x) p^x (1 - p)^{n-x},    x = 0, 1, 2, ..., n    (6)

    E[X] = np,    Var(X) = np(1 - p),    M(t) = [(1 - p) + p e^t]^n    (7)

Note that the binomial random variable is the sum of n independent Bernoulli random variables with the same p.
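A small sanity check (illustrative values of n and p): the mean and variance computed directly from the pmf (6) agree with the formulas np and np(1 - p).

```python
import math

n, p = 10, 0.35   # illustrative values

# binomial pmf f(x|n,p) = C(n,x) p^x (1-p)^(n-x)
pmf = [math.comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

mean = sum(x * f for x, f in enumerate(pmf))              # should equal n p
var = sum((x - mean)**2 * f for x, f in enumerate(pmf))   # should equal n p (1-p)

print(mean, var)
```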

Poisson: For λ > 0,

    f(x|λ) = e^{-λ} λ^x / x!,    x = 0, 1, 2, ...    (8)

    E[X] = λ,    Var(X) = λ,    M(t) = exp(λ(e^t - 1))    (9)
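The mgf formula (9) can be checked by summing the series E[e^{tX}] = Σ_x e^{-λ} λ^x e^{tx} / x! far enough out that the tail is negligible (illustrative λ and t):

```python
import math

lam, t = 2.5, 0.7   # illustrative values

# accumulate the series sum_x e^{-lam} lam^x / x! * e^{t x}, keeping the
# Poisson weight e^{-lam} lam^x / x! updated incrementally to avoid overflow
series = 0.0
term = math.exp(-lam)           # x = 0 weight
for x in range(150):            # terms are negligible long before x = 150
    series += term * math.exp(t * x)
    term *= lam / (x + 1)

closed = math.exp(lam * (math.exp(t) - 1))   # formula (9)
print(series, closed)
```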

Geometric: Flip a coin until we get heads for the first time. X is the number of tails we get before this first heads.

    f(x|p) = p(1 - p)^x,    x = 0, 1, 2, ...,    M(t) = p / (1 - (1 - p)e^t)    (10)

    E[X] = (1 - p)/p,    Var(X) = (1 - p)/p^2    (11)
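A quick check of (10) and (11) (illustrative p and t, with t small enough that (1 - p)e^t < 1 so the geometric series converges):

```python
import math

p, t = 0.3, 0.2   # illustrative values; (1-p) e^t ≈ 0.855 < 1

# truncated series for E[e^{tX}] = sum_x p (1-p)^x e^{t x}
series = sum(p * (1 - p)**x * math.exp(t * x) for x in range(2000))
closed = p / (1 - (1 - p) * math.exp(t))    # formula (10)

mean = sum(x * p * (1 - p)**x for x in range(2000))   # should equal (1-p)/p

print(series, closed, mean)
```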

Warning: some people define X to be the total number of flips, including the one that gave you the first head.

Negative binomial: Flip a coin until we get heads for the kth time. X is the number of tails we get before the kth head.

    E[X] = k(1 - p)/p,    Var(X) = k(1 - p)/p^2,    M(t) = [p / (1 - (1 - p)e^t)]^k    (12)

Warning: some people define X to be the total number of flips, including the ones that gave you the first k heads.
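As a sanity check on (12) (illustrative k and p; the pmf below is the standard one for this failures-count parameterization, which the text does not state explicitly): the mean and variance computed from the pmf match k(1 - p)/p and k(1 - p)/p^2.

```python
import math

k, p = 3, 0.4   # illustrative values

# standard negative binomial pmf for the number of tails before the k-th head:
# f(x) = C(x+k-1, x) p^k (1-p)^x   (assumed form, not stated in the text)
def pmf(x):
    return math.comb(x + k - 1, x) * p**k * (1 - p)**x

xs = range(500)   # tail beyond this is negligible
mean = sum(x * pmf(x) for x in xs)               # should equal k(1-p)/p
var = sum((x - mean)**2 * pmf(x) for x in xs)    # should equal k(1-p)/p^2

print(mean, var)
```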

1.2  Continuous distributions

Normal: For σ > 0 and any μ,

    f(x|μ, σ) = (1/(σ√(2π))) exp(-(x - μ)^2 / (2σ^2)),    -∞ < x < ∞    (13)

    E[X] = μ,    Var(X) = σ^2,    M(t) = exp(μt + σ^2 t^2 / 2)    (14)
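Formula (14) can be checked by numerically integrating e^{tx} against the pdf (13) (illustrative μ, σ, t; trapezoid rule over a range wide enough that the tails are negligible):

```python
import math

mu, sigma, t = 1.0, 2.0, 0.5   # illustrative values

def pdf(x):   # normal density, formula (13)
    return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

# trapezoid rule for E[e^{tX}] = integral of e^{tx} f(x) dx
lo, hi, n = mu - 15 * sigma, mu + 15 * sigma, 60000
h = (hi - lo) / n
vals = [math.exp(t * (lo + i * h)) * pdf(lo + i * h) for i in range(n + 1)]
integral = h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

closed = math.exp(mu * t + sigma**2 * t**2 / 2)   # formula (14)
print(integral, closed)
```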

Exponential: For λ > 0,

    f(x|λ) = λ e^{-λx},    x ≥ 0    (15)

    E[X] = 1/λ,    Var(X) = 1/λ^2,    M(t) = λ/(λ - t)    (16)
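The same numerical-integration check works for (16), valid for t < λ (illustrative values):

```python
import math

lam, t = 2.0, 0.5   # illustrative values; need t < lam

# trapezoid rule for E[e^{tX}] = integral_0^inf lam e^{-lam x} e^{t x} dx;
# the integrand is lam e^{-(lam - t) x}, negligible past x = 30 here
hi, n = 30.0, 60000
h = hi / n
vals = [lam * math.exp(-(lam - t) * i * h) for i in range(n + 1)]
integral = h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

closed = lam / (lam - t)   # formula (16)
print(integral, closed)
```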

Gamma: For α, λ > 0,

    f(x|α, λ) = (λ^α / Γ(α)) x^{α-1} e^{-λx},    x ≥ 0    (17)

If α is an integer, Γ(α) = (α - 1)!.

    E[X] = α/λ,    Var(X) = α/λ^2,    M(t) = (λ/(λ - t))^α    (18)

This shows that the sum of k independent exponential random variables with parameter λ has a gamma distribution with α = k and the same λ. Warning: some people parameterize the gamma distribution differently. Their β is my 1/λ.

Chi-squared or χ^2: Let Z_1, Z_2, ..., Z_n be independent standard normal RVs. Let

    X = Σ_{i=1}^{n} Z_i^2    (19)

Then X has the chi-squared distribution with n degrees of freedom. It can be shown that this is the gamma distribution with α = n/2 and λ = 1/2. So the pdf is

    f(x|n) = (1 / (2^{n/2} Γ(n/2))) x^{n/2-1} e^{-x/2},    x ≥ 0    (20)

    E[X] = n,    Var(X) = 2n,    M(t) = (1/(1 - 2t))^{n/2}    (21)

Note that the sum of independent chi-squared random variables is again chi-squared, with the numbers of degrees of freedom adding.

Student's t: Let U and V be independent random variables. U has a standard normal distribution, and V has a chi-squared distribution with n degrees of freedom. Let

    T = U / √(V/n)    (22)

The distribution of T is called Student's t distribution (or just the t distribution) with n degrees of freedom. The pdf is

    f(x|n) = (Γ((n+1)/2) / (Γ(n/2) √(nπ))) (1 + x^2/n)^{-(n+1)/2},    -∞ < x < ∞    (23)

The mgf is not defined.
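As a final sanity check on the chi-squared pdf (20) (illustrative choice n = 4): numerical integration should give total probability 1 and mean E[X] = n.

```python
import math

n = 4   # degrees of freedom (illustrative value)

def pdf(x):   # chi-squared pdf, formula (20)
    return x**(n / 2 - 1) * math.exp(-x / 2) / (2**(n / 2) * math.gamma(n / 2))

# trapezoid rule on [0, 100]; the tail beyond 100 is negligible for n = 4
hi, m = 100.0, 100000
h = hi / m
xs = [i * h for i in range(m + 1)]
total = h * (sum(pdf(x) for x in xs) - 0.5 * (pdf(xs[0]) + pdf(xs[-1])))
mean = h * (sum(x * pdf(x) for x in xs)
            - 0.5 * (xs[0] * pdf(xs[0]) + xs[-1] * pdf(xs[-1])))

print(total, mean)   # total probability and E[X]
```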
