Dave Goldsman
Georgia Institute of Technology, Atlanta, GA, USA
8/19/15
Outline

1. Preliminaries
2. Great Expectations
3. Limit Theorems
4. Statistics Tidbits
Preliminaries

Will assume that you know about sample spaces, events, and the definition of probability.

Definition: P(A|B) ≡ P(A ∩ B)/P(B) is the conditional probability of A given B.

Example: Toss a fair die. Let A = {1, 2, 3} and B = {3, 4, 5, 6}. Then

P(A|B) = P(A ∩ B)/P(B) = (1/6)/(4/6) = 1/4.
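The die-toss example can be checked by direct enumeration (a Python sketch; the set names are just illustrative):

```python
from fractions import Fraction

omega = {1, 2, 3, 4, 5, 6}   # sample space for one toss of a fair die
A = {1, 2, 3}
B = {3, 4, 5, 6}

def prob(event):
    """P(event) under equally likely outcomes."""
    return Fraction(len(event), len(omega))

p_a_given_b = prob(A & B) / prob(B)   # (1/6) / (4/6)
print(p_a_given_b)  # 1/4
```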
Definition: If P(A ∩ B) = P(A)P(B), then A and B are independent events.

Example: Toss two fair dice. Let A = {first die is 3} and B = {sum is 7}. Then P(A) = 1/6, P(B) = 1/6, and P(A ∩ B) = P((3, 4)) = 1/36 = P(A)P(B), so A and B are independent.
Example: Let X denote the sum of two fair dice, a discrete random variable. Then

P(X = x) = 1/36 if x = 2; 2/36 if x = 3; …; 1/36 if x = 12; and 0 otherwise.
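The two-dice pmf above can be rebuilt by brute-force enumeration (a Python sketch):

```python
from collections import Counter
from fractions import Fraction

# Enumerate all 36 equally likely outcomes of two fair dice.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
pmf = {x: Fraction(c, 36) for x, c in sorted(counts.items())}

for x, p in pmf.items():
    print(x, p)
```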
Example: Flip two fair coins and let X be the number of heads. Then

f(x) = 1/4 if x = 0 or x = 2; 1/2 if x = 1; and 0 otherwise.

Examples: Here are some well-known discrete RVs that you may know: Bernoulli(p), Binomial(n, p), Geometric(p), Negative Binomial, Poisson(λ), etc.
Example: A continuous RV is described by its pdf f(x); e.g., X ~ Exp(λ) has

f(x) = λe^{−λx} if x ≥ 0; 0 otherwise.

Other well-known continuous RVs include the Uniform(a, b) and the Normal(μ, σ²).
Definition: The cumulative distribution function (cdf) of X is F(x) ≡ P(X ≤ x).

Example: Flip two fair coins and let X be the number of heads. Then

F(x) = 0 if x < 0; 1/4 if 0 ≤ x < 1; 3/4 if 1 ≤ x < 2; and 1 if x ≥ 2.
Example: Consider the discrete RV X with pmf

P(X = x) = 0.25 if x = 2; 0.10 if x = 3; 0.65 if x = 4.2; and 0 otherwise.

Can't use a die toss to simulate this random variable. Instead, use what's called the inverse transform method.
x     f(x)   P(X ≤ x)   Unif(0,1)'s
2     0.25   0.25       [0.00, 0.25]
3     0.10   0.35       (0.25, 0.35]
4.2   0.65   1.00       (0.35, 1.00)
Example: If U ~ Unif(0, 1), then X = −(1/λ) ln(1 − U) ~ Exp(λ).
Some Exercises: In the following, I'll assume that you can use Excel (or whatever) to simulate independent Unif(0,1) RVs. (We'll review independence in a little while.)
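As one way into these exercises (a Python sketch standing in for Excel; `draw_x` is a hypothetical helper name), the inverse transform method for the three-point RV above looks like:

```python
import random

def draw_x(u=None):
    """Inverse transform for P(X=2)=0.25, P(X=3)=0.10, P(X=4.2)=0.65."""
    u = random.random() if u is None else u   # U ~ Unif(0,1)
    if u <= 0.25:
        return 2
    elif u <= 0.35:
        return 3
    return 4.2

# Sanity check: empirical frequencies should approach the pmf.
random.seed(42)
n = 100_000
freq = sum(draw_x() == 4.2 for _ in range(n)) / n
print(freq)  # should be close to 0.65
```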
Great Expectations
Definition: The expected value (or mean) of a RV X is

E[X] = Σ_x x f(x) if X is discrete, or ∫_ℝ x f(x) dx if X is continuous; in either case, E[X] = ∫_ℝ x dF(x).

Example: Suppose that X ~ Bernoulli(p). Then

X = 1 with prob. p; 0 with prob. 1 − p (= q),

and we have E[X] = Σ_x x f(x) = p.
Example (Law of the Unconscious Statistician): Suppose X is discrete, taking three values with respective probabilities f(x) = 0.3, 0.6, and 0.1. Then

E[X³] = Σ_x x³ f(x).
Definition: The moment generating function (mgf) of X is M_X(t) ≡ E[e^{tX}].

Example: Suppose X ~ Exp(λ). Then

M_X(t) = ∫_0^∞ e^{tx} λe^{−λx} dx = λ/(λ − t), if λ > t.
Further,

(d/dt) M_X(t) = λ/(λ − t)², for λ > t. So

E[X] = (d/dt) M_X(t) |_{t=0} = λ/(λ − t)² |_{t=0} = 1/λ,

E[X²] = (d²/dt²) M_X(t) |_{t=0} = 2λ/(λ − t)³ |_{t=0} = 2/λ².

Thus,

Var(X) = E[X²] − (E[X])² = 2/λ² − 1/λ² = 1/λ².
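These mgf-derived moments can be sanity-checked by simulation (a Python sketch; the rate λ = 2 is an arbitrary choice):

```python
import random

random.seed(1)
lam = 2.0        # arbitrary rate for the check
n = 200_000
xs = [random.expovariate(lam) for _ in range(n)]

mean = sum(xs) / n                           # should be near E[X] = 1/lam = 0.5
var = sum((x - mean) ** 2 for x in xs) / n   # should be near Var(X) = 1/lam**2 = 0.25
print(mean, var)
```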
Example: Suppose X is discrete with pmf f(x) = 1/4, 1/2, 1/4 at its three possible values, and let Y = X³. The pmf of Y is obtained by summing f(x) over the x's that map to each value y = x³.

Example: Suppose X has pdf f(x) = |x| for −1 ≤ x ≤ 1, and let Y = X². Then for 0 < y < 1,

F_Y(y) = P(X² ≤ y) = P(−√y ≤ X ≤ √y) = ∫_{−√y}^{√y} |x| dx = y,

so Y ~ Unif(0, 1).

Inverse Transform Theorem: If X is a continuous RV with cdf F(x), then F(X) ~ Unif(0, 1), since

P(F(X) ≤ y) = P(X ≤ F⁻¹(y)) = F(F⁻¹(y)) = y, 0 < y < 1.
Example: If U ~ Unif(0, 1), then the Inverse Transform Theorem gives X = −(1/λ) ln(1 − U) ~ Exp(λ).
Exercise: The Weibull(λ, β) distribution has cdf F(x) = 1 − e^{−λx^β}, x ≥ 0. Setting F(X) = U and solving gives

X = [−(1/λ) ln(1 − U)]^{1/β}.

Now pick your favorite λ and β, and use this result to generate values of X. In fact, make a histogram of your X values. Are there any interesting values of λ and β you could've chosen?
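One possible take on this exercise (a Python sketch; `weibull_it` is a hypothetical helper, and λ = 1, β = 2 are just sample choices, with β = 1 recovering the Exp(λ)):

```python
import math
import random

def weibull_it(lam, beta, u=None):
    """Inverse transform for cdf F(x) = 1 - exp(-lam * x**beta), x >= 0."""
    u = random.random() if u is None else u   # U ~ Unif(0,1)
    return (-math.log(1.0 - u) / lam) ** (1.0 / beta)

random.seed(7)
lam, beta = 1.0, 2.0
xs = [weibull_it(lam, beta) for _ in range(100_000)]

# Theoretical mean for this parameterization: Gamma(1 + 1/beta) / lam**(1/beta).
mean_theory = math.gamma(1 + 1 / beta) / lam ** (1 / beta)
print(sum(xs) / len(xs), mean_theory)
```

Binning the `xs` into a histogram shows how the shape changes as β moves above or below 1.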
Definition: The joint cdf of two RVs X and Y is F(x, y) ≡ P(X ≤ x, Y ≤ y), for all x, y.
Definition: If X and Y are jointly distributed discrete RVs, then the marginal pmfs are

f_X(x) = P(X = x) = Σ_y f(x, y) and f_Y(y) = P(Y = y) = Σ_x f(x, y).

Example: The following table gives the joint pmf f(x, y), along with the accompanying marginals.
f(x, y)   X = 2   X = 3   X = 4   f_Y(y)
Y = 4     0.3     0.2     0.1     0.6
Y = 6     0.1     0.2     0.1     0.4
f_X(x)    0.4     0.4     0.2
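The marginals in the table can be recomputed, and independence checked cell-by-cell (a Python sketch):

```python
# Joint pmf from the table: keys are (x, y) pairs.
f = {(2, 4): 0.3, (3, 4): 0.2, (4, 4): 0.1,
     (2, 6): 0.1, (3, 6): 0.2, (4, 6): 0.1}

xs = sorted({x for x, _ in f})
ys = sorted({y for _, y in f})
fX = {x: sum(f[(x, y)] for y in ys) for x in xs}   # marginal pmf of X
fY = {y: sum(f[(x, y)] for x in xs) for y in ys}   # marginal pmf of Y

# Independence requires f(x, y) = fX(x) fY(y) for EVERY cell.
independent = all(abs(f[(x, y)] - fX[x] * fY[y]) < 1e-12 for x in xs for y in ys)
print(fX, fY, independent)
```

Here f(2, 4) = 0.3 while f_X(2) f_Y(4) = 0.24, so this X and Y are not independent.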
Definition: The conditional pdf (or pmf) of Y given X = x is f(y|x) ≡ f(x, y)/f_X(x).

Example: Suppose f(x, y) = (21/4) x² y for x² ≤ y ≤ 1. Then

f_X(x) = ∫_{x²}^1 (21/4) x² y dy = (21/8) x² (1 − x⁴), −1 ≤ x ≤ 1,

so that

f(y|x) = f(x, y)/f_X(x) = [(21/4) x² y]/[(21/8) x² (1 − x⁴)] = 2y/(1 − x⁴), x² ≤ y ≤ 1.

Remark: If X and Y are independent, then

f(y|x) = f(x, y)/f_X(x) = f_X(x) f_Y(y)/f_X(x) = f_Y(y).
Definition: The conditional expectation of Y given X = x is E[Y|x] ≡ ∫_ℝ y f(y|x) dy.

Example: Again suppose f(x, y) = (21/4) x² y for x² ≤ y ≤ 1, so that f(y|x) = 2y/(1 − x⁴). Then

E[Y|x] = ∫_{x²}^1 y · 2y/(1 − x⁴) dy = (2/3) (1 − x⁶)/(1 − x⁴).
Theorem (double expectation): E[E[Y|X]] = E[Y].

Proof:

E[E[Y|X]] = ∫_ℝ E[Y|x] f_X(x) dx = ∫_ℝ [∫_ℝ y f(y|x) dy] f_X(x) dx = ∫_ℝ ∫_ℝ y f(y|x) f_X(x) dx dy = ∫_ℝ ∫_ℝ y f(x, y) dx dy = E[Y].
Old Example: Suppose f(x, y) = (21/4) x² y, if x² ≤ y ≤ 1. Note that, by previous examples, we know f_X(x) = (21/8) x² (1 − x⁴), f_Y(y) = (7/2) y^{5/2} for 0 ≤ y ≤ 1, and E[Y|x] = (2/3)(1 − x⁶)/(1 − x⁴).

Solving directly,

E[Y] = ∫_0^1 y f_Y(y) dy = ∫_0^1 (7/2) y^{7/2} dy = 7/9.

Solving by double expectation,

E[Y] = ∫_{−1}^1 E[Y|x] f_X(x) dx = ∫_{−1}^1 [(2/3)(1 − x⁶)/(1 − x⁴)] (21/8) x² (1 − x⁴) dx = 7/9.
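Both routes to E[Y] = 7/9 can be verified numerically (a Python sketch using a simple midpoint rule; `midpoint` is a hypothetical helper):

```python
def midpoint(g, a, b, n=200_000):
    """Midpoint-rule approximation of the integral of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Direct route: E[Y] = integral_0^1 (7/2) y^(7/2) dy.
direct = midpoint(lambda y: (7 / 2) * y ** 3.5, 0.0, 1.0)

# Double-expectation route: after cancelling (1 - x^4), the integrand on
# [-1, 1] is (7/4) x^2 (1 - x^6).
double = midpoint(lambda x: (7 / 4) * x * x * (1 - x ** 6), -1.0, 1.0)

print(direct, double)  # both close to 7/9
```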
Example: If X₁, …, Xₙ are iid with pdf/pmf f(x) and the sample mean is X̄ₙ ≡ Σ_{i=1}^n Xᵢ/n, then E[X̄ₙ] = E[Xᵢ] and Var(X̄ₙ) = Var(Xᵢ)/n.

Thus, the variance of the sample mean decreases as n increases.

But not all RVs are independent…
Definition: The covariance between X and Y is Cov(X, Y) ≡ E[XY] − E[X]E[Y], and the correlation is

ρ ≡ Cov(X, Y)/√(Var(X) Var(Y)).

Theorem: −1 ≤ ρ ≤ 1.
Example: Consider the joint pmf

f(x, y)   X = 2   X = 3   X = 4   f_Y(y)
Y = 40    0.00    0.20    0.10    0.3
Y = 50    0.15    0.10    0.05    0.3
Y = 60    0.30    0.00    0.10    0.4
f_X(x)    0.45    0.30    0.25

and then

ρ = (E[XY] − E[X]E[Y])/√(Var(X) Var(Y)) = −0.415.
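The ρ = −0.415 computation can be reproduced directly from the table (a Python sketch; `expect` is a hypothetical helper):

```python
import math

# Joint pmf from the table above.
f = {(2, 40): 0.00, (3, 40): 0.20, (4, 40): 0.10,
     (2, 50): 0.15, (3, 50): 0.10, (4, 50): 0.05,
     (2, 60): 0.30, (3, 60): 0.00, (4, 60): 0.10}

def expect(g):
    """E[g(X, Y)] under the joint pmf."""
    return sum(g(x, y) * p for (x, y), p in f.items())

EX, EY = expect(lambda x, y: x), expect(lambda x, y: y)
cov = expect(lambda x, y: x * y) - EX * EY
var_x = expect(lambda x, y: x * x) - EX ** 2
var_y = expect(lambda x, y: y * y) - EY ** 2
rho = cov / math.sqrt(var_x * var_y)
print(round(rho, 3))  # -0.415
```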
X ~ Bernoulli(p).

f(x) = p if x = 1; 1 − p (= q) if x = 0.

Y ~ Binomial(n, p). If X₁, X₂, …, Xₙ are iid Bern(p), then Y = Σ_{i=1}^n Xᵢ ~ Bin(n, p), with pmf

f(y) = C(n, y) p^y q^{n−y}, y = 0, 1, …, n.
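The sum-of-Bernoullis characterization can be checked by simulation (a Python sketch; n = 10, p = 0.3, and the test point y = 3 are arbitrary choices):

```python
import random
from math import comb

random.seed(3)
n, p, reps = 10, 0.3, 200_000

# Y = number of successes in n Bernoulli(p) trials.
ys = [sum(random.random() < p for _ in range(n)) for _ in range(reps)]

empirical = ys.count(3) / reps                 # estimated P(Y = 3)
exact = comb(n, 3) * p ** 3 * (1 - p) ** 7     # Binomial pmf at y = 3
print(empirical, exact)
```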
X ~ Geometric(p), the number of Bern(p) trials until the first success:

f(x) = (1 − p)^{x−1} p, x = 1, 2, ….

X ~ Poisson(λ).
Definition: A counting process N (t) tallies the number of arrivals
observed in [0, t]. A Poisson process is a counting process satisfying
the following.
i. Arrivals occur one-at-a-time at rate λ (e.g., λ = 4 customers/hr).
ii. Independent increments, i.e., the numbers of arrivals in disjoint
time intervals are independent.
iii. Stationary increments, i.e., the distribution of the number of
arrivals in [s, s + t] only depends on t.
X ~ Pois(λ) is the number of arrivals that a Poisson process experiences in one time unit, i.e., N(1). Its pmf is

f(x) = e^{−λ} λ^x / x!, x = 0, 1, …,

with E[X] = Var(X) = λ and mgf M_X(t) = e^{λ(e^t − 1)}.
X ~ Uniform(a, b): f(x) = 1/(b − a) for a ≤ x ≤ b, with E[X] = (a + b)/2, Var(X) = (b − a)²/12, and M_X(t) = (e^{tb} − e^{ta})/(t(b − a)).
X ~ Gamma(α, λ).

f(x) = λ^α x^{α−1} e^{−λx}/Γ(α), x ≥ 0,

where the gamma function is Γ(α) ≡ ∫_0^∞ t^{α−1} e^{−t} dt.

E[X] = α/λ, Var(X) = α/λ², M_X(t) = [λ/(λ − t)]^α for t < λ.

If X₁, X₂, …, Xₙ are iid Exp(λ), then Y ≡ Σ_{i=1}^n Xᵢ ~ Gamma(n, λ). The Gamma(n, λ) is also called the Erlangₙ(λ). It has cdf

F_Y(y) = 1 − e^{−λy} Σ_{j=0}^{n−1} (λy)ʲ/j!, y ≥ 0.
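The Erlang cdf formula can be compared against a simulated sum of exponentials (a Python sketch; n = 3, λ = 2, y = 1.5 are arbitrary choices):

```python
import math
import random

def erlang_cdf(n, lam, y):
    """F_Y(y) = 1 - e^{-lam*y} * sum_{j=0}^{n-1} (lam*y)^j / j!"""
    return 1.0 - math.exp(-lam * y) * sum((lam * y) ** j / math.factorial(j)
                                          for j in range(n))

random.seed(11)
n, lam, y = 3, 2.0, 1.5
reps = 100_000
# Y = sum of n iid Exp(lam) RVs; count how often Y <= y.
hits = sum(sum(random.expovariate(lam) for _ in range(n)) <= y
           for _ in range(reps))
print(hits / reps, erlang_cdf(n, lam, y))
```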
X ~ Triangular(a, b, c), with minimum a, mode b, and maximum c:

f(x) = 2(x − a)/[(b − a)(c − a)] if a < x ≤ b; 2(c − x)/[(c − b)(c − a)] if b < x ≤ c; 0 otherwise.

E[X] = (a + b + c)/3.
X ~ Beta(a, b), a, b > 0:

f(x) = [Γ(a + b)/(Γ(a)Γ(b))] x^{a−1} (1 − x)^{b−1}, 0 ≤ x ≤ 1,

with

E[X] = a/(a + b) and Var(X) = ab/[(a + b)²(a + b + 1)].
X ~ Normal(μ, σ²) (also written Nor(μ, σ²)):

f(x) = (1/√(2πσ²)) exp[−(x − μ)²/(2σ²)], x ∈ ℝ,

with E[X] = μ and Var(X) = σ².
Limit Theorems
Corollary (of theorem from 7): If X₁, …, Xₙ are iid Nor(μ, σ²), then the sample mean X̄ₙ ~ Nor(μ, σ²/n).

This is a special case of the Law of Large Numbers, which says that X̄ₙ approximates μ well as n becomes large.

Definition: The sequence of RVs Y₁, Y₂, … with respective cdfs F_{Y₁}(y), F_{Y₂}(y), … converges in distribution to the RV Y having cdf F_Y(y) if lim_{n→∞} F_{Yₙ}(y) = F_Y(y) for all y belonging to the continuity set of F_Y(y). Notation: Yₙ →d Y.
Central Limit Theorem: If X₁, X₂, … are iid with mean μ and variance σ², then

Zₙ ≡ (Σ_{i=1}^n Xᵢ − nμ)/(σ√n) →d Nor(0, 1).

Example: Suppose X₁, …, X₁₀₀ are iid Exp(1), so that μ = σ² = 1. Then

P(90 ≤ Σ_{i=1}^{100} Xᵢ ≤ 110) = P((90 − 100)/√100 ≤ Z₁₀₀ ≤ (110 − 100)/√100) ≈ P(−1 ≤ Nor(0, 1) ≤ 1) = 0.683.
Remark: In the above example, the actual distribution of Σ_{i=1}^{100} Xᵢ is Erlang. One could have calculated the exact probability (assuming proper statistical software), and it turns out that the CLT hits the right answer on the nose.

Exercise: Demonstrate that the CLT actually works: generate iid Unif(0,1) RVs X₁, X₂, X₃, and now make a histogram of X₁ + X₂ + X₃ over many independent trials; it should already look approximately normal.
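A minimal version of this CLT exercise (a Python sketch; the bin width 0.25 is an arbitrary choice):

```python
import random
import statistics

random.seed(5)
reps = 100_000
# Each draw is the sum of three iid Unif(0,1) RVs.
sums = [random.random() + random.random() + random.random() for _ in range(reps)]

# Mean should be near 3(1/2) = 1.5 and variance near 3(1/12) = 0.25.
print(statistics.mean(sums), statistics.pvariance(sums))

# Crude text histogram over [0, 3]; the bar heights should look bell-shaped.
for b in range(12):
    lo, hi = b * 0.25, (b + 1) * 0.25
    count = sum(lo <= s < hi for s in sums)
    print(f"{lo:4.2f}-{hi:4.2f} {'#' * (count // 1000)}")
```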
Statistics Tidbits
For now, suppose that X₁, X₂, …, Xₙ are iid from some distribution with finite mean μ and finite variance σ².

For this iid case, we have already seen that E[X̄ₙ] = μ, i.e., X̄ₙ is unbiased for μ.

Definition: The sample variance is S² ≡ (1/(n − 1)) Σ_{i=1}^n (Xᵢ − X̄ₙ)².
If σ² is known, a 100(1 − α)% confidence interval for μ is

[X̄ₙ − z_{α/2} σ/√n, X̄ₙ + z_{α/2} σ/√n],

where z_γ is the 1 − γ quantile of the standard normal distribution, i.e., z_γ ≡ Φ⁻¹(1 − γ).
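Computing this z-interval (a Python sketch; the data values and σ = 2.0 are hypothetical):

```python
import math
import statistics
from statistics import NormalDist

# Hypothetical sample; sigma = 2.0 is assumed known for the z-interval.
xs = [4.1, 5.3, 3.8, 6.0, 5.2, 4.7, 5.5, 4.9]
sigma = 2.0
alpha = 0.05

n = len(xs)
xbar = statistics.mean(xs)
z = NormalDist().inv_cdf(1 - alpha / 2)   # z_{alpha/2}, about 1.96
half = z * sigma / math.sqrt(n)
print(f"95% CI for mu: [{xbar - half:.3f}, {xbar + half:.3f}]")
```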
Similarly, a 100(1 − α)% confidence interval for σ² is

[(n − 1)S²/χ²_{α/2,n−1}, (n − 1)S²/χ²_{1−α/2,n−1}],

where χ²_{γ,ν} is the 1 − γ quantile of the χ² distribution with ν degrees of freedom.