
# Given an event $A$ of a sample space $S$, denote its probability by $P(A)$.

Axiom 1. $0 \le P(A) \le 1$

Axiom 2. $P(S) = 1$

Axiom 3. If $A_1, A_2, \ldots$ are mutually exclusive ($A_i A_j = \emptyset$ for $i \ne j$), then
$$P\left(\bigcup_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i)$$
Theorem 1 (Inclusion-Exclusion Principle).
$$P(A_1 \cup A_2 \cup \cdots \cup A_n) = \sum_{r=1}^{n} (-1)^{r+1} \sum_{1 \le i_1 < i_2 < \cdots < i_r \le n} P(A_{i_1} A_{i_2} \cdots A_{i_r})$$
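Inclusion-exclusion can be checked by brute force on a small finite sample space. A minimal sketch, with hypothetical events on a uniform 12-point space:

```python
from itertools import combinations

# Hypothetical example: uniform sample space {0,...,11} with three events.
S = set(range(12))
A = [set(range(0, 6)), set(range(4, 9)), {0, 5, 8, 11}]

def prob(event):
    return len(event) / len(S)

# Left side: P(A1 ∪ A2 ∪ A3) computed directly from the union.
lhs = prob(A[0] | A[1] | A[2])

# Right side: sum over r of (-1)^(r+1) times P of every r-fold intersection.
n = len(A)
rhs = 0.0
for r in range(1, n + 1):
    for idx in combinations(range(n), r):
        inter = set(S)
        for i in idx:
            inter &= A[i]
        rhs += (-1) ** (r + 1) * prob(inter)

print(lhs, rhs)  # the two sides agree
```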
Definition 1 (Conditional Probability). The probability of B given A is
$$P(B|A) = \frac{P(AB)}{P(A)}$$

1. $P(AB) = P(A)P(B|A) = P(B)P(A|B)$
2. $P(A_1 A_2 \cdots A_n) = P(A_1)P(A_2|A_1)P(A_3|A_1 A_2) \cdots P(A_n|A_1 \cdots A_{n-1})$
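The multiplication rule can be illustrated with a standard example (drawing two cards without replacement); exact arithmetic with `fractions` keeps the check clean:

```python
from fractions import Fraction

# Hypothetical example: draw two cards without replacement from a 52-card deck.
# A1 = first card is an ace, A2 = second card is an ace.
p_a1 = Fraction(4, 52)
p_a2_given_a1 = Fraction(3, 51)   # one ace already removed from the deck

# Multiplication rule: P(A1 A2) = P(A1) P(A2|A1)
p_both = p_a1 * p_a2_given_a1

# Recover the conditional probability from the joint, as in the definition.
p_cond = p_both / p_a1            # P(A2|A1) = P(A1 A2) / P(A1)

print(p_both, p_cond)  # 1/221 1/17
```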
Theorem 2 (Law of Total Probability).
$$P(B) = \sum_{i=1}^{n} P(A_i)P(B|A_i)$$
Theorem 3 (Bayes' Rule).
$$P(A_k|B) = \frac{P(A_k)P(B|A_k)}{\sum_{i=1}^{n} P(A_i)P(B|A_i)}$$
For two events,
1. $P(B) = P(A)P(B|A) + P(A^C)P(B|A^C)$
2. $P(A|B) = \dfrac{P(A)P(B|A)}{P(A)P(B|A) + P(A^C)P(B|A^C)}$
Definition 2 (Independence). Two events are independent if $P(AB) = P(A)P(B)$. Equivalently, $P(A|B) = P(A)$ and $P(B|A) = P(B)$.
Definition 3 (Cumulative Distribution Function). $F_X(x) = P\{X \le x\}$
Definition 4 (Probability Density Function). For a discrete random variable, $f_X(x) = P\{X = x\}$ (the probability mass function).
Definition 5 (Continuous Random Variable). $P\{X = x\} = 0$ for all $x \in \mathbb{R}$. $F_X$ is a continuous function, with $f_X(x) = F'_X(x)$
Definition 6 (Expectation).
$$E(X) = \begin{cases} \sum_x x f_X(x) & \text{for a discrete r.v.} \\ \int_{-\infty}^{\infty} x f_X(x)\,dx & \text{for a continuous r.v.} \end{cases}$$
$E(aX + b) = aE(X) + b$
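Both the discrete sum and the linearity rule are easy to verify exactly. A sketch with a hypothetical fair six-sided die:

```python
from fractions import Fraction

# Hypothetical example: X is a fair six-sided die roll.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# E(X) = sum over x of x * f_X(x)
ex = sum(x * p for x, p in pmf.items())

# E(2X + 1) computed directly from the pmf; should equal 2*E(X) + 1.
ex_lin = sum((2 * x + 1) * p for x, p in pmf.items())

print(ex, ex_lin)  # 7/2 8
```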
Definition 7 (Moment Generating Function). $M_X(t) = E[e^{tX}]$
If there is a $\delta > 0$ such that $M(t) < \infty$ for every $|t| < \delta$, then $E(X^k) = M^{(k)}(0)$
Definition 8 (Variance). $\sigma_X^2 = E[(X - EX)^2] = EX^2 - (EX)^2$
$Var(aX + b) = a^2 Var(X)$, and $Var(X + Y) = Var(X) + Var(Y)$ if $X, Y$ are independent.
Theorem 4 (Tail Sum Formula). For a non-negative integer-valued r.v. X,
$$E[X] = \sum_{k=1}^{\infty} P(X \ge k) = \sum_{k=0}^{\infty} P(X > k)$$
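The tail-sum formula can be checked numerically for a geometric random variable, where $P(X \ge k) = (1-p)^{k-1}$ and $E[X] = 1/p$; the parameter below is hypothetical and the series is truncated:

```python
# Tail-sum check for a geometric r.v. with hypothetical p = 0.3:
# P(X >= k) = (1-p)**(k-1), so the tail sum should equal E[X] = 1/p.
p = 0.3
tail_sum = sum((1 - p) ** (k - 1) for k in range(1, 200))  # truncated series

print(tail_sum)  # close to 1/p = 3.333...
```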
Definition 9 (Binomial Distribution). X is the number of successes that occur in n independent Bernoulli trials, denoted as $X \sim B(n, p)$.
$$b(i; n, p) = \binom{n}{i} p^i (1-p)^{n-i}$$
$EX = np$, $Var(X) = npq$ where $q = 1 - p$
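The pmf, mean, and variance can be checked directly with `math.comb`; the parameters below are hypothetical:

```python
from math import comb

# Binomial pmf b(i; n, p) = C(n, i) p^i (1-p)^(n-i), hypothetical n = 10, p = 0.4.
n, p = 10, 0.4
pmf = [comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(n + 1)]

total = sum(pmf)                                           # should be 1
mean = sum(i * pmf[i] for i in range(n + 1))               # should be np = 4
var = sum((i - mean) ** 2 * pmf[i] for i in range(n + 1))  # np(1-p) = 2.4

print(total, mean, var)
```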
Definition 10 (Poisson Distribution). X is the number of occurrences in a specific interval such that the occurrences are independent between intervals, denoted as $X \sim P(\lambda)$.
$$p(i; \lambda) = e^{-\lambda} \frac{\lambda^i}{i!}$$
$EX = \lambda$, $Var(X) = \lambda$
Suppose $X \sim B(n, p)$ with n large and p small ($< 0.1$); then approximately $X \sim P(\lambda)$ with $\lambda = np$.
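The quality of the Poisson approximation is easy to see numerically; a sketch with hypothetical $n = 500$, $p = 0.01$, so $\lambda = np = 5$:

```python
from math import comb, exp, factorial

# Poisson approximation to the binomial: hypothetical n = 500, p = 0.01.
n, p = 500, 0.01
lam = n * p  # λ = np = 5

binom = [comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(21)]
poisson = [exp(-lam) * lam**i / factorial(i) for i in range(21)]

# Largest pointwise gap between the two pmfs over i = 0..20.
max_gap = max(abs(b, ) if False else abs(b - q) for b, q in zip(binom, poisson))
print(max_gap)  # small: the two pmfs nearly coincide
```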
Definition 11 (Geometric Distribution). X is the number of trials required until a success is obtained in repeated Bernoulli trials. Denote $X \sim Geom(p)$.
$$f_X(n) = (1-p)^{n-1} p$$
$EX = \frac{1}{p}$, $Var(X) = \frac{1-p}{p^2}$
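The mean and variance formulas can be recovered from the pmf by truncating the series; the parameter is hypothetical:

```python
# Geometric pmf f_X(n) = (1-p)^(n-1) p, hypothetical p = 0.25.
# Check E(X) = 1/p = 4 and Var(X) = (1-p)/p^2 = 12 by truncated summation.
p = 0.25
pmf = [(1 - p) ** (n - 1) * p for n in range(1, 400)]

mean = sum(n * pmf[n - 1] for n in range(1, 400))
var = sum(n**2 * pmf[n - 1] for n in range(1, 400)) - mean**2

print(mean, var)  # ≈ 4 and ≈ 12
```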
Definition 12 (Negative Binomial Distribution). X is the number of Bernoulli trials performed until r successes are obtained. Denote as $X \sim NB(r, p)$.
$$f_X(n) = \binom{n-1}{r-1} p^r (1-p)^{n-r}$$
$EX = \frac{r}{p}$, $Var(X) = \frac{r(1-p)}{p^2}$
Definition 13 (Hypergeometric Distribution). X is the number of white balls drawn when n balls are drawn without replacement from an urn with N balls, out of which m are white. Denote $X \sim H(n, N, m)$.
$$f_X(x) = \frac{\binom{m}{x}\binom{N-m}{n-x}}{\binom{N}{n}}$$
$EX = \frac{nm}{N}$, $Var(X) = \frac{N-n}{N-1} \, n \, \frac{m}{N}\left(1 - \frac{m}{N}\right)$
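The pmf and mean formula can be verified with binomial coefficients; the urn parameters below are hypothetical:

```python
from math import comb

# Hypothetical urn: N = 20 balls, m = 7 white, draw n = 5 without replacement.
N, m, n = 20, 7, 5
pmf = {x: comb(m, x) * comb(N - m, n - x) / comb(N, n)
       for x in range(0, min(m, n) + 1)}

total = sum(pmf.values())                  # should be 1
mean = sum(x * p for x, p in pmf.items())  # should be nm/N = 1.75

print(total, mean)
```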
Definition 14 (Exponential Distribution). A good model for waiting-time scenarios; $X \sim Exp(\lambda)$ if
$$f_X(x) = \begin{cases} \lambda e^{-\lambda x}, & x > 0 \\ 0 & \text{elsewhere} \end{cases}$$
$EX = \frac{1}{\lambda}$, $Var(X) = \frac{1}{\lambda^2}$
The exponential and geometric distributions are memoryless, i.e. $P\{X > s + t \mid X > t\} = P\{X > s\}$
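Memorylessness follows from the survival function $P\{X > x\} = e^{-\lambda x}$, and can be checked numerically with hypothetical $\lambda$, $s$, $t$:

```python
from math import exp

# Memorylessness of Exp(λ): hypothetical λ = 2, s = 0.5, t = 1.3.
lam, s, t = 2.0, 0.5, 1.3

def surv(x):
    # Survival function of the exponential: P(X > x) = e^(-λx)
    return exp(-lam * x)

cond = surv(s + t) / surv(t)  # P(X > s+t | X > t) = P(X > s+t) / P(X > t)
print(cond, surv(s))          # both equal e^(-λs)
```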
Definition 15 (Gamma Distribution). $X \sim \Gamma(\alpha, \lambda)$ if
$$f_X(x) = \begin{cases} \frac{\lambda^\alpha}{\Gamma(\alpha)} x^{\alpha-1} e^{-\lambda x}, & x > 0 \\ 0 & \text{elsewhere} \end{cases}$$
The Gamma function is defined by
$$\Gamma(\alpha) = \int_0^{\infty} x^{\alpha-1} e^{-x}\,dx$$
$EX = \frac{\alpha}{\lambda}$, $Var(X) = \frac{\alpha}{\lambda^2}$
Note that
$$\int_0^{\infty} x^{\alpha-1} e^{-\lambda x}\,dx = \frac{\Gamma(\alpha)}{\lambda^\alpha}$$
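The closing integral identity can be checked numerically against `math.gamma`; a sketch using a midpoint rule on a truncated range, with hypothetical $\alpha$ and $\lambda$:

```python
from math import gamma, exp

# Check ∫_0^∞ x^(α-1) e^(-λx) dx = Γ(α)/λ^α, hypothetical α = 2.5, λ = 1.5.
a, lam = 2.5, 1.5
dx = 1e-4
steps = 500_000  # integrate over [0, 50]; the tail beyond 50 is negligible

# Midpoint rule: evaluate the integrand at the center of each slice.
integral = sum(((k + 0.5) * dx) ** (a - 1) * exp(-lam * (k + 0.5) * dx) * dx
               for k in range(steps))

print(integral, gamma(a) / lam**a)  # the two values agree
```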
## Definition 16 (Normal Distribution). $X \sim N(\mu, \sigma^2)$ if
$$n(x; \mu, \sigma) = \frac{1}{\sigma\sqrt{2\pi}} e^{-\frac{(x-\mu)^2}{2\sigma^2}}$$
$EX = \mu$, $Var(X) = \sigma^2$
The standard normal r.v. is denoted by $Z \sim N(0, 1)$. For any $X \sim N(\mu, \sigma^2)$, we have
$$\frac{X - \mu}{\sigma} \sim N(0, 1)$$
Given $X \sim B(n, p)$ such that $np(1-p) \ge 10$, we have approximately $X \sim N(np, npq)$.
Remark 1 (Continuity Correction). If $X \sim B(n, p)$, then
$$\{X = k\} = \{k - 0.5 < X < k + 0.5\}$$
$$\{X \ge k\} = \{X \ge k - 0.5\}$$
$$\{X \le k\} = \{X \le k + 0.5\}$$
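Rewriting the event with the half-unit shift before applying the normal approximation noticeably improves the estimate. A sketch with hypothetical $n$, $p$, $k$, using `math.erf` for the standard normal CDF:

```python
from math import comb, erf, sqrt

# Normal approximation with continuity correction, hypothetical n=100, p=0.5, k=55:
# P(X <= k) ≈ Φ((k + 0.5 - np) / sqrt(np(1-p))).
n, p, k = 100, 0.5, 55
mu, sd = n * p, sqrt(n * p * (1 - p))

# Exact binomial tail probability P(X <= k).
exact = sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1))

def phi(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1 + erf(z / sqrt(2)))

approx = phi((k + 0.5 - mu) / sd)
print(round(exact, 4), round(approx, 4))  # the corrected approximation is close
```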