called outcomes.
Probability starts with logic. There is a set of N elements, and we can define a subset of n favorable elements, where n is less than or equal to N. Probability is defined as the ratio of favorable cases to total cases, or calculated as:
P = n/N
An urn contains w white balls and b black balls (w > 0 and b > 0). The balls are
thoroughly mixed and two are drawn, one after the other, without replacement.
Let Wi and Bi denote the respective outcomes 'white on the ith draw' and 'black
on the ith draw,' for i = 1, 2.
P(W2) = P(W1) = w/(w + b). (Which clearly implies a similar identity for B2 and B1.)
Furthermore, P(Wi) = w/(w + b) for any i not exceeding the total number of balls w + b.
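The identity P(W2) = w/(w + b) follows by conditioning on the first draw and summing over the two cases. A minimal sketch in Python, using exact fractions (the urn sizes chosen below are arbitrary):

```python
from fractions import Fraction

def p_white_second(w, b):
    """P(W2): probability of white on the second draw, without replacement."""
    total = w + b
    p_w1 = Fraction(w, total)   # white on the first draw
    p_b1 = Fraction(b, total)   # black on the first draw
    # Condition on the first draw and add the two cases.
    return p_w1 * Fraction(w - 1, total - 1) + p_b1 * Fraction(w, total - 1)

# For any urn, the result equals w/(w + b), the same as P(W1).
print(p_white_second(3, 5))   # 3/8
```

Running the function for several urn sizes confirms that the answer never depends on anything but w and b.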
Combining Events
Let S = {s1, s2, ..., sn} be a sample space and let P(si) be the estimated probability of the event {si}. Then
(a) 0 ≤ P(si) ≤ 1
(b) P(s1) + P(s2) + ... + P(sn) = 1
(c) If E = {e1, e2, ..., er}, then P(E) = P(e1) + P(e2) + ... + P(er).
In words: every outcome has a probability between 0 and 1, the probabilities of all the outcomes add up to 1, and the probability of an event is the sum of the probabilities of the outcomes it contains.
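Properties (a)-(c) are easy to verify on a concrete sample space. A small sketch, assuming a fair die (the uniform probabilities are an assumption of this example, not part of the text):

```python
from fractions import Fraction

# Sample space of a fair die: each outcome has probability 1/6.
P = {s: Fraction(1, 6) for s in range(1, 7)}

assert all(0 <= P[s] <= 1 for s in P)   # property (a)
assert sum(P.values()) == 1             # property (b)

E = {2, 4, 6}                           # the event "even face"
p_E = sum(P[s] for s in E)              # property (c)
print(p_E)                              # 1/2
```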
Empirical Probability
Notes
1. We write P(E) for both estimated and empirical probability. Which one we are
referring to should always be clear from the context.
2. Empirical probability satisfies the same properties (shown above) as estimated probability:
(a) 0 ≤ P(si) ≤ 1
(b) P(s1) + P(s2) + ... + P(sn) = 1.
Abstract Probability
Notes
This also holds for more than two events: if E1, E2, ..., En are mutually exclusive events (that is, the intersection of any pair of them is empty) and E is the union of E1, E2, ..., En, then
P(E) = P(E1) + P(E2) + ... + P(En).
The following are true for any sample space S and any event E:
P(S) = 1, P(∅) = 0, and 0 ≤ P(E) ≤ 1.
Conditional Probability
If E and F are two events, then the conditional probability, P(E|F), is the
probability that E occurs, given that F occurs, and is defined by
P(E|F) = P(E ∩ F) / P(F).
Multiplying both sides by P(F) gives the multiplication principle:
P(E ∩ F) = P(F)P(E|F).
In terms of empirical frequencies fr:
P(E|F) = fr(E ∩ F) / fr(F).
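The frequency form applies directly to counted data. A sketch with made-up counts (the numbers below are purely illustrative):

```python
from fractions import Fraction

# Hypothetical frequency counts from a set of observations.
fr_F = 40          # number of times F occurred
fr_E_and_F = 10    # number of times E and F occurred together

# P(E|F) = fr(E ∩ F) / fr(F)
p_E_given_F = Fraction(fr_E_and_F, fr_F)
print(p_E_given_F)   # 1/4
```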
Independent Events
Two events E and F are independent if
P(E|F) = P(E),
or, equivalently,
P(E ∩ F) = P(E)P(F).
If two events E and F are not independent, then they are dependent.
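The product test P(E ∩ F) = P(E)P(F) can be checked by direct enumeration. A sketch for two fair dice, with E = "first die is even" and F = "the sum is 7" (events chosen here for illustration only):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely ordered outcomes of two fair dice.
S = list(product(range(1, 7), repeat=2))

E = {(a, b) for a, b in S if a % 2 == 0}   # first die even
F = {(a, b) for a, b in S if a + b == 7}   # sum is 7

def p(A):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(A), len(S))

print(p(E & F) == p(E) * p(F))   # True: E and F are independent
```

Here P(E) = 1/2, P(F) = 1/6, and P(E ∩ F) = 1/12, so the product test passes.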
Bayes' Theorem
The short form of Bayes' Theorem states that if E and F are events, then
P(F|E) = P(E|F)P(F) / [P(E|F)P(F) + P(E|F')P(F')].
An expanded form of Bayes' Theorem states that if E is an event, and if F1, F2,
and F3 are a partition of the sample space S, then
P(F1|E) = P(E|F1)P(F1) / [P(E|F1)P(F1) + P(E|F2)P(F2) + P(E|F3)P(F3)].
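A worked instance of the short form, with hypothetical inputs (the numbers below are not from the text): suppose P(F) = 0.01, P(E|F) = 0.95, and P(E|F') = 0.05.

```python
from fractions import Fraction

# Hypothetical inputs: a rare condition F and an observed event E.
p_F = Fraction(1, 100)            # P(F)
p_E_given_F = Fraction(95, 100)   # P(E|F)
p_E_given_Fc = Fraction(5, 100)   # P(E|F'), F' being the complement of F

p_Fc = 1 - p_F
# Short form of Bayes' Theorem.
p_F_given_E = (p_E_given_F * p_F) / (p_E_given_F * p_F + p_E_given_Fc * p_Fc)
print(p_F_given_E)   # 19/118, roughly 0.161
```

Even with these favorable conditional probabilities, P(F|E) stays small because F itself is rare: the denominator is dominated by the P(E|F')P(F') term.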