
Bayes’ Theorem

 Often we begin probability analysis with initial or
prior probabilities.
 Then, from a sample, special report, or a product
test, we obtain some additional information.
 Given this information, we calculate revised or
posterior probabilities.
 Bayes’ theorem provides the means for revising the
prior probabilities.

Application of Bayes’ Theorem:

Prior Probabilities + New Information → Posterior Probabilities
Bayes’ Theorem
 To find the posterior probability that event Ai will
occur given that event B has occurred, we apply
Bayes’ theorem.

P(Ai|B) = P(Ai) P(B|Ai) / [P(A1) P(B|A1) + P(A2) P(B|A2) + ... + P(An) P(B|An)]

 Bayes’ theorem is applicable when the events for
which we want to compute posterior probabilities
are mutually exclusive and their union is the entire
sample space.
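The formula above is easy to verify numerically. A minimal Python sketch, assuming nothing beyond the theorem itself (the function name `posterior` and its list-based interface are illustrative, not from the slides):

```python
def posterior(priors, likelihoods):
    """Return the posterior P(Ai|B) for each event Ai in a partition,
    given priors P(Ai) and likelihoods P(B|Ai)."""
    joints = [p * l for p, l in zip(priors, likelihoods)]  # P(Ai) P(B|Ai)
    p_b = sum(joints)  # P(B), by the law of total probability
    return [j / p_b for j in joints]
```

For example, `posterior([0.7, 0.3], [0.2, 0.9])` returns two probabilities that sum to 1, as any posterior distribution over a partition must.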
Example: An insurance company insured 2000
scooter drivers, 4000 car drivers and 6000 truck
drivers. The probabilities of an accident involving
a scooter driver, a car driver and a truck driver are
0.01, 0.03 and 0.15 respectively. One of the
insured persons meets with an accident. What is
the probability that he is a scooter driver?
2000 scooter drivers => P(S) = 2000/12000
4000 car drivers => P(C) = 4000/12000
6000 truck drivers => P(T) = 6000/12000
 The probabilities of an accident involving a scooter
driver, a car driver, and a truck driver are 0.01, 0.03 and
0.15:
P(A|S) = 0.01, P(A|C) = 0.03 and P(A|T) = 0.15
One of the insured persons meets with an accident.
What is the probability that he is a scooter driver?
P(S|A) = ?
P(S|A) = P(S) P(A|S) / [P(S) P(A|S) + P(C) P(A|C) + P(T) P(A|T)]

       = (2/12)(1/100) / [(2/12)(1/100) + (4/12)(3/100) + (6/12)(15/100)]

       = 2/104 = 1/52
What is the probability that he is a car driver?
P(C|A) = ?
What is the probability that he is a truck driver?
P(T|A) = ?
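All three posteriors can be checked with exact arithmetic. A sketch using Python's `fractions` module (the dictionary keys are my own labels):

```python
from fractions import Fraction as F

priors = {"scooter": F(2, 12), "car": F(4, 12), "truck": F(6, 12)}  # P(S), P(C), P(T)
accident = {"scooter": F(1, 100), "car": F(3, 100), "truck": F(15, 100)}  # P(A|driver)

joints = {d: priors[d] * accident[d] for d in priors}  # P(driver) P(A|driver)
p_a = sum(joints.values())  # P(A) = 104/1200
post = {d: joints[d] / p_a for d in joints}  # P(driver|A)
print(post["scooter"])  # 1/52, i.e. 2/104
```

The same dictionary gives P(C|A) = 3/26 and P(T|A) = 45/52.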
Ex. 2
A factory has three machines, A, B and C,
producing 1000, 2000 and 3000 bolts per day
respectively.
It is found that A produces 1% defectives, B produces
2% defectives and C produces 3% defectives.
If a bolt is chosen at the end of the day and found
defective, what is the probability that it came from
machine A?
P(A)=1/6, P(B)= 2/6, and P(C)= 3/6
It is found that A produces 1% defective
=>P(D|A)=1/100
B produces 2 % defective
=>P(D|B)=2/100
and C produces 3% defectives
=>P(D|C)=3/100
If a bolt is chosen at the end of the day and
found defective, what is the probability that it
came from machine A?
=> P(A|D) = ?
P(A|D) = P(A) P(D|A) / [P(A) P(D|A) + P(B) P(D|B) + P(C) P(D|C)]

       = (1/6)(1/100) / [(1/6)(1/100) + (2/6)(2/100) + (3/6)(3/100)]

       = 1/14
P(B|D) = ?
P(C|D) = ?
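The two remaining posteriors come from the same denominator. A sketch in Python (the names are illustrative, not from the slides):

```python
from fractions import Fraction as F

priors = {"A": F(1, 6), "B": F(2, 6), "C": F(3, 6)}  # machine shares of output
defect_rates = {"A": F(1, 100), "B": F(2, 100), "C": F(3, 100)}  # P(D|machine)

joints = {m: priors[m] * defect_rates[m] for m in priors}  # P(machine) P(D|machine)
p_d = sum(joints.values())  # P(D) = 14/600
posteriors = {m: joints[m] / p_d for m in joints}  # P(machine|D)
print(posteriors["A"])  # 1/14
```

The same computation yields P(B|D) = 4/14 = 2/7 and P(C|D) = 9/14.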
• Sample Problem
• Bayes' theorem is best understood through an
example. This section presents an example that
demonstrates how Bayes' theorem can be applied
effectively to solve statistical problems.
• Example 1
x is getting married tomorrow, at an outdoor ceremony in
the desert. In recent years, it has rained only 5 days each
year. Unfortunately, the weatherman has predicted rain for
tomorrow. When it actually rains, the weatherman
correctly forecasts rain 90% of the time. When it doesn't
rain, he incorrectly forecasts rain 10% of the time. What is
the probability that it will rain on the day of x's wedding?
• Solution: The sample space is defined by two mutually
exclusive events: it rains or it does not rain. Additionally, a
third event occurs when the weatherman predicts rain.
Notation for these events appears below.
• Event A1. It rains on x's wedding.
• Event A2. It does not rain on x's wedding.
• Event B. The weatherman predicts rain.
• In terms of probabilities, we know the following:
• P( A1 ) = 5/365 = 0.0136985 [It rains 5 days out of
the year.]
• P( A2 ) = 360/365 = 0.9863014 [It does not rain
360 days out of the year.]
• P( B | A1 ) = 0.9 [When it rains, the weatherman
predicts rain 90% of the time.]
• P( B | A2 ) = 0.1 [When it does not rain, the
weatherman predicts rain 10% of the time.]
We want to know P( A1 | B ), the probability it will rain on
the day of x's wedding, given a forecast for rain by the
weatherman. The answer can be determined from
Bayes' theorem, as shown below.
P( A1 | B ) = P( A1 ) P( B | A1 ) / [ P( A1 ) P( B | A1 ) + P( A2 ) P( B | A2 ) ]
P( A1 | B ) =(0.014)(0.9) / [ (0.014)(0.9) + (0.986)(0.1) ]
P( A1 | B ) =0.111
Note the somewhat unintuitive result. Even when the
weatherman predicts rain, it rains only about 11%
of the time. Despite the weatherman's gloomy
prediction, there is a good chance that x will not get
rained on at her wedding.
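The wedding calculation can be reproduced in a few lines; the variable names are mine:

```python
p_rain = 5 / 365       # P(A1): it rains
p_dry = 360 / 365      # P(A2): it does not rain
p_fc_rain = 0.9        # P(B|A1): forecast rain when it rains
p_fc_dry = 0.1         # P(B|A2): forecast rain when it is dry

p_forecast = p_rain * p_fc_rain + p_dry * p_fc_dry  # P(B)
print(p_rain * p_fc_rain / p_forecast)  # ≈ 0.111
```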
Bayes’ Theorem

 Example: L. S. Clothiers
A proposed shopping center will provide strong
competition for businesses like L. S. Clothiers. If
the shopping center is built, the owner of L. S.
Clothiers feels it would be best to relocate to
the shopping center.
The shopping center cannot be built unless a
zoning change is approved by the town council.
The planning board must first make a
recommendation, for or against the zoning change,
to the council.
Prior Probabilities

 Example: L. S. Clothiers
Let:
A1 = town council approves the zoning change
A2 = town council disapproves the change
 Using subjective judgment:

P(A1) = .7, P(A2) = .3


New Information
 Example: L. S. Clothiers
The planning board has recommended against
the zoning change. Let B denote the event of a
negative recommendation by the planning board.

Given that B has occurred, should L. S. Clothiers
revise the probabilities that the town council will
approve or disapprove the zoning change?
Conditional Probabilities

 Example: L. S. Clothiers
Past history with the planning board and the town
council indicates the following:

P(B|A1) = .2 P(B|A2) = .9

Hence: P(Bᶜ|A1) = .8 P(Bᶜ|A2) = .1


Tree Diagram

 Example: L. S. Clothiers

Town Council        Planning Board        Experimental Outcomes

P(A1) = .7          P(B|A1) = .2          P(A1 ∩ B) = .14
                    P(Bᶜ|A1) = .8         P(A1 ∩ Bᶜ) = .56

P(A2) = .3          P(B|A2) = .9          P(A2 ∩ B) = .27
                    P(Bᶜ|A2) = .1         P(A2 ∩ Bᶜ) = .03
Posterior Probabilities
 Example: L. S. Clothiers
Given the planning board’s recommendation not
to approve the zoning change, we revise the prior
probabilities as follows:
P(A1|B) = P(A1) P(B|A1) / [P(A1) P(B|A1) + P(A2) P(B|A2)]

        = (.7)(.2) / [(.7)(.2) + (.3)(.9)]

        = .14/.41 = .34
Posterior Probabilities

 Example: L. S. Clothiers
The planning board’s recommendation is good
news for L. S. Clothiers. The posterior probability of
the town council approving the zoning change is .34
compared to a prior probability of .70.
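The .70 → .34 revision can be checked directly (variable names are mine, not the slides'):

```python
p_a1, p_a2 = 0.7, 0.3        # priors: council approves / disapproves
p_b_a1, p_b_a2 = 0.2, 0.9    # P(B|Ai): negative board recommendation

p_b = p_a1 * p_b_a1 + p_a2 * p_b_a2  # P(B) = .41
p_a1_b = p_a1 * p_b_a1 / p_b         # posterior P(A1|B)
print(round(p_a1_b, 4))  # 0.3415
```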
Bayes’ Theorem: Tabular Approach
 Example: L. S. Clothiers
• Step 1
Prepare the following three columns:
Column 1 - The mutually exclusive events for
which posterior probabilities are desired.
Column 2 - The prior probabilities for the events.
Column 3 - The conditional probabilities of the
new information given each event.
Bayes’ Theorem: Tabular Approach
 Example: L. S. Clothiers
• Step 1
(1)       (2)             (3)             (4)   (5)
          Prior           Conditional
Events    Probabilities   Probabilities
Ai        P(Ai)           P(B|Ai)

A1        .7              .2
A2        .3              .9
          1.0
Bayes’ Theorem: Tabular Approach
 Example: L. S. Clothiers
• Step 2
Prepare the fourth column:
Column 4
Compute the joint probabilities for each event and
the new information B by using the multiplication
law.
Multiply the prior probabilities in column 2 by
the corresponding conditional probabilities in
column 3. That is, P(Ai ∩ B) = P(Ai) P(B|Ai).
Bayes’ Theorem: Tabular Approach

 Example: L. S. Clothiers
• Step 2 (continued)
We see that there is a .14 probability of the town
council approving the zoning change and a
negative recommendation by the planning board.
There is a .27 probability of the town council
disapproving the zoning change and a negative
recommendation by the planning board.
Bayes’ Theorem: Tabular Approach
 Example: L. S. Clothiers
– Step 3
Sum the joint probabilities in Column 4. The
sum is the probability of the new information,
P(B). The sum .14 + .27 shows an overall
probability of .41 of a negative recommendation
by the planning board.
Bayes’ Theorem: Tabular Approach
 Example: L. S. Clothiers
• Step 4
Prepare the fifth column:
Column 5
Compute the posterior probabilities using the
basic relationship of conditional probability.
P(Ai|B) = P(Ai ∩ B) / P(B)

The joint probabilities P(Ai ∩ B) are in column 4
and the probability P(B) is the sum of column 4.
Bayes’ Theorem: Tabular Approach
 Example: L. S. Clothiers
• Step 4
(1)       (2)             (3)             (4)             (5)
          Prior           Conditional     Joint           Posterior
Events    Probabilities   Probabilities   Probabilities   Probabilities
Ai        P(Ai)           P(B|Ai)         P(Ai ∩ B)       P(Ai|B)

A1        .7              .2              .14             .3415
A2        .3              .9              .27             .6585
          1.0                             P(B) = .41      1.0000

                                          (.3415 = .14/.41)
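The five-column procedure translates directly into code. A sketch (the function name `bayes_table` is mine):

```python
def bayes_table(events, priors, conds):
    """Tabular approach: return the rows (event, prior, conditional,
    joint, posterior) together with P(B), the sum of column 4."""
    joints = [p * c for p, c in zip(priors, conds)]  # column 4: P(Ai ∩ B)
    p_b = sum(joints)                                # P(B)
    rows = [(e, p, c, j, j / p_b)                    # column 5: P(Ai|B)
            for e, p, c, j in zip(events, priors, conds, joints)]
    return rows, p_b
```

Calling `bayes_table(["A1", "A2"], [.7, .3], [.2, .9])` reproduces the L. S. Clothiers numbers, with P(B) = .41 and posteriors .3415 and .6585.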
