
Graphic demonstration of a Bayesian model solution of the Monty Hall problem



Assumptions

You chose door A.
Monty knows where the prize is.
Monty never opens the door that has the prize.
Monty never opens the door you chose (A in this case).
If the prize is behind door A, Monty doesn't have a preference for door B or door C.
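As an illustrative sketch (not part of the original slides), these rules can be checked numerically with a short simulation; the function name here is hypothetical:

```python
import random

def monty_hall_trial(switch):
    """One round of the game under the assumptions above."""
    doors = ["A", "B", "C"]
    prize = random.choice(doors)
    choice = "A"  # you always pick door A
    # Monty opens a door that is neither your choice nor the prize;
    # if both B and C qualify, he picks one at random (no preference).
    openable = [d for d in doors if d != choice and d != prize]
    opened = random.choice(openable)
    if switch:
        choice = next(d for d in doors if d != choice and d != opened)
    return choice == prize

random.seed(0)
trials = 100_000
stay_wins = sum(monty_hall_trial(switch=False) for _ in range(trials)) / trials
switch_wins = sum(monty_hall_trial(switch=True) for _ in range(trials)) / trials
print(f"stay: {stay_wins:.3f}, switch: {switch_wins:.3f}")
# stay ≈ 1/3, switch ≈ 2/3
```

The simulated frequencies agree with the Bayesian derivation that follows.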

Before Monty opens any door, this is what you know:

[Slide graphic: doors A, B, and C]

Prior probabilities

The probability that the prize is behind A is 1/3: P(A) = 1/3
The probability that the prize is behind B is 1/3: P(B) = 1/3
The probability that the prize is behind C is 1/3: P(C) = 1/3

What is the probability that Monty opens door B?

[Slide graphic: doors A, B, and C]

You know that when the prize is behind B, the probability of Monty opening it is 0:
P(Monty opens B | Prize is behind B) = 0
P(Monty opens B | Prize is behind B) · P(Prize is behind B) = 0 · 1/3 = 0

Because Monty has no preference for door B or door C, when the prize is behind A, the probability that he opens B is 50%:
P(Monty opens B | Prize is behind A) = 1/2
P(Monty opens B | Prize is behind A) · P(Prize is behind A) = 1/2 · 1/3 = 1/6
These are the 50% of cases when the prize is behind A and Monty opens door B.

When the prize is behind C, Monty will open B for sure, because he never opens the door that has the prize and he never opens the door you've chosen (A):
P(Monty opens B | Prize is behind C) = 1
P(Monty opens B | Prize is behind C) · P(Prize is behind C) = 1 · 1/3 = 1/3

Summing over the three cases:
P(Monty opens B) = 0 + 1/6 + 1/3 = 1/2
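This sum over cases (the law of total probability) can be reproduced exactly with, for example, Python's fractions module; this is a sketch, not part of the original slides:

```python
from fractions import Fraction

# Priors: the prize is equally likely to be behind each door.
prior = {d: Fraction(1, 3) for d in "ABC"}

# Likelihoods P(Monty opens B | prize is behind d), per the rules above.
opens_B = {"A": Fraction(1, 2), "B": Fraction(0), "C": Fraction(1)}

# Law of total probability: sum the joint probabilities over the three doors.
p_opens_B = sum(opens_B[d] * prior[d] for d in "ABC")
print(p_opens_B)  # 1/2
```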

After Monty opens door B, this is what you know:

[Slide graphic: doors A, B, and C]

Posterior probabilities

After you see that Monty opens door B, you know that you are in this subset of cases:

P(Prize is behind A | Monty opens B) = P(Prize is behind A & Monty opens B) / P(Monty opens B) = (1/6) / (1/2) = 1/3
P(Prize is behind C | Monty opens B) = P(Prize is behind C & Monty opens B) / P(Monty opens B) = (1/3) / (1/2) = 2/3
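The same application of Bayes' rule can be written out in code; a minimal sketch using exact fractions (the variable names are illustrative):

```python
from fractions import Fraction

prior = {d: Fraction(1, 3) for d in "ABC"}  # P(prize is behind d)
likelihood = {"A": Fraction(1, 2), "B": Fraction(0), "C": Fraction(1)}  # P(opens B | d)

# Bayes' rule: posterior = likelihood * prior / evidence
evidence = sum(likelihood[d] * prior[d] for d in "ABC")  # P(Monty opens B) = 1/2
posterior = {d: likelihood[d] * prior[d] / evidence for d in "ABC"}
print(posterior["A"], posterior["C"])  # 1/3 2/3
```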
Therefore, here you should switch. However, the solution depends on the prior probabilities and will not always be the same.
