
Introduction to probability

Session 4
Uncertainties

Managers often base their decisions on an analysis of uncertainties such as the following:

• What are the chances that sales will decrease if we increase prices?
• What is the likelihood a new assembly method will increase productivity?
• What are the odds that a new investment will be profitable?
Probability

Probability is a numerical measure of the likelihood that an event will occur.

Probability values are always assigned on a scale from 0 to 1.

A probability near zero indicates an event is quite unlikely to occur.

A probability near one indicates an event is almost certain to occur.
Probability as a Numerical Measure
of the Likelihood of Occurrence
Increasing Likelihood of Occurrence

Probability:   0 ------------------- .5 ------------------- 1

               The event             The occurrence          The event
               is very               of the event is         is almost
               unlikely              just as likely as       certain
               to occur.             it is unlikely.         to occur.
An Experiment and Its Sample Space
An experiment is any process that generates well-defined outcomes.

The sample space for an experiment is the set of all experimental outcomes.

An experimental outcome is also called a sample point.
An Experiment and Its Sample Space

Experiment              Experiment Outcomes
Toss a coin             Head, tail
Inspect a part          Defective, non-defective
Conduct a sales call    Purchase, no purchase
Roll a die              1, 2, 3, 4, 5, 6
Play a football game    Win, lose, tie
Assigning Probabilities

• Basic Requirements for Assigning Probabilities

1. The probability assigned to each experimental outcome must be between 0 and 1, inclusive.

   0 ≤ P(Ei) ≤ 1 for all i

   where:
   Ei is the ith experimental outcome
   and P(Ei) is its probability
Assigning Probabilities

• Basic Requirements for Assigning Probabilities

2. The sum of the probabilities for all experimental outcomes must equal 1.

   P(E1) + P(E2) + . . . + P(En) = 1

   where:
   n is the number of experimental outcomes
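The two requirements above translate directly into a quick validity check. A minimal Python sketch (not from the slides; the function name and tolerance are illustrative assumptions):

```python
# Sketch: check the two basic requirements for a probability assignment.
def is_valid_assignment(probs, tol=1e-9):
    """True if every P(Ei) is between 0 and 1 and the values sum to 1."""
    in_range = all(0.0 <= p <= 1.0 for p in probs)
    sums_to_one = abs(sum(probs) - 1.0) <= tol
    return in_range and sums_to_one

print(is_valid_assignment([1/6] * 6))    # True  (a fair die)
print(is_valid_assignment([0.5, 0.6]))   # False (the probabilities sum to 1.1)
```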
Assigning Probabilities
• Classical Method
  Assigning probabilities based on the assumption of equally likely outcomes

• Relative Frequency Method
  Assigning probabilities based on experimentation or historical data

• Subjective Method
  Assigning probabilities based on judgment
Classical Method
• Example: Rolling a Die

If an experiment has n possible outcomes, the classical method would assign a probability of 1/n to each outcome.

Experiment: Rolling a die
Sample Space: S = {1, 2, 3, 4, 5, 6}
Probabilities: Each sample point has a 1/6 chance of occurring
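A small sketch of the classical method for this example (the dictionary representation is an assumption, not part of the slides):

```python
# Sketch: classical method for rolling a die - each of the n outcomes gets 1/n.
sample_space = [1, 2, 3, 4, 5, 6]
classical = {outcome: 1 / len(sample_space) for outcome in sample_space}
print(classical[3])   # 0.1666... (1/6)
```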
Relative Frequency Method
• Example: Assembly Lines

Probabilities are assigned for the number of breakdowns of the corresponding assembly line over the last year. Office records show the following frequencies of breakdowns (40 in total):

Assembly Line    Number of Breakdowns
A                 4
B                 6
C                18
D                10
E                 2
Relative Frequency Method
• Example: Assembly Lines

Each probability assignment is given by dividing the frequency (number of breakdowns) by the total frequency (total number of breakdowns).

Assembly Line    Number of Breakdowns    Probability
A                 4                       .10    (e.g., 4/40)
B                 6                       .15
C                18                       .45
D                10                       .25
E                 2                       .05
Total            40                      1.00
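A short sketch of the relative frequency calculation above (the dictionary layout is an assumption; the counts are the ones in the table):

```python
# Sketch: relative frequency method using the breakdown counts above.
breakdowns = {"A": 4, "B": 6, "C": 18, "D": 10, "E": 2}
total = sum(breakdowns.values())                             # 40
rel_freq = {line: count / total for line, count in breakdowns.items()}
print(rel_freq["A"])                      # 0.1  (4/40)
print(round(sum(rel_freq.values()), 10))  # 1.0  (requirement 2 holds)
```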
Subjective Method
• When economic conditions and a company's circumstances change rapidly, it might be inappropriate to assign probabilities based solely on historical data.

• We can use any data available as well as our experience and intuition, but ultimately a probability value should express our degree of belief that the experimental outcome will occur.

• The best probability estimates often are obtained by combining the estimates from the classical or relative frequency approach with the subjective estimate.
Events and Their Probabilities

An event is a collection of sample points.

The probability of any event is equal to the sum of the probabilities of the sample points in the event.

If we can identify all the sample points of an experiment and assign a probability to each, we can compute the probability of an event.
Some Basic Relationships of
Probability
There are some basic probability relationships that can be used to compute the probability of an event without knowledge of all the sample point probabilities:

• Complement of an Event
• Union of Two Events
• Intersection of Two Events
• Mutually Exclusive Events

Basic Relationships of Probability: Mutually Exclusive and Collectively Exhaustive
[Venn diagrams: Complement of an Event (A and Ac), Union of Events (A ∪ B), Intersection of Events (A ∩ B), and Mutually Exclusive Events]
The addition law provides a way to compute the probability of event A, or B, or both A and B occurring.

The law is written as:

P(A ∪ B) = P(A) + P(B) - P(A ∩ B)

For mutually exclusive events, P(A ∩ B) = 0, so the law reduces to P(A ∪ B) = P(A) + P(B).
Different types of probabilities

Dependent Events
• Joint Probability: P(A,B) = P(B,A)
• Marginal Probability
• Conditional Probability: P(B|A) = P(B,A) / P(A)

Independent Events
• Joint Probability
• Marginal Probability
• Conditional Probability (tree diagram)
• Conditional = Marginal: P(B|A) = P(B)
Marginal & Joint Probabilities In A
Contingency Table

Event        B1               B2               Total
A1           P(A1 and B1)     P(A1 and B2)     P(A1)
A2           P(A2 and B1)     P(A2 and B2)     P(A2)
Total        P(B1)            P(B2)            1

The inner cells are joint probabilities; the row and column totals are marginal (simple) probabilities.
Example 1…
Why are some mutual fund managers more successful than others? One possible factor is the firm that manages the fund. The following table compares mutual fund performance for funds managed by two firms:

                      Mutual fund outperforms    Mutual fund doesn't
                      the market                 outperform the market
Goldman Sachs         .11                        .29
J.P. Morgan Chase     .06                        .54

E.g., .11 is the probability that a mutual fund outperforms the market AND is managed by Goldman Sachs; it is a joint probability.
Example 1…
Alternatively, we could introduce shorthand notation to represent the events:

A1 = Goldman Sachs
A2 = JPMC
B1 = Fund outperforms the market
B2 = Fund does not outperform the market

        B1      B2
A1      .11     .29
A2      .06     .54

E.g., P(A2 and B1) = .06
= the probability that a fund outperforms the market and is managed by JPMC
Marginal Probabilities…
Marginal probabilities are computed by adding across rows and down columns; that is, they are calculated in the margins of the table:

        B1      B2      P(Ai)
A1      .11     .29     .40
A2      .06     .54     .60
P(Bj)   .17     .83     1.00

P(A2) = .06 + .54 = .60   "What's the probability a fund is managed by JPMC?"
P(B1) = .11 + .06 = .17   "What's the probability a fund outperforms the market?"

Both margins must add to 1 (a useful error check).
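A short sketch that reproduces the marginal calculations above (the dictionary keyed by event pairs is an assumed representation, not part of the slides):

```python
# Sketch: marginal probabilities from the joint table
# (A1 = Goldman Sachs, A2 = JPMC; B1 = outperforms, B2 = does not).
joint = {("A1", "B1"): 0.11, ("A1", "B2"): 0.29,
         ("A2", "B1"): 0.06, ("A2", "B2"): 0.54}

p_A2 = joint[("A2", "B1")] + joint[("A2", "B2")]   # add across the A2 row
p_B1 = joint[("A1", "B1")] + joint[("A2", "B1")]   # add down the B1 column
print(round(p_A2, 2), round(p_B1, 2))              # 0.6 0.17
print(round(sum(joint.values()), 10))              # 1.0 (error check: margins total 1)
```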
Computing Joint and Marginal Probabilities

• The probability of a joint event, A and B:

  P(A and B) = (number of outcomes satisfying A and B) / (total number of elementary outcomes)

• Computing a marginal (or simple) probability:

  P(A) = P(A and B1) + P(A and B2) + ... + P(A and Bk)

  where B1, B2, ..., Bk are k mutually exclusive and collectively exhaustive events
Conditional Probability…
Conditional probability is used to determine how two events are related; that is, we can determine the probability of one event given the occurrence of another related event.

Conditional probabilities are written as P(A | B), read as "the probability of A given B", and are calculated as:

P(A | B) = P(A and B) / P(B)
Conditional Probability…
Again, the probability of an event given that another event has occurred is called a conditional probability.

Note how "A given B" and "B given A" are related: both use the same joint probability P(A and B) but divide by different marginals:

P(A | B) = P(A and B) / P(B)    and    P(B | A) = P(A and B) / P(A)
Conditional Probability…
Example 2: What's the probability that a fund will outperform the market given that the fund is managed by GS?

Recall:
A1 = Goldman Sachs
A2 = JPMC
B1 = Fund outperforms the market
B2 = Fund does not outperform the market

Thus, we want to know "what is P(B1 | A1)?"
Conditional Probability…
We want to calculate P(B1 | A1):

        B1      B2      P(Ai)
A1      .11     .29     .40
A2      .06     .54     .60
P(Bj)   .17     .83     1.00

P(B1 | A1) = P(A1 and B1) / P(A1) = .11 / .40 = .275

Thus, there is a 27.5% chance that a fund will outperform the market given that the fund is managed by GS.
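The same result as a tiny sketch (the values are taken from the table above):

```python
# Sketch: P(B1 | A1) = P(A1 and B1) / P(A1)
p_A1_and_B1 = 0.11
p_A1 = 0.40
print(round(p_A1_and_B1 / p_A1, 3))   # 0.275 -> a 27.5% chance
```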
Marginal Probability in terms of joint probability

• Marginal probability for event A:

  P(A) = P(A and B1) + P(A and B2) + ... + P(A and Bk)

  P(A) = P(A | B1)P(B1) + P(A | B2)P(B2) + ... + P(A | Bk)P(Bk)

• where B1, B2, ..., Bk are k mutually exclusive and collectively exhaustive events
Previous Example
        O/P     U/P     P(Ai)
GS      .11     .29     .40
JPMC    .06     .54     .60
P(Bj)   .17     .83     1.00

Tree representation: the first branch carries the prior probabilities P(GS) and P(JPMC); the second branch carries the conditional probabilities; multiplying along a path gives the joint probability. The probabilities associated with any set of branches from one "node" must add up to 1.00.

Joint probabilities:
P(GS and O/P)   = 0.11
P(GS and U/P)   = 0.29
P(JPMC and O/P) = 0.06
P(JPMC and U/P) = 0.54
Probability Trees…Dependent Events
The probabilities associated with any set of branches from one "node" must add up to 1.00, which is a handy way to check your work. (Here two students are selected without replacement from a grad class of 10 students: 3 females (F) and 7 males (M).)

First selection:    P(F) = 3/10,  P(M) = 7/10            (3/10 + 7/10 = 1)
Second selection:   after F:  P(F) = 2/9,  P(M) = 7/9    (2/9 + 7/9 = 1)
                    after M:  P(F) = 3/9,  P(M) = 6/9    (3/9 + 6/9 = 1)

Joint probabilities:
P(FF) = (3/10)(2/9)
P(FM) = (3/10)(7/9)
P(MF) = (7/10)(3/9)
P(MM) = (7/10)(6/9)
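A sketch of the dependent-events tree above (the variable names are assumptions; the fractions are the ones on the branches):

```python
# Sketch: joint probabilities from the tree, sampling WITHOUT replacement.
p_FF = (3/10) * (2/9)
p_FM = (3/10) * (7/9)
p_MF = (7/10) * (3/9)
p_MM = (7/10) * (6/9)
print(round(p_FF + p_FM + p_MF + p_MM, 10))   # 1.0 - the joint probabilities sum to 1
```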
Independence…
One of the objectives of calculating conditional probability is to determine whether two events are related.

In particular, we would like to know whether they are independent, that is, if the probability of one event is not affected by the occurrence of the other event.

Two events A and B are said to be independent if

P(A | B) = P(A)   or   P(B | A) = P(B)
Example for independent events
Suppose we have our grad class of 10 students again, but make the student sampling independent, that is, "with replacement": a student could be picked first and picked again in the second round. Our tree and joint probabilities now look like:

P(FF) = (3/10)(3/10)
P(FM) = (3/10)(7/10)
P(MF) = (7/10)(3/10)
P(MM) = (7/10)(7/10)
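A matching sketch for the independent (with replacement) case; note that the second pick ignores the first:

```python
# Sketch: joint probabilities WITH replacement - the second pick has the
# same probabilities as the first, so P(F | first pick is F) = P(F) = 3/10.
p_FF = (3/10) * (3/10)
p_MM = (7/10) * (7/10)
print(round(p_FF, 2), round(p_MM, 2))   # 0.09 0.49
```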
Probability Trees…
Note: there is no requirement that the branch splits be binary, that the tree goes only two levels deep, or that there be the same number of splits at each sub-node.
Bayes’ Theorem
• To find the posterior probability that event Ai will occur given that event B has occurred, we apply Bayes' theorem:

  P(Ai | B) = P(Ai)P(B | Ai) / [P(A1)P(B | A1) + P(A2)P(B | A2) + ... + P(An)P(B | An)]

• Bayes' theorem is applicable when the events for which we want to compute posterior probabilities are mutually exclusive and their union is the entire sample space.
Bayes’ Theorem Example

• A drilling company has estimated a 40% chance of striking oil for their new well.

• A detailed test has been scheduled for more information. Historically, 60% of successful wells have had detailed tests, and 20% of unsuccessful wells have had detailed tests.

• Given that this well has been scheduled for a detailed test, what is the probability that the well will be successful?
Bayes’ Theorem Example
(continued)

• Let S = successful well, U = unsuccessful well
• P(S) = 0.4, P(U) = 0.6 (prior probabilities)
• Define the detailed test event as D
• Conditional probabilities: P(D|S) = 0.6, P(D|U) = 0.2
• Goal is to find P(S|D)
Bayes’ Theorem Example
(continued)

Apply Bayes' theorem:

P(S | D) = P(D | S)P(S) / [P(D | S)P(S) + P(D | U)P(U)]
         = (0.6)(0.4) / [(0.6)(0.4) + (0.2)(0.6)]
         = 0.24 / (0.24 + 0.12)
         = 0.667

So the revised probability of success, given that this well has been scheduled for a detailed test, is 0.667.
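The same calculation as a short sketch (the numbers are from the example above; the variable names are assumptions):

```python
# Sketch of Bayes' theorem for the drilling example.
p_S, p_U = 0.4, 0.6                    # prior probabilities
p_D_given_S, p_D_given_U = 0.6, 0.2    # test rates for successful / unsuccessful wells
p_S_given_D = (p_D_given_S * p_S) / (p_D_given_S * p_S + p_D_given_U * p_U)
print(round(p_S_given_D, 3))           # 0.667 - the revised (posterior) probability
```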
Probability and Expectations
• E[X]; E[X^2]; Var(X) = E[X^2] - (E[X])^2
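A minimal sketch of these quantities, assuming a fair die as the random variable X (the example is an assumption, not from the slides):

```python
# Sketch: E[X], E[X^2] and Var(X) = E[X^2] - (E[X])^2 for a fair die.
values = [1, 2, 3, 4, 5, 6]
probs = [1/6] * 6
E_X  = sum(x * p for x, p in zip(values, probs))       # mean = 3.5
E_X2 = sum(x**2 * p for x, p in zip(values, probs))    # second moment = 91/6
Var_X = E_X2 - E_X**2                                  # variance = 35/12
print(E_X, E_X2, Var_X)
```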
