
Probability

the chance that something will happen


expressed as a fraction (1/4, 1/2, 3/4) or as a decimal (0.25, 0.5,
0.75) between 0 and 1
a probability of 0 means that something can never
happen
a probability of 1 means that something will always
happen

Events and Experiments

Event
one or more possible outcomes of doing something
in a coin toss, getting a tail would be an event; getting
a head would be another event

Experiment
the activity that produces an event

Example
In a coin-toss experiment, what is the probability of the event heads?

Ans. 1/2 or 0.5

Sample Space

Sample Space
the set of all possible outcomes of an experiment
in a coin-toss experiment, the sample space is: S =
{heads, tails}, assuming the coin does not land on its
edge
in drawing cards from a deck, the sample space would
have 52 members, one for each card in a deck (ace of
hearts, king of hearts, and so on)

Mutually exclusive and collectively exhaustive events

events are mutually exclusive if one and only one of them
can take place at a time
example: in a coin-toss experiment, there are two possible
outcomes: heads and tails. On any single toss, either a head or a
tail may turn up, but not both. Accordingly, the events heads
and tails on a single toss are said to be mutually exclusive
if two or more events occur at one time, the events are not
mutually exclusive; i.e., if two events can possibly occur
together
events are collectively exhaustive if, taken together, they
include every possible outcome of an experiment

The classical approach

The relative frequency approach

The subjective approach

Classical Probability

Classical Probability

defines the probability that an event will occur as

P(event) = Number of outcomes favorable to the occurrence of the event / Total number of possible outcomes

in order for the equation to be valid, each of the
outcomes must be equally likely (i.e., have an equal
chance of occurrence)
Classical Probability

Example 1
In an experiment of tossing a coin, what is the probability of
getting heads on one toss?

Answer





P(heads) = 1 / (1 + 1) = 1/2

where the numerator (1) is the number of outcomes of one toss
favorable to the occurrence of the event (in this case, the one
that produces heads), and the denominator (1 + 1) is the total
number of possible outcomes of one toss (heads and tails)
Classical Probability

Example 2
In an experiment of rolling a die, what is the probability of
rolling a five on one die?

Answer



P(5) = 1 / (1 + 1 + 1 + 1 + 1 + 1) = 1/6

where the numerator is the number of outcomes of one roll of
the die which will produce a 5, and the denominator is the
total number of possible outcomes of one roll of the die
(getting a 1, a 2, a 3, a 4, a 5, or a 6)
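Both classical-probability answers above can be checked with a short Python sketch (the function name and the use of exact fractions are illustrative choices, not part of the original material):

```python
from fractions import Fraction

def classical_probability(favorable, total):
    # Classical probability: favorable outcomes over total equally likely outcomes
    return Fraction(favorable, total)

# Coin toss: 1 favorable outcome (heads) out of 2 equally likely outcomes
print(classical_probability(1, 2))  # 1/2
# Die roll: 1 favorable outcome (a five) out of 6 equally likely outcomes
print(classical_probability(1, 6))  # 1/6
```

Using Fraction rather than floating point keeps the results exact, matching the 1/2 and 1/6 given above.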
Classical Probability

Classical Probability is often called a priori probability because
if we keep using these orderly examples of fair coins, unbiased
dice, or decks of cards, we can state the answer in advance (a
priori) without ever tossing a coin, rolling a die, or drawing a
card, i.e., we can make probability statements based on logical
reasoning before any experiment takes place


The Relative Frequency Approach

this method of defining probability uses the relative
frequencies of past occurrences as probabilities; i.e., to
predict the probability that something will happen in
the future, we determine how often it happened in the
past
this defines probability as either:
the proportion of times that an event occurs in the long run
when conditions are stable
the observed relative frequency of an event in a very large
number of trials


The Relative Frequency Approach

Example
Suppose the college admission office knows from past data that about 50 of its
1,000 entering freshmen usually leave school for academic reasons by the end of
the first semester. What is the probability that a freshman leaves school for
academic reasons by the end of the first semester?

Answer
Using this method, the school would estimate the probability of a freshman
leaving school for academic reasons by the end of the first semester as



P(x) = 50 / 1000 = 0.05

where x = the event that a freshman leaves school for academic reasons
at the end of the first semester
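The relative frequency estimate above amounts to a single division, which a minimal Python sketch makes explicit (the function name is an assumption for illustration):

```python
def relative_frequency(occurrences, trials):
    # Estimate a probability as the observed relative frequency of an event
    return occurrences / trials

# 50 of 1,000 entering freshmen leave for academic reasons
p_leave = relative_frequency(50, 1000)
print(p_leave)  # 0.05
```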
The subjective approach

Subjective Probability

based on the personal belief or feelings of the person
who makes the probability estimates
probability assigned to an event on the basis of whatever
evidence is available.


Marginal Probability

Joint Probability

Conditional Probability
Some Commonly Used Symbols

Marginal Probability
the probability of a single event A is expressed symbolically as





a single probability means that only one event can take place;
it is called a marginal or unconditional probability

P(A) = probability of event A happening
Some Commonly Used Symbols

Marginal Probability

Example
Suppose 10 computer programmers have an equal chance of being
promoted to programming supervisor
the possible events are mutually exclusive, i.e., each
programmer's chance is 1 in 10 and only 1 programmer can be
promoted to supervisor at a time

Answer

P(promotion) = 1/10 = 0.1
Some Commonly Used Symbols

The Addition Rule for Mutually Exclusive Events
If two events are mutually exclusive, the probability can be
symbolically expressed (using the addition rule for mutually
exclusive events) as:

P(A or B) = probability of event A or B happening

And is calculated as:

P(A or B) = P(A) + P(B)

The Addition Rule for Mutually Exclusive Events

Example
Consider the following data for 50 welders in a fabrication shop:

Years of Experience | Number | Probability
0 - 2               |      5 |  5/50 = 0.1
3 - 5               |     10 | 10/50 = 0.2
6 - 8               |     15 | 15/50 = 0.3
More than 8         |     20 | 20/50 = 0.4
TOTAL               |     50 |         1.0

What is the probability that a welder selected at random will have 6
or more years of experience?

P(6 or more) = P(6 to 8) + P(more than 8)
             = 0.3 + 0.4 = 0.7
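The welder calculation can be sketched in Python; because the experience categories are mutually exclusive, the two probabilities simply add (the variable names are illustrative):

```python
from fractions import Fraction

# Mutually exclusive experience categories: each welder falls in exactly one
p_6_to_8 = Fraction(15, 50)        # 15 of 50 welders
p_more_than_8 = Fraction(20, 50)   # 20 of 50 welders

# Addition rule for mutually exclusive events: P(A or B) = P(A) + P(B)
p_six_or_more = p_6_to_8 + p_more_than_8
print(p_six_or_more)  # 7/10
```

Exact fractions avoid the small floating-point error that 0.3 + 0.4 would otherwise introduce.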
Some Commonly Used Symbols

The Addition Rule for Events that are Not Mutually
Exclusive
If two events are not mutually exclusive, the probability can
be calculated as:



P(A or B) = P(A) + P(B) - P(A and B)

where
P(A or B)  = probability of event A or B happening when A and B are not mutually exclusive
P(A)       = probability of event A happening
P(B)       = probability of event B happening
P(A and B) = probability of events A and B happening together

The Addition Rule for Events that are Not Mutually
Exclusive

Note:


in a mutually exclusive event, P(A and B) = 0
if P(A and B) is not deducted from the equation, the overlap
will be counted twice (double counting):

P(A or B) = P(A) + P(B) - P(A and B)

[Venn diagram: two overlapping circles A and B; the overlapping region is "A and B"]

The Addition Rule for Events that are Not Mutually Exclusive

Example
The City Council of Chapel Hill, North Carolina is composed of the following 5
persons:

Person | Sex    | Age
1      | Male   | 31
2      | Male   | 33
3      | Female | 46
4      | Female | 29
5      | Male   | 41

If the members of the council decide to elect a chairperson by random draw (say,
by drawing the names from a hat), what is the probability that the chairperson
will be either female or over 35?

P(female or over 35) = P(female) + P(over 35) - P(female and over 35)
                     = 2/5 + 2/5 - 1/5
P(female or over 35) = 3/5 = 0.6
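The council example can be verified by enumerating the five members in Python and applying the addition rule for events that are not mutually exclusive (the data layout below is an illustrative choice):

```python
from fractions import Fraction

# The five council members as (sex, age) pairs
council = [("Male", 31), ("Male", 33), ("Female", 46), ("Female", 29), ("Male", 41)]
n = len(council)

p_female = Fraction(sum(1 for sex, _ in council if sex == "Female"), n)   # 2/5
p_over35 = Fraction(sum(1 for _, age in council if age > 35), n)          # 2/5
p_both = Fraction(sum(1 for s, a in council if s == "Female" and a > 35), n)  # 1/5

# P(A or B) = P(A) + P(B) - P(A and B): subtract the overlap counted twice
p_either = p_female + p_over35 - p_both
print(p_either)  # 3/5
```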
Marginal Probabilities under Statistical Independence

Marginal Probability

Is the simple probability of the occurrence of an event

Example 1: Outcomes in the experiment of tossing a fair coin

Probabilities on each individual toss: P(H) = 0.5 and P(T) = 0.5

the probability of heads equals 0.5 and the probability of tails equals 0.5
this is true for every toss, no matter how many tosses may precede it or
what their outcomes may be
every outcome (outcome of a toss) stands alone and is in no way
connected with any other event (outcome of another toss); thus the
outcome of each toss of a coin is a statistically independent event

Marginal Probabilities under Statistical Independence

Marginal Probability

Example 2
Outcomes in the experiment of tossing a biased or unfair coin, in
which heads occurs 0.90 of the time and tails 0.10 of the time

Probabilities on each individual toss: P(H) = 0.9 and P(T) = 0.1

the outcome of any particular toss is completely unrelated to the
outcomes of the tosses which may precede it as well as to the outcomes
which may follow, hence, the outcome of each toss of a coin is a
statistically independent event

Joint Probabilities under Statistical Independence

Joint Probability

the probability that two or more independent events will occur
together or in succession is the product of their marginal
probabilities.

Mathematically, this is defined as








P(AB) = P(A) x P(B)

where
P(AB) = joint probability of events A and B occurring together or in succession (JOINT PROBABILITY)
P(A)  = marginal probability of event A occurring
P(B)  = marginal probability of event B occurring
Joint Probability

Example 1
In a coin-toss experiment, what is the probability of heads
appearing on two successive tosses?

Answer










P(H1H2) = P(H1) x P(H2) = 0.5 x 0.5 = 0.25

where
H1 = heads on the first toss
H2 = heads on the second toss
the two events are statistically independent; thus, the probability
of heads on two successive tosses is 0.25, or 1/4, or 25%
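For statistically independent events, the joint probability is just the product of the marginals, which a small Python sketch makes concrete (the function name is an assumption for illustration):

```python
from math import prod

def joint_probability(*marginals):
    # Joint probability of independent events: the product of their marginals
    return prod(marginals)

p_two_heads = joint_probability(0.5, 0.5)          # fair coin, two tosses
p_three_heads = joint_probability(0.9, 0.9, 0.9)   # biased coin, three tosses (~0.729)
print(p_two_heads)  # 0.25
```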
Joint Probability

Example
In a coin-toss experiment, what is the probability of heads appearing
on two successive tosses?

Answer










Toss 1            Toss 2            Joint probability
P(H) = 0.5   ──►  P(H) = 0.5        0.25
             └─►  P(T) = 0.5        0.25
P(T) = 0.5   ──►  P(H) = 0.5        0.25
             └─►  P(T) = 0.5        0.25
Joint Probability

Example 2
Assume that an unfair coin is tossed which has P(H) = 0.9 and P(T)
= 0.1. What is the probability of getting three heads on three
successive tosses?

Answer










P(H1H2H3) = P(H1) x P(H2) x P(H3) = 0.9 x 0.9 x 0.9 = 0.729

where
H1 = heads on the first toss
H2 = heads on the second toss
H3 = heads on the third toss
the three events are statistically independent
Joint Probability

Example 3
What is the probability of at least one head appearing on two tosses
of a fair coin?

Answer
The possible ways a head may occur are: H1H2, H1T2, T1H2; each of
these has a probability of 0.25, hence

P(at least one head on 2 tosses) = P(H1H2) + P(T1H2) + P(H1T2)
                                 = 0.25 + 0.25 + 0.25
P(at least one head on 2 tosses) = 0.75

alternatively,
P(at least one head on 2 tosses) = 1 - P(T1T2)
                                 = 1 - 0.25 = 0.75
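Both routes to the answer, direct enumeration and the complement rule, can be sketched in a few lines of Python:

```python
from itertools import product

# Enumerate the sample space of two fair tosses; each outcome has probability 0.25
outcomes = list(product("HT", repeat=2))  # [('H','H'), ('H','T'), ('T','H'), ('T','T')]
p_at_least_one_head = sum(0.25 for o in outcomes if "H" in o)
print(p_at_least_one_head)  # 0.75

# Alternatively, via the complement: 1 - P(no heads) = 1 - P(T1T2)
print(1 - 0.25)  # 0.75
```

The complement route becomes much more convenient as the number of tosses grows, since only the all-tails outcome has to be considered.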
Conditional Probabilities under Statistical
Independence

Conditional Probability

Symbolically, conditional probability is written P(B|A)
and is read









P(B|A) = probability of event B, given that event A has occurred
Conditional Probability

For statistically independent events, the conditional
probability of event B given that event A has occurred is the
same as the unconditional probability of event B;
symbolically,













P(B|A) = P(B)
Conditional Probability

Example
What is the probability that the second toss of a fair coin will
result in heads, given that heads occurred on the first toss?

Answer








P(H2|H1) = P(H2) = 0.5
Statistical dependence exists when the probability of
some event is dependent upon or affected by the
occurrence of some other events.

Marginal Probabilities under Statistical Dependence

the marginal probability of a statistically dependent
event is exactly the same as that of a statistically
independent event; one and only one probability is
involved; a marginal probability refers to only one event

Conditional Probabilities under Statistical
Dependence

computed as:



P(A|B) = P(AB) / P(B)
Example
Assume we have one urn containing 10 balls distributed as follows



3 are red and dotted; 1 is red and striped;
2 are gray and dotted; 4 are gray and striped

The probability of drawing any particular ball from the urn is 0.1, since there are
10 balls, each with equal probability of being drawn
Suppose someone draws a ball from the urn and tells us it is red. What is the
probability that it is dotted?

Answer
The two separate categories are color (red, gray) and pattern (dotted, striped).

P(D|R) = P(DR) / P(R) = 0.3 / 0.4 = 3/4 = 0.75
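The urn example reduces to dividing two counts, which a short Python sketch shows directly (the dictionary layout is an illustrative choice):

```python
from fractions import Fraction

# Urn contents as (color, pattern) -> count
balls = {("red", "dotted"): 3, ("red", "striped"): 1,
         ("gray", "dotted"): 2, ("gray", "striped"): 4}
total = sum(balls.values())  # 10 balls, each equally likely to be drawn

p_red = Fraction(balls[("red", "dotted")] + balls[("red", "striped")], total)  # P(R) = 4/10
p_dotted_and_red = Fraction(balls[("red", "dotted")], total)                   # P(DR) = 3/10

# Conditional probability under statistical dependence: P(D|R) = P(DR) / P(R)
p_dotted_given_red = p_dotted_and_red / p_red
print(p_dotted_given_red)  # 3/4
```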
Joint Probabilities under Statistical Dependence

calculated as:



P(AB) = P(A|B) x P(B)
TYPE OF PROBABILITY | SYMBOL | FORMULA (under statistical dependence) | FORMULA (under statistical independence)
Marginal            | P(A)   | P(A)          | P(A)
Joint               | P(AB)  | P(A|B) x P(B) | P(A) x P(B)
Conditional         | P(A|B) | P(AB) / P(B)  | P(A)
The basic formula for conditional probability under
conditions of statistical dependence is called Bayes'
theorem
What is the probability of getting tails, heads, tails,
in that order on three successive tosses of a fair
coin? Show your solution using

a. Probability calculation
b. Probability tree diagram

a. Using Probability Calculation











P(T1H2T3) = P(T1) x P(H2) x P(T3) = 0.5 x 0.5 x 0.5 = 0.125

where
T1 = tails on the first toss
H2 = heads on the second toss
T3 = tails on the third toss
the probability of tails, heads, tails on three successive tosses is 0.125
b. Using Probability Tree











Toss 1           Toss 2                  Toss 3           Joint probability
P(H) = 0.5  ──►  P(H) = 0.5 (0.25)  ──►  P(H) = 0.5       0.125
                                    └─►  P(T) = 0.5       0.125
            └─►  P(T) = 0.5 (0.25)  ──►  P(H) = 0.5       0.125
                                    └─►  P(T) = 0.5       0.125
P(T) = 0.5  ──►  P(H) = 0.5 (0.25)  ──►  P(H) = 0.5       0.125
                                    └─►  P(T) = 0.5       0.125
            └─►  P(T) = 0.5 (0.25)  ──►  P(H) = 0.5       0.125
                                    └─►  P(T) = 0.5       0.125

The path T1, H2, T3 has probability 0.5 x 0.5 x 0.5 = 0.125
What is the probability of at least one tail on three
successive tosses of a fair coin?

a. Using Probability Calculation











There is only one case in which no tails occur, namely H1H2H3, hence

P(at least one tail on three tosses) = 1 - P(H1H2H3)
                                     = 1 - 0.125
P(at least one tail on three tosses) = 0.875
Assume we have one urn containing 10 balls
distributed as follows:






3 are red and dotted; 1 is red and striped;
2 are gray and dotted; 4 are gray and striped
(the two separate categories are color: red, gray; and pattern: dotted, striped)

a. What is P(D|G)?
b. What is P(S|G)?
c. Calculate P(R|D) and P(G|D)
d. Calculate P(R|S) and P(G|S)
a. P(D|G) = P(DG) / P(G) = 0.2 / 0.6 = 1/3

b. P(S|G) = P(SG) / P(G) = 0.4 / 0.6 = 2/3

c. P(R|D) = P(RD) / P(D) = 0.3 / 0.5 = 0.6
   P(G|D) = P(GD) / P(D) = 0.2 / 0.5 = 0.4

d. P(R|S) = P(RS) / P(S) = 0.1 / 0.5 = 0.2
   P(G|S) = P(GS) / P(S) = 0.4 / 0.5 = 0.8
Consider the case of a manufacturer who has an
automatic machine which produces ball bearings.
If the machine is correctly set up, i.e., properly
adjusted, it produces 90 percent acceptable parts.
If it is incorrectly set up, it produces 40 percent
acceptable parts. Past experience indicates that 70
percent of the setups are correctly done. After a
certain setup, the machine produces three
acceptable bearings as the first three pieces. What
is the probability that the setup has been correctly
done?

Event     | P(event) | P(1 good part|event) | P(3 good parts|event) | P(event, 3 good parts)
Correct   | 0.70     | 0.90                 | 0.729                 | 0.729 x 0.70 = 0.5103
Incorrect | 0.30     | 0.40                 | 0.064                 | 0.064 x 0.30 = 0.0192
TOTAL     | 1.00     |                      |                       | P(3 good) = 0.5295

P(A|B) = P(AB) / P(B)

P(correct | 3 good parts) = P(correct, 3 good parts) / P(3 good parts)
                          = 0.5103 / 0.5295 = 0.9637

the probability that the machine is correctly set up is 0.9637, or 96.37 percent
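The Bayes' theorem calculation above can be reproduced in a few lines of Python (the dictionary names are illustrative):

```python
# Prior probability of each setup, and P(one acceptable part | setup)
priors = {"correct": 0.70, "incorrect": 0.30}
p_good = {"correct": 0.90, "incorrect": 0.40}

# Joint probability of each setup occurring with 3 acceptable parts in a row
joint = {s: priors[s] * p_good[s] ** 3 for s in priors}
p_three_good = sum(joint.values())  # ~0.5295

# Bayes' theorem: P(correct | 3 good parts) = P(correct, 3 good parts) / P(3 good parts)
posterior = joint["correct"] / p_three_good
print(round(posterior, 4))  # 0.9637
```

Note how three good parts raise the belief that the setup is correct from the prior 0.70 to about 0.96.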
Probability Distribution

A list of the outcomes of an experiment with the
probabilities we would expect to see associated with
these outcomes


Discrete and Continuous Distributions

Discrete Probability Distribution

A probability distribution in which the variable is allowed to
take on only a limited number of values

Continuous Probability Distribution

A probability distribution in which the variable is permitted
to take on any value within a given range

Probability Distribution


Can be expressed graphically or in tabular form

[Chart: probability of a tail in 2 tosses of a fair coin]

Number of tails, T | Probability of this outcome, P(T)
0                  | 0.25
1                  | 0.50
2                  | 0.25
Random Variables

Random variable
A variable that takes on different values as a result of the
outcomes of a random experiment

Can be discrete or continuous:

A discrete random variable is a variable allowed to take on
only a limited number of values
A continuous random variable is a random variable allowed to
take on any value within a given range

Random Variables

The expected value of a random variable
Expected value is a weighted average of the outcomes of an
experiment
The expected value of a discrete random variable is computed
as:


E(x) = Σ x P(x)

where
E(x) = expected value of the random variable
Σ    = the sum of
x    = value of the random variable
P(x) = probability that the random variable will take on the value x
Number of speakers sold, x [1] | No. of days this quantity sold (frequency) | Probability P(x) (relative frequency) [2] | [1] x [2]
100   |   1 | 0.01 |   1.00
101   |   2 | 0.02 |   2.02
102   |   3 | 0.03 |   3.06
103   |   5 | 0.05 |   5.15
104   |   6 | 0.06 |   6.24
105   |   7 | 0.07 |   7.35
106   |   9 | 0.09 |   9.54
107   |  10 | 0.10 |  10.70
108   |  12 | 0.12 |  12.96
109   |  11 | 0.11 |  11.99
110   |   9 | 0.09 |   9.90
111   |   8 | 0.08 |   8.88
112   |   6 | 0.06 |   6.72
113   |   5 | 0.05 |   5.65
114   |   4 | 0.04 |   4.56
115   |   2 | 0.02 |   2.30
TOTAL | 100 | 1.00 | E(x) = 108.02
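The expected-value table can be computed directly in Python from the frequency data (the dictionary layout is an illustrative choice):

```python
# Number of speakers sold per day over 100 days: value -> frequency
frequencies = {100: 1, 101: 2, 102: 3, 103: 5, 104: 6, 105: 7, 106: 9, 107: 10,
               108: 12, 109: 11, 110: 9, 111: 8, 112: 6, 113: 5, 114: 4, 115: 2}
days = sum(frequencies.values())  # 100

# E(x) = sum of x * P(x), with P(x) taken as the relative frequency f / days
expected = sum(x * (f / days) for x, f in frequencies.items())
print(round(expected, 2))  # 108.02
```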

Types of Probability Distribution

Binomial distribution

Poisson distribution

Exponential distribution

Normal distribution


Binomial Distribution

A discrete distribution of the results of an experiment
known as a Bernoulli process

Bernoulli process is a process in which each trial has only two
possible outcomes, where the probability of the outcome of
any trial remains fixed over time, and where the trials are
statistically independent

e.g., tossing of a fair coin, success or failure of a college graduate
on a job interview aptitude test


Binomial Distribution

Binomial formula:



Probability of r successes in n trials = n! / (r!(n - r)!) x p^r x q^(n-r)

where
p = probability of success
q = probability of failure = (1 - p)
r = number of successes
n = total number of trials
Binomial Distribution
Example:
Some field representatives of the Environmental Protection Agency are
doing spot checks of water pollution in streams. Historically, 8 out of 10
such tests produce favorable results, that is, no pollution. The field
group is going to perform 6 tests and wants to know the chances of
getting exactly 3 favorable results from this group of tests.







p = 0.8, q = 0.2, r = 3, n = 6

Probability of 3 favorable tests out of 6 = n! / (r!(n - r)!) x p^r x q^(n-r)
= 6! / (3!(6 - 3)!) x (0.8)^3 x (0.2)^3
= (6 x 5 x 4 x 3 x 2 x 1) / ((3 x 2 x 1)(3 x 2 x 1)) x (0.512)(0.008)
= (120 / 6)(0.0041)
= 0.082

There is less than 1 chance in 10 of getting 3 favorable tests out of 6.
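The binomial formula can be sketched in Python using the standard-library binomial coefficient (the function name is an illustrative choice):

```python
from math import comb

def binomial(r, n, p):
    # P(r successes in n trials) = n!/(r!(n-r)!) * p^r * q^(n-r), with q = 1 - p
    return comb(n, r) * p ** r * (1 - p) ** (n - r)

# EPA example: exactly 3 favorable tests out of 6, with p = 0.8
print(round(binomial(3, 6, 0.8), 3))  # 0.082
```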
Example:
Five employees are required to operate a chemical process; the process
cannot be started until all 5 work stations are manned. Employee records
indicate there is a 0.4 chance of any one employee being late, and we
know that they all come to work independently of each other.
Management is interested in knowing the probabilities of 0,1,2,3,4, or 5
employees being late, so that a decision concerning the number of back-
up personnel can be made.







p = 0.4, q = 0.6, n = 5

Probability of r late arrivals out of n employees = n! / (r!(n - r)!) x p^r x q^(n-r)

for r = 0:
P(0) = 5! / (0!(5 - 0)!) x (0.4)^0 x (0.6)^5
     = (1)(1)(0.07776) = 0.0778

for r = 1:
P(1) = 5! / (1!(5 - 1)!) x (0.4)^1 x (0.6)^4
     = (5 x 4 x 3 x 2 x 1) / (4 x 3 x 2 x 1) x (0.4)(0.1296)
     = 5 x 0.05184 = 0.2592

similarly,
r = 2: P(2) = 0.3456
r = 3: P(3) = 0.2304
r = 4: P(4) = 0.0768
r = 5: P(5) = 0.0102
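The full table of late-arrival probabilities can be generated with a loop over the same binomial formula:

```python
from math import comb

def binomial(r, n, p):
    # P(r successes in n trials) = n!/(r!(n-r)!) * p^r * (1-p)^(n-r)
    return comb(n, r) * p ** r * (1 - p) ** (n - r)

# Probability of 0..5 employees arriving late, with p = 0.4 and n = 5
for r in range(6):
    print(r, round(binomial(r, 5, 0.4), 4))
# 0: 0.0778, 1: 0.2592, 2: 0.3456, 3: 0.2304, 4: 0.0768, 5: 0.0102
```

The six probabilities sum to 1, as they must for a complete distribution.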

[Charts: probability histograms of Binomial(5, 0.1), Binomial(5, 0.3), Binomial(5, 0.4), Binomial(5, 0.5), Binomial(5, 0.7), and Binomial(5, 0.9)]
When p is small (0.1), the binomial distribution is
skewed to the right

As p increases (to 0.3), the skewness is less noticeable

When p=0.5, the binomial distribution is symmetrical

When p is larger than 0.5, the distribution is skewed to
the left
[Charts: probability histograms of Binomial(5, 0.4), Binomial(10, 0.4), and Binomial(20, 0.4)]
As n increases, the vertical lines become more
numerous and tend to bunch up together to form
something like a bell-shape.
Poisson Distribution

A discrete distribution in which the probability of the
occurrence of an event within a small time period is
very small, in which the probability that two or more
such events will occur within the same time interval is
effectively 0, and in which the probability of the
occurrence of the event within one time period is
independent of where that period is.



Poisson Distribution

The probability of exactly x occurrences in a Poisson
distribution is calculated using the formula:





P(x) = (λ^x x e^(-λ)) / x!

where
P(x)   = probability of exactly x occurrences
λ^x    = lambda (the average number of occurrences per interval of time) raised to the x power
e^(-λ) = e (2.71828, the base of the natural logarithm system) raised to the negative lambda power
x!     = x factorial
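The Poisson formula can be sketched in Python with the standard library (the lambda value of 2.0 below is an assumed example, not from the original material):

```python
from math import exp, factorial

def poisson(x, lam):
    # P(x) = (lambda^x * e^(-lambda)) / x!
    return lam ** x * exp(-lam) / factorial(x)

# e.g., with an average of 2 occurrences per interval, the
# probability of exactly 3 occurrences in one interval:
print(round(poisson(3, 2.0), 4))  # 0.1804
```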
Exponential Distribution

A continuous probability distribution used to describe
the time between occurrences of an event