
IE 4521: Statistics, Quality, and Reliability
Lecture 1: Introduction

John Gunnar Carlsson


January 19, 2010
Introduction
• Probability is everywhere: Science,
engineering, business, management
science, finance, economics
• Mathematics is commonly used to predict the
behavior of a physical system, but the actual
result may vary
• Probability and statistics assess this
uncertainty quantitatively
“Probability theory is nothing but common sense reduced to calculation.”
--Pierre-Simon Laplace, Essai philosophique sur les probabilités, 1814
Reliability
• What does “risk” really mean?
• E.g. how “risky” is this bridge I’m building?
• What is the likelihood that a load on this bridge
exceeds its maximum capacity?
• Methodology: build a mathematical model that
allows us to assess these risk levels
• Eliminating the risk of failure is usually
impossible, but it is possible to reduce the
chance of failure to acceptable levels
• Probability and statistics give us the necessary
tools to make these determinations



Probability vs. Statistics
• Probability involves developing models to
describe non-deterministic phenomena
(i.e. models with uncertainty)
• Rules → data
• Predictions about the future
• Statistics involves testing a probabilistic
model against experimental data
• Data → rules
• Inferences about the past
Probability vs. Statistics
Coin example:
• We have a fair coin
• Probabilistically, we say heads and tails
each occur with probability ½
• Statistically, we flip the coin a few times,
then draw conclusions
• Is it likely that this coin is fair?

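As a small illustration of the statistical direction (this sketch is not from the slides; it assumes Python and its standard random module), we can flip a simulated coin a few times and look at the data before judging whether the coin seems fair:

```python
import random

def flip_coin(n, p_heads=0.5):
    """Simulate n independent flips of a coin that lands heads with probability p_heads."""
    return ["H" if random.random() < p_heads else "T" for _ in range(n)]

flips = flip_coin(20)                 # the observed data
heads = flips.count("H")
print(flips)
print(f"{heads} heads in {len(flips)} flips; sample proportion = {heads / len(flips):.2f}")
# The statistical question: is a proportion this far from 1/2 plausible for a fair coin?
```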


Probability or Statistics?
• Zipf’s Law: “In most languages, for some
(mostly unknown) reason, the frequency
of any word is inversely proportional to its
rank in the frequency table”
• This is a statistical statement, since it’s
an inference about known data
(languages)



Terminology
• Experiment: Process of measurement or
observation
• Trial: Single performance of an experiment
• Outcome: Result of single trial of an
experiment
• Sample space: Set of all possible outcomes
of an experiment (the outcomes are mutually
exclusive); usually denoted S
• Event: Subset of elements in a sample
space
Disjointness
• Suppose we roll a conventional die;
consider two events
• Event A: {1,3,5}
• Event B: {2,4,6}
• Events A and B are disjoint as they have
no common elements



Set operations
• The union of two sets A and B is written
A ∪ B; it consists of all elements in A, B, or
both

[Venn diagram: overlapping sets A and B inside the sample space S, illustrating A ∪ B]



Set operations
• The intersection of two sets A and B is
written A ∩ B; it consists of the elements
common to A and B

[Venn diagram: overlapping sets A and B inside the sample space S, illustrating A ∩ B]



Set operations
• The complement of a set A in set S is
written Ac; consists of the elements in S
that are not in A

[Venn diagram: set A inside the sample space S; the complement Ac is the region of S outside A]



Example
• Suppose we are rolling a six-sided die,
and hence S = {1,2,3,4,5,6}
• Let A = {1,2,3,4} and B = {1,6}
Then:
• A ∪ B = {1,2,3,4,6}
• A ∩ B = {1}
• Ac = {5,6}
• Bc = {2,3,4,5}
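The example above maps directly onto Python's built-in set operations; a minimal sketch (Python is used here only for illustration):

```python
S = {1, 2, 3, 4, 5, 6}   # sample space for one roll of a die
A = {1, 2, 3, 4}
B = {1, 6}

print(A | B)             # union A ∪ B        -> {1, 2, 3, 4, 6}
print(A & B)             # intersection A ∩ B -> {1}
print(S - A)             # complement Ac      -> {5, 6}
print(S - B)             # complement Bc      -> {2, 3, 4, 5}
print(A.isdisjoint(B))   # False: A and B share the element 1
```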
Venn Diagram

[Venn diagram of the example: 1 lies in A ∩ B; 2, 3, and 4 lie in A only; 6 lies in B only; 5 lies outside both sets]



A couple of comments
• The sets A and B are disjoint if and only if
A ∩ B = ∅ (the empty set)
• By construction, A ∪ Ac = S



Probability of an event
• Probability: a quantity that characterizes
how frequently an event occurs, in the
limit of infinitely many trials
Pr(A) = lim_{n → ∞} (number of outcomes favoring A) / (total number of outcomes)

Here n is the number of trials we perform

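A short simulation (a sketch, not part of the original slides) makes the limiting relative-frequency idea concrete: roll a die n times and watch the fraction of rolls landing in A = {2, 4, 6} settle near 3/6 = 0.5 as n grows.

```python
import random

A = {2, 4, 6}   # event: the roll is even
for n in (10, 100, 1_000, 10_000, 100_000):
    hits = sum(1 for _ in range(n) if random.randint(1, 6) in A)
    print(f"n = {n:>7}: relative frequency of A = {hits / n:.4f}")
# The printed frequencies approach Pr(A) = 0.5 as n increases.
```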


Probability of an event
We often define the probability of an event
by three axioms:
1. 0 ≤ Pr(A) ≤ 1
2. Pr(S) = 1 (any outcome must lie in the
sample space)
3. If A and B are mutually exclusive (disjoint)
events, then Pr(A ∪ B) = Pr(A) + Pr(B)
3a) If A1, …, An are mutually exclusive
events, then
Pr(A1 ∪ … ∪ An) = Pr(A1) + ··· + Pr(An)



Probability of non-disjoint events
What if A and B are not mutually
exclusive? We don’t have
Pr(A ∪ B) = Pr(A) + Pr(B)

[Venn diagram: overlapping sets A and B inside S]



Probability of non-disjoint events
• Set C = A ∩ B
• Create A* and B* such that A*, B*, and C
are all mutually exclusive and
A* ∪ B* ∪ C = A ∪ B

[Venn diagram: disjoint regions A*, C, and B* inside S]



Probability of non-disjoint events
• We now have
Pr(A ∪ B) = Pr(A* ∪ B* ∪ C)
= Pr(A*) + Pr(B*) + Pr(C)




Probability of non-disjoint events
• Note that
Pr(A*) + Pr(C) = Pr(A)
Pr(B*) + Pr(C) = Pr(B)
• Hence,
Pr(A ∪ B) = [Pr(A) – Pr(C)] + [Pr(B) – Pr(C)] + Pr(C)
Pr(A ∪ B) = Pr(A) + Pr(B) – Pr(A ∩ B)




Probability of non-disjoint events

The preceding formula also holds in the case
that A and B are disjoint, in which case
A ∩ B = ∅, so Pr(A ∩ B) = 0 and the formula
reduces to Pr(A ∪ B) = Pr(A) + Pr(B).



Example
• Suppose we are rolling a six-sided die,
and hence S = {1,2,3,4,5,6}
• Let A = {1,2,3,4} and B = {1,6}
What is Pr(A ∪ B)?
• By the preceding formula,
Pr(A ∪ B) = Pr(A) + Pr(B) – Pr(A ∩ B)
So,
Pr(A ∪ B) = 4/6 + 2/6 – 1/6 = 5/6

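Because the six outcomes are equally likely, each probability is just the event's size divided by 6, so the formula can be verified mechanically; a brief sketch:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {1, 2, 3, 4}
B = {1, 6}

def pr(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(S))

print(pr(A | B))                    # counting A ∪ B directly: 5/6
print(pr(A) + pr(B) - pr(A & B))    # inclusion-exclusion:     5/6
```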


Example
• Suppose we are rolling two six-sided dice,
and hence S =
(1,1) (1,2) (1,3) (1,4) (1,5) (1,6)
(2,1) (2,2) (2,3) (2,4) (2,5) (2,6)
(3,1) (3,2) (3,3) (3,4) (3,5) (3,6)
(4,1) (4,2) (4,3) (4,4) (4,5) (4,6)
(5,1) (5,2) (5,3) (5,4) (5,5) (5,6)
(6,1) (6,2) (6,3) (6,4) (6,5) (6,6)

(all are equally likely)


Example
• What is the likelihood of rolling at least
one 2?
(1,1) (1,2) (1,3) (1,4) (1,5) (1,6)
(2,1) (2,2) (2,3) (2,4) (2,5) (2,6)
(3,1) (3,2) (3,3) (3,4) (3,5) (3,6)
(4,1) (4,2) (4,3) (4,4) (4,5) (4,6)
(5,1) (5,2) (5,3) (5,4) (5,5) (5,6)
(6,1) (6,2) (6,3) (6,4) (6,5) (6,6)

Pr(at least one 2) = (11 outcomes with a 2) / (36 possible outcomes) = 11/36
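The count of 11 favorable outcomes out of 36 can be reproduced by enumeration; a quick sketch:

```python
from itertools import product
from fractions import Fraction

outcomes = list(product(range(1, 7), repeat=2))       # all 36 ordered pairs
favorable = [pair for pair in outcomes if 2 in pair]  # pairs containing a 2
print(len(favorable), "of", len(outcomes))            # 11 of 36
print(Fraction(len(favorable), len(outcomes)))        # 11/36
```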
Permutations
• Sometimes all the elements of an event A
are equally likely to occur
• Randomly selecting a password of 8
characters, failure of 2 identical
computers in a network of 10, etc.
• We can determine probabilities of these
events by enumerating all outcomes



Permutations
• We are given n distinct objects; we want
to arrange r of them in some order
• A permutation is an arrangement of n
objects, taken r at a time, in a given order
• This is written P(n,r)



Permutations
• How do we compute P(n,r)?
• Well, we have n choices for the first
element, (n – 1) choices for the second,
etc., until we get to (n – r + 1) choices for
the last, so

P(n, r) = n! / (n – r)!



Examples
In how many different ways can a party of
7 people arrange themselves:
• In a row of 4 chairs?
• P(7,4) = 7!/3! = 7 · 6 · 5 · 4 = 840



Examples
In how many different ways can a party of
7 people arrange themselves:
• In a row of 7 chairs?
• P(7,7) = 7!/0! = 5040

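Both seating counts can be checked against the formula; a minimal sketch (math.perm requires Python 3.8 or later):

```python
from math import factorial, perm

def P(n, r):
    """Number of ordered arrangements of r objects chosen from n distinct objects."""
    return factorial(n) // factorial(n - r)

print(P(7, 4), perm(7, 4))   # 840 840   (7 people, row of 4 chairs)
print(P(7, 7), perm(7, 7))   # 5040 5040 (7 people, row of 7 chairs)
```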


Combinations
• A combination is the same as a
permutation, only order doesn’t matter
• In other words, an arrangement of n
objects, taken r at a time, without regard
to order
• This effectively introduces a factor of 1/r!
to the number of permutations
• The number of combinations is typically
written C(n,r)
Combinations
• We compute C(n,r) by the formula

C(n, r) = n! / [(n – r)! r!]

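The same check for combinations (math.comb, Python 3.8+); for example, choosing 4 of the 7 people without regard to seating order:

```python
from math import comb, factorial

def C(n, r):
    """Number of unordered selections of r objects from n distinct objects."""
    return factorial(n) // (factorial(n - r) * factorial(r))

print(C(7, 4), comb(7, 4))   # 35 35, i.e. P(7,4) / 4! = 840 / 24
```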


Conditional Probability
• For experiments with more than one
event of interest, we may want to know
the probability of one event, given that we
know that another event occurred
• This new information motivates us to
revise our probabilities



Conditional Probability
• The conditional probability of event A,
conditioned on event B, is defined as
P(A | B) = P(A ∩ B) / P(B)
Read “Probability of A given B”



Conditional Probability
• If A and B are disjoint (mutually
exclusive), then we have
P(A | B) = P(A ∩ B) / P(B) = 0 / P(B) = 0
• If B is contained inside A (we write this as
B ⊆ A), then we have
P(A | B) = P(A ∩ B) / P(B) = P(B) / P(B) = 1



Conditional Probability
• Die example: Let B = {1,3,5} and A = {3}.
Then
P(A | B) = (1/6) / (3/6) = 1/3
P(B | A) = (1/6) / (1/6) = 1

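With equally likely outcomes, P(A ∩ B)/P(B) reduces to a ratio of counts, so the die example can be reproduced directly; a sketch:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
B = {1, 3, 5}   # the roll is odd
A = {3}         # the roll is a 3

def pr(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(S))

print(pr(A & B) / pr(B))   # P(A | B) = (1/6) / (3/6) = 1/3
print(pr(B & A) / pr(A))   # P(B | A) = (1/6) / (1/6) = 1
```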


Probabilities of Event Intersections
• If we multiply
Pr(A | B) = Pr(A ∩ B) / Pr(B)
by Pr(B), we get

Pr(A ∩ B) = Pr(A | B) Pr(B)


This is useful, because sometimes
Pr(A ∩ B) can be difficult to compute from
scratch
Probabilities of Event Intersections
• This extends naturally to

Pr(A ∩ B ∩ C) = Pr(A) · Pr(B | A) · Pr(C | A ∩ B)

and so on ad infinitum (see the book)



Example
• I draw two cards from a deck of cards,
without replacing them
• Let B be the event that the first card is a
heart (B = {first card is heart})
• Let A be the event that the second card is
a heart (A = {second card is heart})
• What’s the probability of drawing two
hearts?



Example
• We compute Pr(A ∩ B) by the formula

Pr(A ∩ B) = Pr(A | B) Pr(B)


• Clearly, Pr(B) is (13 hearts)/(52 cards) =
1/4
• Also, Pr(A|B) = 12/51
• So, Pr(A ∩ B) = (1/4)(12/51) = 3/51

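The answer can be double-checked by brute-force enumeration of ordered two-card draws (a sketch; the cards are labeled 0–51 here, with 0–12 standing in for the hearts):

```python
from itertools import permutations
from fractions import Fraction

deck = range(52)
hearts = set(range(13))   # an arbitrary labeling: cards 0-12 are the hearts

pairs = list(permutations(deck, 2))   # ordered draws without replacement: 52 * 51 pairs
both = [p for p in pairs if p[0] in hearts and p[1] in hearts]

print(Fraction(len(both), len(pairs)))       # 1/17, the same as 3/51
print(Fraction(13, 52) * Fraction(12, 51))   # the multiplication rule gives 1/17 as well
```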


Comment
• We could also have done this by writing
out the entire sample space, then
counting the number of elements in the
sample space with two hearts
• But that would be arduous



Independent Events
• Two events A and B are said to be
independent if

Pr(A | B) = Pr(A)
• Intuitively, this means that event B has
nothing to do with event A
• This also implies that Pr(B | A) = Pr(B)
and that Pr(A ∩ B) = Pr(A) Pr(B)

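A quick check of the product rule on two dice (a sketch with made-up events): let A = {first die is even} and B = {second die is 5 or 6}, and compare Pr(A ∩ B) with Pr(A)·Pr(B) by enumeration.

```python
from itertools import product
from fractions import Fraction

outcomes = set(product(range(1, 7), repeat=2))   # 36 equally likely (die1, die2) pairs
A = {o for o in outcomes if o[0] % 2 == 0}       # first die is even
B = {o for o in outcomes if o[1] >= 5}           # second die is 5 or 6

def pr(event):
    return Fraction(len(event), len(outcomes))

print(pr(A & B))       # 1/6
print(pr(A) * pr(B))   # (1/2) * (1/3) = 1/6, so A and B are independent
```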


Independent Events
• Independent events tend to be easier to
deal with, since we don’t have to worry
about conditional probability terms like
Pr(A|B), and so forth
• In practice (i.e. outside problem sets), true
independence usually cannot be proven; it has
to be assumed



Examples
• The values on two different dice rolls
• Flipping two coins
• Casino slot machines
• Basketball hot streaks (Gilovich, Vallone,
& Tversky 1985, still hotly contested)



Pop quiz
The events A and B are:
a) independent
b) disjoint
c) mutually exclusive
d) (b) and (c)
e) All of the above

[Venn diagram: sets A and B inside the sample space S]

