
Chapter 2: Probability

Sample Space: Denoted by S

Definition 2.1: The set of all possible outcomes of an experiment is called the Sample Space.

Example 2.1: Toss a coin ------ Head or Tail, S = {H, T}

Example 2.2: Roll a die ------ 1, 2, 3, 4, 5, 6, S = {1, 2, 3, 4, 5, 6}

Example 2.3: Roll two dice together, one red, one green ------

S = { (1,1), (1,2), (1,3), (1,4), (1,5), (1,6),
      (2,1), (2,2), (2,3), (2,4), (2,5), (2,6),
      (3,1), (3,2), (3,3), (3,4), (3,5), (3,6),
      (4,1), (4,2), (4,3), (4,4), (4,5), (4,6),
      (5,1), (5,2), (5,3), (5,4), (5,5), (5,6),
      (6,1), (6,2), (6,3), (6,4), (6,5), (6,6) }
Definition 2.2: An event is a subset of a sample space.

Example 2.4: Let S be the sample space in Example 2.3 and define the event B that the total
number of points rolled with the pair of dice is 7. Then we have

B = {(x, y) | x + y = 7} = {(1,6), (2,5), (3,4), (4,3), (5,2), (6,1)}
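The notes contain no code, but a minimal Python sketch (my own illustration, not part of the original) makes the idea concrete: it enumerates the sample space of Example 2.3 and builds B as a subset of it.

```python
from itertools import product

# Sample space of Example 2.3: ordered (red, green) pairs from two dice.
S = set(product(range(1, 7), repeat=2))
print(len(S))  # 36 outcomes

# Event B of Example 2.4: the total number of points rolled is 7.
B = {(x, y) for (x, y) in S if x + y == 7}
print(sorted(B))      # [(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)]
print(B.issubset(S))  # True: an event is a subset of the sample space
```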

B ⊂ A, which is read as "B is a subset of A".

⊂ means subset.

If x ∈ B, then x ∈ A.

A ∩ B, which is read as A intersection B:

A ∩ B = {x | x ∈ A and x ∈ B}

A ∪ B, which is read as A union B:

A ∪ B = {x | x ∈ A or x ∈ B}

A′ = S − A = {x | x ∈ S and x ∉ A},

which is read as the complement of A.

The probability of an event.

Axiom 1: 0 ≤ P(A) ≤ 1

Axiom 2: P(S) = 1

Axiom 3: If A₁, A₂, A₃, … is a finite or infinite sequence of mutually exclusive (or disjoint) events of a sample
space S, then

P(A₁ ∪ A₂ ∪ A₃ ∪ ⋯) = P(A₁) + P(A₂) + P(A₃) + ⋯ = Σᵢ P(Aᵢ)

Theorem 2.1: If A is an event in a discrete sample space S, then P(A) equals the sum of the probabilities of
the individual outcomes comprising A. That is,

P(A) = P(O₁) + P(O₂) + P(O₃) + ⋯, where O₁, O₂, O₃, … are the individual outcomes in A.

Recall

Sample space: denoted by S.
Events: denoted by A, B, C, …
Probability of an event: denoted by P(A), P(B), …

A ∩ B = {x | x ∈ A and x ∈ B}, which is read as A intersection B.

A ∪ B = {x | x ∈ A or x ∈ B}, which is read as A union B.

A′ = S − A = {x | x ∉ A}, which is read as the complement of A, or "x is not in A".

P(A₁ ∪ A₂ ∪ ⋯ ∪ Aₙ) = P(A₁) + P(A₂) + ⋯ + P(Aₙ), in which A₁, A₂, …, Aₙ are disjoint events.

Example 2.5: A die is loaded in such a way that each odd number is twice as likely to occur as each even
number. Find P(G), where G is the event that a number greater than 3 occurs on a single roll
of the die.

S = {1, 2, 3, 4, 5, 6},  P(S) = 1

Let P(2) = P(4) = P(6) = w; then P(1) = P(3) = P(5) = 2w.
Since P(S) = 1, we need 3(2w) + 3w = 9w = 1, so w = 1/9.

G = {x | x > 3} = {4, 5, 6}

P(G) = P(4) + P(5) + P(6) = 1/9 + 2/9 + 1/9 = 4/9
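As a quick numerical check (my addition, not part of the notes), the following Python sketch assigns the loaded-die probabilities and applies Theorem 2.1 by summing the probabilities of the outcomes in G:

```python
from fractions import Fraction

# Loaded die of Example 2.5: each odd face is twice as likely as each even face.
w = Fraction(1, 9)                       # probability of each even face
p = {1: 2*w, 2: w, 3: 2*w, 4: w, 5: 2*w, 6: w}
assert sum(p.values()) == 1              # Axiom 2: P(S) = 1

# Theorem 2.1: P(G) is the sum of the probabilities of the outcomes in G.
G = {x for x in p if x > 3}              # {4, 5, 6}
print(sum(p[x] for x in G))              # 4/9
```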

Theorem 2.2: If an experiment can result in any one of N different equally likely outcomes, and if n of
these outcomes together constitute event A, then the probability of the event A is

P(A) = n/N = #(A)/#(S) = (number of elements in event A) / (number of elements in sample space S)

To see this, let a₁, a₂, …, aₙ be the outcomes that make up A, each with probability 1/N. Since the
single-outcome events {aᵢ} are disjoint and

A = {a₁} ∪ {a₂} ∪ ⋯ ∪ {aₙ},

Axiom 3 gives

P(A) = P({a₁}) + P({a₂}) + ⋯ + P({aₙ}) = 1/N + 1/N + ⋯ + 1/N = n/N.
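Here is a short counting sketch in Python (again my own illustration) that applies P(A) = #(A)/#(S) to the event "the two dice total 7" on the equally likely two-dice space:

```python
from fractions import Fraction
from itertools import product

# Equally likely outcomes: Theorem 2.2 says P(A) = n/N = #(A)/#(S).
S = list(product(range(1, 7), repeat=2))      # N = 36 outcomes
A = [pair for pair in S if sum(pair) == 7]    # n = 6 favourable outcomes
print(Fraction(len(A), len(S)))               # 1/6
```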
Some Rules of Probability
Theorem 2.3: If A and A′ are complementary events in a sample space S, then

P(A′) = 1 − P(A).

Proof: By applying Axioms 2 and 3, we have

1 = P(S) = P(A ∪ A′) = P(A) + P(A′),

so P(A′) = 1 − P(A).

Theorem 2.4: P(∅) = 0 for any sample space S.

Proof: Since S and ∅ are mutually exclusive and S ∪ ∅ = S, we have

P(S) = P(S ∪ ∅) = P(S) + P(∅),

so P(∅) = 0.

Theorem 2.5: If A and B are events in a sample space S and A ⊂ B, then P(A) ≤ P(B).

Proof: Note that we can write the event B as

B = A ∪ (A′ ∩ B), and hence we have

P(B) = P[A ∪ (A′ ∩ B)]
     = P(A) + P(A′ ∩ B) ≥ P(A)

because both P(A) and P(A′ ∩ B) are nonnegative numbers, and this completes the proof.

Theorem 2.6: For any event A, 0 ≤ P(A) ≤ 1.

Proof: Since ∅ ⊂ A ⊂ S, Theorem 2.5 gives

P(∅) ≤ P(A)        (*)
P(A) ≤ P(S) = 1    (**)

and P(∅) = 0 by Theorem 2.4, so 0 ≤ P(A) ≤ 1.
Theorem 2.7: (General addition rule) If A and B are any two events in a sample space S, then

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

Theorem 2.8: If A, B, and C are any three events in a sample space S, then

P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(A ∩ C) + P(A ∩ B ∩ C)
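A small Python check of the two-event addition rule (Theorem 2.7); the events A and B below are chosen here purely for illustration and are not from the notes:

```python
from fractions import Fraction
from itertools import product

# A = "the red die shows 1", B = "the total is 7" on the two-dice space.
S = set(product(range(1, 7), repeat=2))
A = {(r, g) for (r, g) in S if r == 1}
B = {(r, g) for (r, g) in S if r + g == 7}

P = lambda E: Fraction(len(E), len(S))        # counting rule, Theorem 2.2
assert P(A | B) == P(A) + P(B) - P(A & B)     # general addition rule
print(P(A | B))                               # 11/36
```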

Definition 2.3: If A and B are any two events in a sample space S and P(B) ≠ 0, then the conditional
probability of A given B is

P(A|B) = P(A ∩ B) / P(B).

If P(A) ≠ 0, then the conditional probability of B given A is

P(B|A) = P(A ∩ B) / P(A).

Theorem 2.9: If A and B are any two events in a sample space S and P(B) ≠ 0, then

P(A ∩ B) = P(A|B) · P(B),

and if P(A) ≠ 0, then

P(A ∩ B) = P(B|A) · P(A).
Example 2.6: A die is rolled. What is the probability of getting a 6 if we know that the number rolled is
greater than 4?

S = {1, 2, 3, 4, 5, 6},  P(S) = 1

P(getting a 6) = 1/6

P(the number is greater than 4) = P(x > 4) = 2/6

P(getting a 6 | the number is greater than 4)
   = P(getting a 6 and x > 4) / P(x > 4)
   = (1/6) / (2/6)
   = 1/2
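The same answer can be reached by counting, as in this small Python sketch (my addition):

```python
from fractions import Fraction

# Example 2.6 by counting: P(6 | x > 4) = P({6} ∩ {x > 4}) / P(x > 4).
S = {1, 2, 3, 4, 5, 6}
six = {6}
greater_than_4 = {x for x in S if x > 4}      # {5, 6}

P = lambda E: Fraction(len(E), len(S))        # equally likely outcomes
print(P(six & greater_than_4) / P(greater_than_4))   # 1/2
```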

Theorem 2.10: If A, B, and C are any three events in a sample space S such that P(A ∩ B) ≠ 0, then

P(A ∩ B ∩ C) = P(A) · P(B|A) · P(C|A ∩ B)

Proof: P(A ∩ B ∩ C) = P(C|A ∩ B) · P(A ∩ B) = P(C|A ∩ B) · P(B|A) · P(A)
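As an illustration of Theorem 2.10 (the example and the numbers are my own, not from the notes), take three draws from a standard 52-card deck without replacement and let A, B, C be "ace on the first, second, third draw":

```python
from fractions import Fraction

# Theorem 2.10: P(A ∩ B ∩ C) = P(A) · P(B|A) · P(C|A ∩ B).
p_a          = Fraction(4, 52)   # P(A): 4 aces among 52 cards
p_b_given_a  = Fraction(3, 51)   # P(B|A): 3 aces among the remaining 51 cards
p_c_given_ab = Fraction(2, 50)   # P(C|A ∩ B): 2 aces among the remaining 50

print(p_a * p_b_given_a * p_c_given_ab)   # 1/5525
```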

Definition 2.4: (Independent Events) Two events A and B are independent if the occurrence or non-
occurrence of either one does not affect the probability of the occurrence of the other. That is

P(A|B) = P(A) and P(B|A) = P(B)

These are true if we know A and B are independent.

Note that for two independent events A and B,

P(A) = P(A|B) = P(A ∩ B) / P(B),

and then P(A ∩ B) = P(A) · P(B).
Theorem 2.11: Assume A and B are independent. Then

A′ and B,  A and B′,  A′ and B′

are also independent.

Proof: First we show that A′ and B are independent. In other words, we show that

P(A′ ∩ B) = P(A′) · P(B).

Note that

B = (B ∩ A) ∪ (B ∩ A′), so P(B) = P(B ∩ A) + P(B ∩ A′),

and hence

P(B ∩ A′) = P(B) − P(B ∩ A) = P(B) − P(A) · P(B)
          = P(B) · [1 − P(A)]
          = P(B) · P(A′).

(The independence of A and B′ follows in the same way.) Next, we prove that A′ and B′ are independent:

P(A′ ∩ B′) = P(A′) · P(B′).

Write

A′ ∩ B′ = (A ∪ B)′, so P(A′ ∩ B′) = P[(A ∪ B)′] = 1 − P(A ∪ B).

Then

1 − P(A ∪ B) = 1 − [P(A) + P(B) − P(A ∩ B)]
             = 1 − P(A) − P(B) + P(A ∩ B)
             = 1 − P(A) − P(B) + P(A) · P(B)
             = 1 − P(A) − P(B) · [1 − P(A)]
             = [1 − P(A)] · [1 − P(B)]
             = P(A′) · P(B′).

For three events A, B, and C in a sample space S that are independent, we have

P(A ∩ B) = P(A) · P(B)
P(B ∩ C) = P(B) · P(C)
P(A ∩ C) = P(A) · P(C)
P(A ∩ B ∩ C) = P(A) · P(B) · P(C)
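A quick independence check in Python; the events ("red die is even", "green die is even") are chosen here for illustration only:

```python
from fractions import Fraction
from itertools import product

# A = "red die is even", B = "green die is even" on the two-dice space.
S = set(product(range(1, 7), repeat=2))
A = {(r, g) for (r, g) in S if r % 2 == 0}
B = {(r, g) for (r, g) in S if g % 2 == 0}

P = lambda E: Fraction(len(E), len(S))
assert P(A & B) == P(A) * P(B)                # A and B are independent
A_c, B_c = S - A, S - B                       # complements A′ and B′
assert P(A_c & B_c) == P(A_c) * P(B_c)        # Theorem 2.11: so are A′ and B′
print(P(A & B))                               # 1/4
```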
Theorem 2.12 (Rule of Total Probability): If the events A₁, A₂, …, Aₙ constitute a partition of the sample
space S and P(Aᵢ) ≠ 0 for i = 1, 2, …, n, then for any event B in S,

P(B) = Σᵢ₌₁ⁿ P(B|Aᵢ) · P(Aᵢ)

(Figure: S split into the partition A₁, A₂, …, Aₙ; the orange region is the event B.)

Note that S = A₁ ∪ A₂ ∪ ⋯ ∪ Aₙ.

Proof: Note that


B = (B ∩ A₁) ∪ (B ∩ A₂) ∪ (B ∩ A₃) ∪ ⋯ ∪ (B ∩ Aₙ)

and then, since the events B ∩ Aᵢ are disjoint,

P(B) = P[(B ∩ A₁) ∪ (B ∩ A₂) ∪ (B ∩ A₃) ∪ ⋯ ∪ (B ∩ Aₙ)]
     = P(B ∩ A₁) + P(B ∩ A₂) + ⋯ + P(B ∩ Aₙ)
     = P(B|A₁) · P(A₁) + P(B|A₂) · P(A₂) + ⋯ + P(B|Aₙ) · P(Aₙ)
     = Σᵢ₌₁ⁿ P(B|Aᵢ) · P(Aᵢ)

and this completes the proof.
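A worked sketch of Theorem 2.12 in Python; the partition and all of the numbers below are assumptions made up for the illustration (items produced by three hypothetical machines, B = "the item is defective"):

```python
from fractions import Fraction

# Assumed partition A1, A2, A3 (machine that made the item) and event B (defective).
P_A = {1: Fraction(1, 2), 2: Fraction(3, 10), 3: Fraction(1, 5)}
P_B_given_A = {1: Fraction(1, 100), 2: Fraction(2, 100), 3: Fraction(3, 100)}

assert sum(P_A.values()) == 1                     # the A_i form a partition of S
P_B = sum(P_B_given_A[i] * P_A[i] for i in P_A)   # rule of total probability
print(P_B)                                        # 17/1000
```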


Theorem 2.13 (Bayes' Theorem): If A₁, A₂, A₃, …, Aₙ constitute a partition of the sample space S and
P(Aᵢ) ≠ 0 for i = 1, 2, 3, …, n, then for any event B in S with P(B) ≠ 0,

P(Aᵢ|B) = [P(B|Aᵢ) · P(Aᵢ)] / [Σⱼ₌₁ⁿ P(B|Aⱼ) · P(Aⱼ)],  for i = 1, 2, …, n.

Proof: By the definition of conditional probability,

P(Aᵢ|B) = P(Aᵢ ∩ B) / P(B).

By Theorem 2.9,

P(Aᵢ ∩ B) = P(B ∩ Aᵢ) = P(B|Aᵢ) · P(Aᵢ),

and, since S = A₁ ∪ A₂ ∪ ⋯ ∪ Aₙ, the rule of total probability (Theorem 2.12) gives

P(B) = Σⱼ₌₁ⁿ P(B|Aⱼ) · P(Aⱼ).

Therefore

P(Aᵢ|B) = [P(B|Aᵢ) · P(Aᵢ)] / [Σⱼ₌₁ⁿ P(B|Aⱼ) · P(Aⱼ)],

and this completes the proof.
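Continuing the same made-up machine example, Bayes' theorem tells us which part of the partition most likely produced a defective item; the numbers remain illustrative assumptions:

```python
from fractions import Fraction

# Same assumed partition and conditional probabilities as before.
P_A = {1: Fraction(1, 2), 2: Fraction(3, 10), 3: Fraction(1, 5)}
P_B_given_A = {1: Fraction(1, 100), 2: Fraction(2, 100), 3: Fraction(3, 100)}

P_B = sum(P_B_given_A[i] * P_A[i] for i in P_A)              # Theorem 2.12
posterior = {i: P_B_given_A[i] * P_A[i] / P_B for i in P_A}  # Theorem 2.13
print(posterior)   # {1: Fraction(5, 17), 2: Fraction(6, 17), 3: Fraction(6, 17)}
```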
References:
- Miller, I. and Miller, M., John E. Freund's Mathematical Statistics with Applications, 8th edition.
- Devore, J. L., Probability and Statistics for Engineering and the Sciences, 6th edition.
