
Probability and Statistical Analysis
Dr. Fadhil Sahib Al-Moussawi

Chapter Five
Probability and Statistical Analysis

Part 1- Probability Analysis


I- Set Theory

Definitions:
The objects constituting a set are called the elements of the set. Let A be a set
and x be an element of the set A. To describe this statement, we write x ∈ A; otherwise,
we write 𝑥 ∉ 𝐴. If the set A is empty (i.e., it has no elements), we denote it by ∅.
If x1, x2, ..., xN are all elements of the set A, we write A={ x1, x2, ..., xN }
If every element of the set A is also an element of another set B, we say that A is a
subset of B, which we describe by writing 𝐴 ⊂ 𝐵 .
If two sets A and B satisfy the conditions 𝐴 ⊂ 𝐵 and 𝐵 ⊂ 𝐴, then the two sets are
said to be identical or equal, in which case we write A = B.

Boolean Operations on Sets


1-Unions and Intersections
The union of two sets A and B (A∪B) is defined by the
set of elements that belong to A or B, or to both.
𝑨 ∪ 𝑩 = {𝒙|𝒙 ∈ 𝑨 𝑶𝑹 𝒙 ∈ 𝑩}
The intersection of two sets A and B (A∩B) is defined
by the particular set of elements that belong to both A and B.
𝑨 ∩ 𝑩 = {𝒙|𝒙 ∈ 𝑨 𝑨𝑵𝑫 𝒙 ∈ 𝑩}
2-Disjoint and Partition Sets
Two sets A and B are said to be disjoint if their
intersection is empty; that is, they have no common
elements.
The partition of a set A refers to a collection of
disjoint subsets A1, A2, ..., AN of the set A, the union of
which equals A; that is,

A = A1 ∪ A2 ∪ A3 ∪ ... ∪ AN


3-Complements
The set Ac is said to be the complement of the set A, with
respect to the universal set S, if it is made up of all the
elements of S that do not belong to A.
Ac = {x | x ∈ S and x ∉ A}

The Algebra of Sets

1- Involution (double complement) property: (Ac)c = A


2- Commutative property: A∪B=B∪A
A∩B=B∩A
3- Associative property: A∪(B∪C)=(A∪B) ∪ 𝑪
A∩(B∩C)=(A∩B) ∩ 𝑪
4- Distributive property: A∩(B∪C)=(A∩B) ∪ (𝑨 ∩ 𝑪)
A∪(B∩C)=(A∪B) ∩ (𝑨 ∪ 𝑪)
5- De Morgan’s laws: (𝑨 ∪ 𝑩)𝒄 = 𝑨𝒄 ∩ 𝑩𝒄
(𝑨 ∩ 𝑩)𝒄 = 𝑨𝒄 ∪ 𝑩𝒄

Example 1: Referring to the experiment of tossing a coin twice, let A be the event
“at least one head occurs” and B the event “the second toss results in a tail.” Then
A ={HT, TH, HH}, B = {HT, TT }
𝐴 ∪ 𝐵 = {𝐻𝐻, 𝑇𝐻, 𝐻𝑇, 𝑇𝑇} = 𝑆
𝐴 ∩ 𝐵 = {𝐻𝑇}
𝐴𝑐 = {𝑇𝑇}
𝐴 − 𝐵 = 𝐴 ∩ 𝐵𝑐 = {𝑇𝐻, 𝐻𝐻}
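These set operations map directly onto Python's built-in set type; a minimal sketch using the sample space of Example 1 (the variable names are only illustrative):

```python
# Sketch: verifying the set operations of Example 1 with Python sets.
S = {"HH", "HT", "TH", "TT"}          # sample space of two coin tosses
A = {"HT", "TH", "HH"}                # at least one head
B = {"HT", "TT"}                      # second toss is a tail

print(A | B)            # union A ∪ B -> {'HH', 'HT', 'TH', 'TT'} = S
print(A & B)            # intersection A ∩ B -> {'HT'}
print(S - A)            # complement A^c with respect to S -> {'TT'}
print(A - B)            # difference A - B = A ∩ B^c -> {'TH', 'HH'}

# De Morgan's law: (A ∪ B)^c = A^c ∩ B^c
assert S - (A | B) == (S - A) & (S - B)
```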

II- Probability Theory:


Definition: Let E be a random experiment such that its sample space S contains a
finite number n(S) of sample points, all of which are equally likely. Let A be any
event, i.e. A ⊂ S, with n(A) sample points. Then the probability of A, denoted by
P(A), is defined by:

P(A) = n(A) / n(S)
Example 2: In the previous example, P(A) = 3/4 and P(B) = 2/4 = 1/2.
Axioms of Probability
1- Nonnegativity: states that the probability of event A is a nonnegative
number bounded by unity, 𝟎 ≤ 𝐏(𝐀) ≤ 𝟏


2- Additivity: if A and B are two disjoint events, then the probability of
their union satisfies P(A ∪ B) = P(A) + P(B). In general, for disjoint events
A1, A2, ..., AN:
P(A1 ∪ A2 ∪ ... ∪ AN) = P(A1) + P(A2) + ... + P(AN)
3- Normalization: states that the probability of the entire sample space S
is equal to unity, P(S) =1
Properties of probability:
1- The probability of an impossible event is zero. P(∅)=0
2- Complement property: Let Ac denote the complement of
event A. Then
𝐏(𝐀𝐜 ) = 𝟏 − 𝐏(𝐀)
3- Subtractive property: If event A is a subset of another event B (A ⊂ B),
then P(A) ≤ P(B) and
P(B - A) = P(B) - P(A)
4- Let N disjoint events A1, A2, ..., AN satisfy the condition
S = A1 ∪ A2 ∪ ... ∪ AN; then ∑_{i=1}^{N} P(Ai) = 1

For the special case of N equally probable events:

P(Ai) = 1/N     for i = 1, 2, ..., N
5- Addition Property: If two events A and B are not disjoint,
then the probability of their union is:
P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
where P(A ∩ B) is the joint probability of A and B.
If A and B are mutually Exclusive (Disjoint) then:
𝑷(𝑨 ∪ 𝑩) = 𝑷(𝑨) + 𝑷(𝑩)
and
𝑷(𝑨 ∩ 𝑩) = 𝟎

Example 3: A single die is tossed once. Find the probability of a 2 or 5 turning up.
Solution: S={1,2,3,4,5,6}
P(1) = P(2) = ... = P(6) = 1/6
The event that either 2 or 5 turns up is indicated by 2 ∪ 5. Therefore
P(2 ∪ 5) = P(2) + P(5) = 1/6 + 1/6 = 1/3


Conditional Probability:
Suppose we perform an experiment that involves a pair of events A and B. Let
P(A|B) denote the probability of event A given that event B has occurred which is
called the conditional probability of A given B. Assuming that B has nonzero
probability, the conditional probability P(A|B) is defined by
P(A/B) = P(A ∩ B) / P(B)        and        P(B/A) = P(A ∩ B) / P(A)

where 𝑃(𝐴 ∩ 𝐵) is the joint probability of event A and B also it is written as P(A,B).
Statistical Independence: Events A and B are statistically independent when the
occurrence of A does not depend on the occurrence of B. Then
P(A,B) = P(A/B)P(B) = P(A)P(B)
Example 4: Suppose two dice are thrown and that the dice are distinguishable. An
outcome of this experiment is denoted by (m,n), where m and n are the faces of the
dice. Let A and B be: A = {m+n = 11} and B = {n ≠ 5}. Then find P(A), P(B) and
P(A,B).
Solution: P(A) = P[(5,6) ∪ (6,5)] = P[(5,6)] + P[(6,5)] = P(5)P(6) + P(6)P(5)
        = 1/36 + 1/36 = 1/18
P(B) = 1 - P(n = 5) = 1 - 1/6 = 5/6
P(A,B) = P(A ∩ B) = P[(5,6)] = P(5)P(6) = 1/36
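The enumeration in Example 4 can be verified by listing all 36 equally likely outcomes; a minimal sketch using exact fractions:

```python
# Sketch: enumerating the 36 equally likely outcomes of two distinguishable dice
# to verify P(A), P(B) and P(A ∩ B) from Example 4.
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))      # all pairs (m, n)
total = len(outcomes)                                 # 36

A = [(m, n) for (m, n) in outcomes if m + n == 11]    # event m + n = 11
B = [(m, n) for (m, n) in outcomes if n != 5]         # event n ≠ 5
AB = [o for o in A if o in B]                         # intersection A ∩ B

print(Fraction(len(A), total))    # 1/18
print(Fraction(len(B), total))    # 5/6
print(Fraction(len(AB), total))   # 1/36
```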
Example 5: One bag contains 4 white balls and 3 black balls, and a second bag contains 3
white balls and 5 black balls. One ball is drawn from the first bag and placed unseen
in the second bag. What is the probability that a ball now drawn from the second bag is
black?
Solution:
Let B1, B2 and W1 represent the drawing of black ball from bag 1 , a black
ball from bag 2 and a white ball from bag 1 respectively.
P[(B1 ∩ B2) or (W1 ∩ B2)] = P(B1 ∩ B2) + P(W1 ∩ B2) = P(B1)P(B2/B1) + P(W1)P(B2/W1)
                          = (3/7)(6/9) + (4/7)(5/9) = 38/63
(Tree diagram: from bag 1, P(B1) = 3/7 and P(W1) = 4/7. If a black ball is moved, bag 2
holds 3 white and 6 black, so P(B2/B1) = 6/9 and P(W2/B1) = 3/9, giving
P(B1 ∩ B2) = (3/7)(6/9) and P(B1 ∩ W2) = (3/7)(3/9). If a white ball is moved, bag 2
holds 4 white and 5 black, so P(B2/W1) = 5/9 and P(W2/W1) = 4/9, giving
P(W1 ∩ B2) = (4/7)(5/9) and P(W1 ∩ W2) = (4/7)(4/9).)
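Example 5 is an application of the law of total probability, P(B2) = P(B1)P(B2/B1) + P(W1)P(B2/W1); a short sketch reproducing the 38/63 result with exact fractions:

```python
# Sketch: law of total probability for the two-bag experiment of Example 5.
from fractions import Fraction as F

P_B1, P_W1 = F(3, 7), F(4, 7)          # colour of the ball moved from bag 1
P_B2_given_B1 = F(6, 9)                # bag 2 then holds 3 white, 6 black
P_B2_given_W1 = F(5, 9)                # bag 2 then holds 4 white, 5 black

P_B2 = P_B1 * P_B2_given_B1 + P_W1 * P_B2_given_W1
print(P_B2)                            # 38/63
```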

Example 6: Suppose the sample space S is the population of adults in a small town
who have completed the requirements for a college degree. We shall categorize them
according to gender and employment status. The data are given in the table below. Find
P(Male/Employed) and P(Female/Unemployed).

Employed Unemployed Total


Male 460 40 500
Female 140 260 400
Total 600 300 900
Solution:
P(M/E) = P(E ∩ M)/P(E) = (460/900)/(600/900) = 23/30
P(F/U) = P(U ∩ F)/P(U) = (260/900)/(300/900) = 26/30

Permutation and Combination


A permutation is an arrangement of all or part of a set of objects. For
example, consider the three letters a, b, and c. The possible permutations are abc, acb,
bac, bca, cab, and cba. Thus, there are 6 distinct arrangements.

The number of permutations of n objects is n!

The number of permutations of n distinct objects taken k at a time is:

nPk = n!/(n - k)! = n(n - 1)(n - 2) ... (n - k + 1)

For example, the number of permutations that are possible by taking two letters
at a time from the 3 letters in the previous example is:
3P2 = 3!/(3 - 2)! = 6     {ab, ac, ba, bc, ca, cb}

A combination is a selection of all or part of a set of objects without regard
to order. The number of combinations of n distinct objects taken k at a time is:

nCk = (n choose k) = n!/(k!(n - k)!) = nPk / k!

For the previous example, the number of combinations of the 3 letters taken 2 at a time is
3C2 = 3P2 / 2! = 3     {ab, ac, bc}


Example 7: A box contains 8 red, 3 white, and 9 blue balls. If 3 balls are drawn at
random without replacement, determine the probability that (a) all 3 are red, (b)
all 3 are white, (c) 2 are red and 1 is white, (d) at least 1 is white, (e) one of each
color is drawn, (f) The balls are drawn in the order red, white, blue.

Solution:
a) P(3 red) = C(8,3)/C(20,3) = 14/285,   or   P(3 red) = (8/20)(7/19)(6/18)
b) P(3 white) = C(3,3)/C(20,3) = 1/1140,   or   P(3 white) = (3/20)(2/19)(1/18)
c) P(2 red and 1 white) = C(8,2)C(3,1)/C(20,3) = 7/95,   or   3·(8/20)(7/19)(3/18)
d) P(at least 1 white) = 1 - P(no white) = 1 - C(17,3)/C(20,3) = 1 - 34/57 = 23/57
e) P(one of each color) = C(8,1)C(3,1)C(9,1)/C(20,3) = 18/95,   or   6·(8/20)(3/19)(9/18)
f) P(red, white, blue in that order) = P(one of each color)/3! = 3/95,   or   (8/20)(3/19)(9/18)
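The counting arguments of Example 7 translate directly into ratios of binomial coefficients; a minimal sketch checking each part with math.comb and exact fractions:

```python
# Sketch: checking Example 7 (8 red, 3 white, 9 blue; draw 3 without replacement).
from fractions import Fraction as F
from math import comb

total = comb(20, 3)                                   # all ways to choose 3 of 20

print(F(comb(8, 3), total))                           # a) all red          -> 14/285
print(F(comb(3, 3), total))                           # b) all white        -> 1/1140
print(F(comb(8, 2) * comb(3, 1), total))              # c) 2 red, 1 white   -> 7/95
print(1 - F(comb(17, 3), total))                      # d) at least 1 white -> 23/57
one_of_each = F(comb(8, 1) * comb(3, 1) * comb(9, 1), total)
print(one_of_each)                                    # e) one of each      -> 18/95
print(one_of_each / 6)                                # f) red, white, blue in order -> 3/95
```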

III- Random Variables

A random variable is a function that associates a real number with each


element in the sample space. For example, if we flip a coin, the possible outcomes
are head (H) and tail (T), so the sample space S contains two points labeled H and T.
Suppose we define a function X(s) such that

X(s) = 1 if s = H,     X(s) = -1 if s = T

This maps the two possible outcomes into two points (±1) on the real line. X is
called a random variable (stochastic variable).

Distribution Functions for Random Variables


The cumulative distribution function (CDF), or briefly the distribution
function, for a random variable X is defined by:
𝑭(𝒙) = 𝑷(𝑿 ≤ 𝒙)
where x is a real number. The properties of F(x) are:
1- F(x) is nondecreasing: F(x1) ≤ F(x2) for x1 ≤ x2.
2- F(x) is bounded between 0 and 1: lim F(x) = 0 as x → −∞ and lim F(x) = 1 as x → ∞.
3- F(x) is continuous from the right, and the density is p(x) = dF(x)/dx.


1-Distribution Functions for Discrete Random Variables

The distribution function for a discrete random variable X can be obtained


from its probability density function (p(x)) by, for all x in (-∞, ∞),

F(x)=𝐏(𝐗 ≤ 𝐱) = ∑𝐮≤𝐱 𝐩(𝐮)


If X takes on only a finite number of values x1, x2, . . . , xn, then the distribution
function is given by:
F(x) = 0                              for −∞ < x < x1
F(x) = p(x1)                          for x1 ≤ x < x2
F(x) = p(x1) + p(x2)                  for x2 ≤ x < x3
...
F(x) = p(x1) + ... + p(xn)            for xn ≤ x < ∞

where the function p(x) has the properties:


1- p(x)≥ 0
2- ∑ p(xi) = 1, where the sum is over all values xi

Example 8: Suppose that a coin is tossed twice so that the sample space is S= {HH,
HT, TH, TT}. Let X represent the number of heads that can come up. With each
sample point we can associate a number for X as shown in Table. Thus, for example,
in the case of HH (i.e., 2 heads), X =2 while for TH (1 head), X = 1. It follows that
X is a random variable.

Sample Point HH HT TH TT
X 2 1 1 0
To find the probability density function p(x):
P(HH) = P(HT) = P(TH) = P(TT) = 1/4 (equiprobable)
P(X=0)=P(TT)=1/4
P(X=1)=P(HT)+P(TH)=1/2
P(X=2)=P(HH)=1/4

X 0 1 2
p(x) 1/4 1/2 1/4

To find the distribution function F(x):


F(x) = 0       for −∞ < x < 0
F(x) = 0.25    for 0 ≤ x < 1
F(x) = 0.75    for 1 ≤ x < 2
F(x) = 1       for 2 ≤ x < ∞
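The step-function CDF of Example 8 follows mechanically from the probability function; a minimal sketch (the names pmf and cdf are only illustrative):

```python
# Sketch: pmf and step-function CDF for the number of heads in two coin tosses (Example 8).
pmf = {0: 0.25, 1: 0.50, 2: 0.25}

def cdf(x, pmf=pmf):
    """F(x) = P(X <= x) for a discrete random variable."""
    return sum(p for value, p in pmf.items() if value <= x)

for x in (-1, 0, 0.5, 1, 2, 3):
    print(x, cdf(x))        # 0, 0.25, 0.25, 0.75, 1.0, 1.0
```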

2- Distribution Function for Continuous Random Variables


F(x) = P(X ≤ x) = ∫_{−∞}^{x} p(u) du

p(x) = dF(x)/dx

where the function p(x) has the properties:
1- p(x) ≥ 0
2- ∫_{−∞}^{∞} p(x) dx = 1
Then the probability that X lies between two different values, say, a and b, is given
by:
P(a ≤ X ≤ b) = ∫_{a}^{b} p(x) dx = ∫_{−∞}^{b} p(x) dx − ∫_{−∞}^{a} p(x) dx = F(b) − F(a)

Example 9: A continuous random variable X has pdf :


p(x) = kx²  for 0 < x < 3,  and p(x) = 0 otherwise.
Find: a) the constant k.  b) the CDF F(x).  c) P(1 < x < 2).

Solution: a) ∫_{−∞}^{∞} p(x) dx = 1  →  ∫_{0}^{3} kx² dx = 1


k [x³/3 from 0 to 3] = 1  →  k(9) = 1  →  k = 1/9

b) if x < 0,   F(x) = 0
   if 0 ≤ x < 3,   F(x) = ∫_{−∞}^{x} p(u) du = ∫_{0}^{x} (1/9)u² du = x³/27
   if x ≥ 3,   F(x) = 1

c) P(1 < x < 2) = F(2) − F(1) = 8/27 − 1/27 = 7/27
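The same normalization and probability can be checked numerically; a minimal sketch using a simple midpoint-rule quadrature (no external libraries assumed):

```python
# Sketch: numeric check of Example 9 for p(x) = k·x² on (0, 3).
def integrate(f, a, b, n=100_000):
    """Simple midpoint-rule quadrature; accurate enough for a sanity check."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

k = 1 / integrate(lambda x: x**2, 0, 3)          # normalization -> ≈ 1/9

def pdf(x):
    return k * x**2 if 0 < x < 3 else 0.0

print(k)                                         # ≈ 0.1111
print(integrate(pdf, 1, 2))                      # P(1 < X < 2) ≈ 7/27 ≈ 0.2593
print(integrate(pdf, 0, 3))                      # total probability ≈ 1
```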

HW1: Repeat the above example for p(x) = kx for 0 < x < 1, and p(x) = 0 otherwise.

Example 10: The distribution function for a random variable X is
F(x) = 1 − e^{−2x} for x ≥ 0, and F(x) = 0 for x < 0.
Find a) the density function, b) the probability that X > 2, and
c) the probability that −3 < x ≤ 4.

Solution:
a) p(x) = dF(x)/dx = 2e^{−2x} for x > 0, and p(x) = 0 for x < 0.
b) P(X > 2) = ∫_{2}^{∞} 2e^{−2u} du = e^{−4}
c) P(−3 < x ≤ 4) = ∫_{−3}^{4} p(u) du = ∫_{0}^{4} 2e^{−2u} du = F(4) − F(0) = 1 − e^{−8}

Joint Distribution
1- Discrete case:
If one experiment has the possible outcomes xi, i=1,2,…,n and the second
experiment has the possible outcomes yj, j=1,2,…,m then the combined experiment
has the possible joint outcomes (xi, yj). Associated with each joint outcome (xi, yj) is
the joint probability P(xi, yj), which satisfies the condition 0 ≤ P(xi, yj) ≤ 1.

P(X,Y) is the n×m array of joint probabilities, with rows indexed by x1, ..., xn and
columns indexed by y1, ..., ym:

          y1          y2         ...    ym
   x1     p(x1,y1)    p(x1,y2)   ...    p(x1,ym)
   x2     p(x2,y1)    p(x2,y2)   ...    p(x2,ym)
   ...
   xn     p(xn,y1)    p(xn,y2)   ...    p(xn,ym)

p(xi) = ∑_{j=1}^{m} p(xi, yj)     i = 1, ..., n

p(yj) = ∑_{i=1}^{n} p(xi, yj)     j = 1, ..., m


∑_{i=1}^{n} ∑_{j=1}^{m} p(xi, yj) = 1

∑_{i=1}^{n} p(xi) = 1     and     ∑_{j=1}^{m} p(yj) = 1
The joint distribution function: F(x,y)=𝑷(𝑿 ≤ 𝒙, 𝒀 ≤ 𝒚) = ∑𝒖≤𝒙 ∑𝒗≤𝒚 𝒑(𝒖, 𝒗)

Example 11: The joint probability function for the random variables X and Y is
given in Table below. Find a) the marginal probability function of X and Y. b)
P(Y/X) and P(X/Y). c) P(𝑋 ≥ 2, 𝑌 ≤ 2)
           y=1     y=2     y=3
   x=1     0.05    0.05    0.10
   x=2     0.05    0.10    0.35
   x=3     0.00    0.20    0.10
Solution:
a) p(xi) = ∑_{j} p(xi, yj)  →  P(X) = (0.2, 0.5, 0.3) for x = 1, 2, 3
   p(yj) = ∑_{i} p(xi, yj)  →  P(Y) = (0.1, 0.35, 0.55) for y = 1, 2, 3

b) P(Y/X) = P(X,Y)/P(X): divide each row of the joint table by the corresponding p(xi):

           y=1          y=2          y=3
   x=1     0.05/0.2     0.05/0.2     0.1/0.2
   x=2     0.05/0.5     0.1/0.5      0.35/0.5
   x=3     0/0.3        0.2/0.3      0.1/0.3

   P(X/Y) = P(X,Y)/P(Y): divide each column of the joint table by the corresponding p(yj):

           y=1          y=2          y=3
   x=1     0.05/0.1     0.05/0.35    0.1/0.55
   x=2     0.05/0.1     0.1/0.35     0.35/0.55
   x=3     0/0.1        0.2/0.35     0.1/0.55

c) P(X ≥ 2, Y ≤ 2) = ∑_{i=2}^{3} ∑_{j=1}^{2} p(xi, yj) = 0.05 + 0.1 + 0.2 + 0 = 0.35
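The row and column operations used in Example 11 are straightforward to code; a minimal sketch with plain nested lists:

```python
# Sketch: marginals and conditionals for the joint table of Example 11.
# Rows are x = 1, 2, 3 and columns are y = 1, 2, 3.
P = [[0.05, 0.05, 0.10],
     [0.05, 0.10, 0.35],
     [0.00, 0.20, 0.10]]

p_x = [sum(row) for row in P]                         # marginal of X: [0.2, 0.5, 0.3]
p_y = [sum(col) for col in zip(*P)]                   # marginal of Y: [0.1, 0.35, 0.55]

P_y_given_x = [[P[i][j] / p_x[i] for j in range(3)] for i in range(3)]   # rows / p(xi)
P_x_given_y = [[P[i][j] / p_y[j] for j in range(3)] for i in range(3)]   # columns / p(yj)

# P(X >= 2, Y <= 2): rows for x = 2, 3 and columns for y = 1, 2
print(sum(P[i][j] for i in (1, 2) for j in (0, 1)))   # 0.35
print(p_x, p_y)
```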

HW2: The joint probability density function of two discrete random variables X and
Y is given by p(x, y)=c(2x+y), where x and y can assume all integers such that 0 ≤
x ≤ 2, 0 ≤ y ≤ 3, and p(x, y) = 0 otherwise. a) Find the value of the constant c.
b) Find P(X ≥ 1, Y ≤ 2). c) Find P(X = 2, Y = 1).


2- Continuous Case.

The joint probability function for the random variables X and Y (or, as it is more
commonly called, the joint density function of X and Y ) is defined by
1- P(x,y)≥ 0
∞ ∞
2- ∫−∞ ∫−∞ p(x, y)dx dy = 1

The probability that X lies between a and b while Y lies between c and d is:
P(a < X < b, c < Y < d) = ∫_{x=a}^{b} ∫_{y=c}^{d} p(x, y) dy dx

The joint distribution function of X and Y in this case is defined by:


F(x, y) = P(X ≤ x, Y ≤ y) = ∫_{u=−∞}^{x} ∫_{v=−∞}^{y} p(u, v) du dv

p(x, y) = ∂²F(x, y) / ∂x∂y

P(X ≤ x) = F1(x) = ∫_{u=−∞}^{x} ∫_{v=−∞}^{∞} p(u, v) du dv

P(Y ≤ y) = F2(y) = ∫_{u=−∞}^{∞} ∫_{v=−∞}^{y} p(u, v) du dv

p(x) = ∫_{v=−∞}^{∞} p(x, v) dv     and     p(y) = ∫_{u=−∞}^{∞} p(u, y) du

Independent Random Variables:


If the events X =x and Y =y are independent events for all x and y, then we say
that X and Y are independent random variables. In such case,

P(X=x,Y=y)=P(X=x)P(Y=y)
p(x,y)=p(x)p(y)
F(x, y) =F1(x)F2(y)
Example 12: The joint density function of two continuous random variables X and
Y is:
p(x, y) = cxy  for 0 < x < 4, 1 < y < 5,  and p(x, y) = 0 otherwise.
Find (i) the constant c
(ii)P(1<x<2,2<y<3) iii) P(X ≥ 3, 𝑌 ≤ 2)
iv) The marginal distribution function F1(x), F2(y)


Solution:
i) ∫_{−∞}^{∞} ∫_{−∞}^{∞} p(x, y) dx dy = 1  →  ∫_{0}^{4} ∫_{1}^{5} cxy dy dx = 1
   c [x²/2 from 0 to 4][y²/2 from 1 to 5] = 1  →  c (16/2)(24/2) = 1  →  c = 1/96

ii) P(1 < x < 2, 2 < y < 3) = (1/96) ∫_{1}^{2} ∫_{2}^{3} xy dy dx
    = (1/96)[x²/2 from 1 to 2][y²/2 from 2 to 3] = (1/96)(3/2)(5/2) = 5/128

iii) HW

iv) F1(x) = P(X ≤ x) = ∫_{u=0}^{x} ∫_{v=1}^{5} (uv/96) dv du = x²/16        for 0 < x < 4
    F2(y) = P(Y ≤ y) = ∫_{u=0}^{4} ∫_{v=1}^{y} (uv/96) dv du = (y² − 1)/24   for 1 < y < 5

IV- Mathematical Expectation


A- Discrete Random Variable
1. Expectation (mean) of X:
   E(X) = X̄ = μx = ∑_{i=1}^{n} xi p(xi)
2. The mean square value of X:
   E(X²) = ∑_{i=1}^{n} xi² p(xi)
3. The variance and standard deviation of X:
   Var(X) = σx² = E[(X − μx)²] = ∑_{i=1}^{n} (xi − μx)² p(xi) = E(X²) − (μx)²
   The standard deviation is σx = √(σx²).
4. The covariance of X and Y, which have the joint pdf P(X,Y), is:
   σXY = E[(X − μx)(Y − μy)] = ∑_{i} ∑_{j} (xi − μx)(yj − μy) p(xi, yj) = E(XY) − μx μy
   where E(XY) = ∑_{i} ∑_{j} xi yj p(xi, yj)


Example 13: Let the random variable X represent the number of automobiles that
are used for different business purposes on any given workday.

   xi       1     2     3
   p(xi)    0.3   0.4   0.3

Find: a) μx   b) E(X²)   c) σx²

Solution:
a) μx = ∑_{i=1}^{3} xi p(xi) = (1)(0.3) + (2)(0.4) + (3)(0.3) = 2
b) E(X²) = ∑_{i=1}^{3} xi² p(xi) = (1)(0.3) + (4)(0.4) + (9)(0.3) = 4.6
c) σx² = E(X²) − μx² = 4.6 − 2² = 0.6
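The moments of Example 13 are weighted sums over the pmf; a minimal sketch:

```python
# Sketch: mean, mean square and variance for the pmf of Example 13.
xs = [1, 2, 3]
ps = [0.3, 0.4, 0.3]

mean    = sum(x * p for x, p in zip(xs, ps))          # E(X)   = 2.0
mean_sq = sum(x * x * p for x, p in zip(xs, ps))      # E(X^2) = 4.6
var     = mean_sq - mean**2                           # Var(X) = 0.6

print(mean, mean_sq, var)
```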

Example 14: The joint probability function for the random variables X and Y is

           y=0      y=1      y=2
   x=0     3/28     9/28     3/28
   x=1     3/14     3/14     0
   x=2     1/28     0        0

Find μx, μy, E(XY), σXY.

Solution:
The marginal density functions P(X) and P(Y) are
   P(X) = [15/28, 6/14, 1/28]     P(Y) = [10/28, 15/28, 3/28]

μx = ∑_{i} xi p(xi) = (0)(15/28) + (1)(6/14) + (2)(1/28) = 1/2
μy = ∑_{j} yj p(yj) = (0)(10/28) + (1)(15/28) + (2)(3/28) = 3/4

E(XY) = ∑_{i} ∑_{j} xi yj p(xi, yj)
      = (0)(0)(3/28) + (0)(1)(9/28) + (0)(2)(3/28) + (1)(0)(3/14) + (1)(1)(3/14)
        + (1)(2)(0) + (2)(0)(1/28) + (2)(1)(0) + (2)(2)(0) = 3/14

σXY = E(XY) − μx μy = 3/14 − 3/8 = −9/56

B- Continuous Random Variable


1- Expectation (mean) of X:
   E(X) = X̄ = μx = ∫_{−∞}^{∞} x p(x) dx
2- The mean square value of X:
   E(X²) = ∫_{−∞}^{∞} x² p(x) dx
3- The variance and standard deviation of X:
   Var(X) = σx² = E[(X − μx)²] = ∫_{−∞}^{∞} (x − μx)² p(x) dx = E(X²) − (μx)²
   The standard deviation is σx = √(σx²).
4- The covariance of X and Y, which have the joint pdf p(x,y), is:
   σXY = E[(X − μx)(Y − μy)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − μx)(y − μy) p(x, y) dx dy = E(XY) − μx μy
   where E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy p(x, y) dx dy
Example 15: X is a continuous random variable whose pdf is the trapezoid shown in the
figure: p(x) rises linearly from 0 at x = −2 to a height k at x = −1, equals k for
−1 ≤ x ≤ 1, and falls linearly back to 0 at x = 2.
Find a) the constant k,  b) P(X > 1),  c) X̄, E(X²), σx² and the standard deviation.

Solution:
a) ∫_{−∞}^{∞} p(x) dx = 1
   Area = 1 = (1/2)(1)(k) + (2)(k) + (1/2)(1)(k)  →  k = 1/3

b) P(X > 1) = ∫_{1}^{∞} p(x) dx = ∫_{1}^{2} (1/3)(2 − x) dx = 1/6

c) X̄ = ∫_{−∞}^{∞} x p(x) dx = ∫_{−2}^{−1} (x/3)(2 + x) dx + ∫_{−1}^{1} (x/3) dx + ∫_{1}^{2} (x/3)(2 − x) dx = 0

   E(X²) = ∫_{−∞}^{∞} x² p(x) dx = 2 ∫_{0}^{∞} x² p(x) dx = 2 [∫_{0}^{1} (x²/3) dx + ∫_{1}^{2} (x²/3)(2 − x) dx] = 5/6

   σ² = E(X²) − (X̄)² = 5/6     and     σ = √(5/6)

Example 16: Let X and Y be random variables having joint density function
p(x, y) = x + y  for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1,  and p(x, y) = 0 otherwise.
Find X̄, Ȳ, σx², σy², σXY.

Solution:
p(x) = ∫_{y=0}^{1} (x + y) dy = x[y from 0 to 1] + [y²/2 from 0 to 1] = x + 0.5
p(y) = ∫_{x=0}^{1} (x + y) dx = y[x from 0 to 1] + [x²/2 from 0 to 1] = y + 0.5

μx = X̄ = ∫_{0}^{1} x(x + 0.5) dx = 7/12     and     μy = Ȳ = ∫_{0}^{1} y(y + 0.5) dy = 7/12

E(X²) = ∫_{0}^{1} x²(x + 0.5) dx = 10/24     and     E(Y²) = ∫_{0}^{1} y²(y + 0.5) dy = 10/24

σx² = E(X²) − (X̄)² = 10/24 − (7/12)² = 0.0764 = σy²

E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy p(x, y) dx dy = ∫_{0}^{1} ∫_{0}^{1} xy(x + y) dx dy = 1/3

σXY = E(XY) − μx μy = 1/3 − (7/12)(7/12) = −0.0069
HW: Repeat Example 16 for p(x, y) = e^{−(x+y)} for x ≥ 0, y ≥ 0, and p(x, y) = 0 otherwise.

V-Special Probability Distributions


A- Discrete Probability Distribution

1)-Discrete Uniform Distribution


The N possible values x1, x2, ..., xN are equally probable, with
p(xi) = 1/N     for i = 1, 2, ..., N

Example 17: Fair Die


p(xi) = 1/6     for i = 1, 2, ..., 6

μ = ∑_{i=1}^{6} xi p(xi) = (1/6) ∑_{i=1}^{6} xi = 3.5

σ² = ∑_{i=1}^{6} (xi − μ)² p(xi) = (1/6) ∑_{i=1}^{6} (xi − 3.5)² = 2.92

2)-Binomial Distribution
A Bernoulli trial can result in a success with probability p and a failure with
probability q = 1 - p. The probability distribution of the binomial random variable X,
the number of successes in N independent trials, is:

b(x; N, p) = C(N, x) p^x q^(N−x)     x = 0, 1, 2, ..., N

where C(N, x) = N! / (x!(N − x)!)


The binomial distribution derives its name from the fact that the N+1 terms in the
binomial expansion of (p+q)^N correspond to the various values of b(x; N, p) for
x = 0, 1, 2, ..., N:

(p+q)^N = C(N,0) q^N + C(N,1) p q^(N−1) + C(N,2) p² q^(N−2) + ... + C(N,N) p^N
        = b(0; N, p) + b(1; N, p) + b(2; N, p) + ... + b(N; N, p)

Since p + q = 1,
∑_{x=0}^{N} b(x; N, p) = 1

Properties of the Binomial Distribution:


Mean, 𝛍 =Np
Variance, 𝛔𝟐 =Npq

Example 18: Find the probability that in tossing a fair coin three times there will appear
a) 3 H,  b) 2 H and 1 T,  c) 2 T and 1 H,  d) 3 T.

Solution:
p = P(H) = 1/2
q = P(T) = 1 - p = 1/2
a) P(3H)   = b(3; 3, 1/2) = C(3,3) (1/2)³ (1/2)⁰ = 1/8
b) P(2H1T) = b(2; 3, 1/2) = C(3,2) (1/2)² (1/2)¹ = 3/8
c) P(2T1H) = b(1; 3, 1/2) = C(3,1) (1/2)¹ (1/2)² = 3/8
d) P(3T)   = b(0; 3, 1/2) = C(3,0) (1/2)⁰ (1/2)³ = 1/8
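Binomial probabilities b(x; N, p) are easy to tabulate with math.comb; a minimal sketch reproducing Example 18:

```python
# Sketch: binomial pmf b(x; N, p) = C(N, x) p^x q^(N-x), applied to Example 18.
from math import comb

def binom_pmf(x, N, p):
    return comb(N, x) * p**x * (1 - p)**(N - x)

N, p = 3, 0.5
for x in range(N + 1):
    print(x, binom_pmf(x, N, p))      # 0.125, 0.375, 0.375, 0.125  (1/8, 3/8, 3/8, 1/8)

# Mean Np and variance Npq as a quick consistency check
print(N * p, N * p * (1 - p))         # 1.5, 0.75
```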

3)-Poisson Distribution
The Poisson distribution can be obtained from the binomial distribution in the limit
N → ∞ and p → 0 with μ = Np held fixed:

p(x) = e^{−μ} μ^x / x!

where μ = Np and e = 2.71828...
Example 19: A machine is known to produce 1% defective components. The
components are packed in boxes containing 100 components. Use the Poisson
distribution to find the probability that a box will contain a) 0, 1, or 2 defectives,
b) more than 2 defectives.

Solution: N = 100, p = 0.01, μ = Np = 1

a) P(0) = P(no defective)  = e^{−1}(1)⁰/0! = 0.3679
   P(1) = P(1 defective)   = e^{−1}(1)¹/1! = 0.3679
   P(2) = P(2 defectives)  = e^{−1}(1)²/2! = 0.1839
b) P(more than 2 defectives) = p(3) + p(4) + ... + p(100) = 1 − [p(0) + p(1) + p(2)] = 0.0803
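The Poisson probabilities of Example 19 can be checked with a few lines of standard-library Python; a minimal sketch:

```python
# Sketch: Poisson pmf for Example 19 (N = 100, p = 0.01, so mu = Np = 1).
from math import exp, factorial

def poisson_pmf(x, mu):
    return exp(-mu) * mu**x / factorial(x)

mu = 100 * 0.01
p0, p1, p2 = (poisson_pmf(x, mu) for x in range(3))
print(p0, p1, p2)                  # ≈ 0.3679, 0.3679, 0.1839
print(1 - (p0 + p1 + p2))          # P(more than 2 defectives) ≈ 0.080
```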
B-Continuous Probability Distribution

1-Uniform Distribution
The density function of a continuous random variable X that is uniform on the
interval [A, B] is a constant 1/(B − A) between A and B, as in the figure:

p(x) = 1/(B − A)  for A ≤ x ≤ B,  and p(x) = 0 otherwise.

The mean and variance of the uniform distribution are:

E(X) = X̄ = μ = ∫_{A}^{B} x/(B − A) dx = (A + B)/2

E(X²) = ∫_{A}^{B} x²/(B − A) dx = (A² + AB + B²)/3

σ² = E(X²) − (E(X))² = (B − A)²/12

The CDF (a straight line rising from 0 at x = A to 1 at x = B) is:
F(x) = ∫_{A}^{x} 1/(B − A) du = (x − A)/(B − A)     for A ≤ x ≤ B
2-Normal (Gaussian) Distribution

The density function of a normally distributed random variable X with mean μ and
variance σ² is given by:

p(x) = (1/(√(2π) σ)) e^{−(1/2)((x−μ)/σ)²},     −∞ < x < ∞

Figures below show the normal distribution for different centers 𝜇 and
different 𝜎


The CDF is F(x) = ∫_{−∞}^{x} (1/(√(2π) σ)) e^{−(1/2)((u−μ)/σ)²} du

The integral cannot be evaluated in closed form and requires numerical evaluation.
Let z = (u − μ)/σ, so dz = du/σ.
As u → −∞, z → −∞, and as u → x, z → (x − μ)/σ. Then

F(x) = ∫_{−∞}^{(x−μ)/σ} (1/√(2π)) e^{−z²/2} dz = erf((x − μ)/σ)     →     F(x) = erf((x − μ)/σ)

where erf(·) is the tabulated error function defined below.

Note: erf(z) is the area under the standard normal distribution (μ = 0 and σ = 1)
to the left of z:

erf(z) = ∫_{−∞}^{z} (1/√(2π)) e^{−x²/2} dx

Example 20: A random variable X has a Gaussian distribution. The mean value of
X is 2 volts and the variance is 4 volts², so σ = 2 volts. Compute the following probabilities:
a) P(X < 6)   b) P(X > 3)   c) P(X < −2)   d) P(2 < X < 3)

Solution:
a) P(X < 6) = ∫_{−∞}^{6} (1/(√(2π)σ)) e^{−(1/2)((x−μ)/σ)²} dx = F(6) = erf((6 − 2)/2) = erf(2) = 0.9772

b) P(X > 3) = ∫_{3}^{∞} (1/(√(2π)σ)) e^{−(1/2)((x−μ)/σ)²} dx = 1 − F(3) = 1 − erf((3 − 2)/2)
            = 1 − erf(0.5) = 1 − 0.6915 = 0.3085

c) P(X < −2) = F(−2) = erf((−2 − 2)/2) = erf(−2) = 1 − erf(2) = 1 − 0.9772 = 0.0228

d) P(2 < X < 3) = F(3) − F(2) = erf((3 − μ)/σ) − erf((2 − μ)/σ) = erf(0.5) − erf(0)
               = 0.6915 − 0.5 = 0.1915
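The erf values used above (the standard normal CDF in this chapter's notation) are available from statistics.NormalDist in the Python standard library; a minimal sketch re-deriving the four probabilities of Example 20:

```python
# Sketch: Gaussian probabilities of Example 20 (mean 2, variance 4, so sigma = 2),
# using the normal CDF from the standard library.
from statistics import NormalDist

X = NormalDist(mu=2, sigma=2)

print(X.cdf(6))               # P(X < 6)     ≈ 0.9772
print(1 - X.cdf(3))           # P(X > 3)     ≈ 0.3085
print(X.cdf(-2))              # P(X < -2)    ≈ 0.0228
print(X.cdf(3) - X.cdf(2))    # P(2 < X < 3) ≈ 0.1915
```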


Part 2: Statistical Data Analysis


Population and Sample. Statistical Inference
Often in practice we are interested in drawing valid conclusions about a large
group of individuals or objects. Instead of examining the entire group, called the
population, which may be difficult or impossible to do, we may examine only a
small part of this population, which is called a sample. We do this with the aim of
inferring certain facts about the population from results found in the sample, a
process known as statistical inference. The process of obtaining samples is called
sampling.
For example, we may wish to draw conclusions about the heights (or weights)
of 12,000 adult students (the population) by examining only 100 students (a sample)
selected from this population.
The number of individuals in the population is called the population size, usually denoted
by N. Similarly, the number in the sample is called the sample size, denoted by n, and is
generally finite; in the example above, N = 12,000 and n = 100.

Let us consider a situation where we select, randomly and independently, n
observations from a population of size N whose distribution has mean μ and variance σ².
The set of such observations is referred to as a random sample.

The Sample Mean: Let x1, x2, . . . , xn denote the independent, identically
distributed, random variables for a random sample of size n. Then the mean of the
sample or sample mean is a random variable defined by:
μ = X̄ = (∑_{i=1}^{n} xi) / n

The Sample Variance: The variance of the sample, or the sample variance, is
defined by:

S² = (∑_{i=1}^{n} (xi − μ)²) / n
Frequency Distributions: If a sample (or even a population) is large, it is difficult
to observe the various characteristics or to compute statistics such as mean or
standard deviation. For this reason it is useful to organize or group the raw data. We
arrange the data into classes or categories and determine the number of individuals
belonging to each class, called the class frequency. The resulting arrangement is
called a frequency distribution or frequency table.
Relative Frequency Distributions: If we record the relative frequency or
percentage, rather than the number of elements in each class, the result is a
relative, or percentage, frequency distribution.

Example: In the table below, the weights of 40 male students at State University are
recorded to the nearest pound. Construct a frequency distribution and find the mean
and variance.

Solution:
Lower = 119 (minimum value)
Upper = 176 (maximum value)
n = 40 (sample size)
Range = 176 - 119 = 57 (range spanned by the data)
Number of classes = 5
Class width = 57/5 ≈ 11

Class limits   Class boundaries   Center   Tally                  Frequency
119 - 129      118.5 - 129.5      124      ||||                   4
130 - 140      129.5 - 140.5      135      ||||| ||||             9
141 - 151      140.5 - 151.5      146      ||||| ||||| ||||       14
152 - 162      151.5 - 162.5      157      ||||| ||               7
163 - 176      162.5 - 176.5      169.5    ||||| |                6
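Once the frequency table is built, the grouped-data mean and variance follow from the class centers and frequencies; a minimal sketch using the table above (this is the usual grouped approximation, not the exact statistics of the raw weights):

```python
# Sketch: grouped mean and variance from the frequency table above.
centers = [124, 135, 146, 157, 169.5]
freqs   = [4, 9, 14, 7, 6]
n = sum(freqs)                                            # 40

mean = sum(f * c for f, c in zip(freqs, centers)) / n     # ≈ 146.8 lb
mean_sq = sum(f * c * c for f, c in zip(freqs, centers)) / n
var = mean_sq - mean**2                                   # ≈ 179

print(n, mean, var)
```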


Tutorial Sheet
1- An electrical system consists of four components as illustrated below. The
system works if components A and B work and either of the components C or D
works. The probability of working of each component is also shown. Find the
probability that a) the entire system works and b) component C does not work,
given that the entire system works. Assume that the four components work
independently.

(Circuit: A and B in series, each working with probability 0.9, followed by C and D
in parallel, each working with probability 0.8.)

2- The cumulative distribution function for a certain random variable X is shown.


Determine (a) p(x) (b) P(X≤ 2) (c) P(1<X≤3)
3- The Rayleigh pdf is p(x) = x e^{−x²/2} u(x). Determine (a) the CDF F(x), (b) P(X ≤ 2),
(c) the constant a such that P(X > a) = 0.01.

4- Suppose that X and Y have the following joint probability distribution

           y=2     y=4
   x=1     0.1     0.15
   x=3     0.2     0.3
   x=5     0.1     0.15
a) Evaluate the marginal distribution of X and Y.
b) Find P(Y/X) and P(X/Y).
c) Find P(Y=2/X=3).
d) Find μx , μy , σ2x , σ2y and σXY .

5- Let X and Y be random variables having joint density function


p(x, y) = c(2x + y)  for 0 < x < 1, 0 < y < 2,  and p(x, y) = 0 otherwise.
Find (a) the constant c (b) P(X>0.5,Y>1.5). (c) p(x) and p(y).
6- The joint density function of two random variables X and Y is given by:
p(x, y) = xy/96  for 0 < x < 4, 1 < y < 5,  and p(x, y) = 0 otherwise.
Find E(X), E(Y), E(XY), E(2X+3Y)
7- Consider the triangular pdf shown (p(x) is zero outside (−a, a) and rises linearly
to a peak value b at x = 0). Determine
(a) the constant b in terms of a, (b) P(X > a/2),
(c) P(a/3 ≤ X ≤ a/2), (d) X̄, E(X²) and σ².
8- Find the mean value and standard deviation of the random variable X described
by the following pdf:
a) p(x) = (1/2) δ(x − 1) + (1/2) δ(x − 3)
b) p(x) = k[u(x − 1) − u(x − 3)]
c) p(x) = k e^{−x/2} u(x)

9- Given a normally distributed variable X with mean 18 and standard deviation 2.5,
find:
(a) P(X<5).
(b) The value of k such that P(X<k)=0.2236.
(c) The value of k such that P(X>k)=0.1814.
(d) P(17<X<21).

10. In the following table, the values of 40 resistors are recorded in ohms.
(a) Construct the frequency distribution table using number of classes = 5.


(b) Plot the histogram of the frequency table.

45 61 25 64 28 50 32 50

40 46 48 58 35 35 47 40

54 68 19 26 65 63 76 38

54 68 53 73 44 36 47 42

49 38 56 45 57 44 42 35

