
Random Variables and Distributions

Waseem A. Malik

ENPM 667: Control of Robotic Systems



Random Variables and Distributions

It is often the case that when an experiment is performed, we are mainly interested in some function of the outcome rather than in the outcome itself.

These quantities of interest, or more formally these real-valued functions defined on the sample space, are known as random variables.

For a random variable $X$, the function $F$ defined by
$$F(x) = P(X \le x), \qquad -\infty < x < \infty$$
is called the cumulative distribution function of $X$.

Thus the distribution function specifies, for all real values of $x$, the probability that the random variable is less than or equal to $x$. It should be noted that $F(x)$ is a non-decreasing function of $x$.
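The definition can be made concrete by estimating $F$ from samples. The sketch below is a minimal illustration, not from the slides, using plain Python and a fair six-sided die as the assumed example:

```python
# Estimating the CDF F(x) = P(X <= x) of a fair die by simulation.
import random

def empirical_cdf(samples, x):
    """Fraction of samples that are <= x, an estimate of P(X <= x)."""
    return sum(s <= x for s in samples) / len(samples)

samples = [random.randint(1, 6) for _ in range(100_000)]
for x in [0, 1, 3.5, 6]:
    print(f"F({x}) ~ {empirical_cdf(samples, x):.3f}")
# Expected: F(0) = 0, F(1) ~ 1/6, F(3.5) ~ 1/2, F(6) = 1
```

Note how the estimate is non-decreasing in $x$, as the definition requires.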



Discrete Random Variables (1)
A random variable that can take on at most a countable number of possible values is said to be discrete.

For a discrete random variable $X$, we define the probability mass function $p(a)$ of $X$ by
$$p(a) = P(X = a)$$

The probability mass function $p(a)$ is positive for at most a countable number of values of $a$. If $X$ assumes one of the values $x_1, x_2, \ldots$ then
$$p(x_i) \ge 0, \quad i = 1, 2, \ldots$$
$$p(x) = 0 \quad \text{for all other values of } x$$

Since $X$ must take on one of the values $x_i$, we have
$$\sum_{i=1}^{\infty} p(x_i) = 1$$

Example: The probability mass function of a random variable $X$ is given by $p(i) = c\,\dfrac{\lambda^i}{i!}$, $i = 0, 1, 2, \ldots$, where $\lambda$ is some positive value.
Discrete Random Variables (2)
Example: Find (a) $P(X = 0)$ and (b) $P(X > 2)$.

Since $\sum_{i=0}^{\infty} p(i) = 1$, we have
$$c \sum_{i=0}^{\infty} \frac{\lambda^i}{i!} = 1$$

Now we know that $e^{\lambda} = \sum_{i=0}^{\infty} \frac{\lambda^i}{i!}$. Therefore
$$c e^{\lambda} = 1 \implies c = e^{-\lambda}$$
$$P(X = 0) = e^{-\lambda}\, \lambda^0 / 0! = e^{-\lambda}$$

Similarly, the solution to part (b) is given by:
$$P(X > 2) = 1 - P(X \le 2) = 1 - P(X = 0) - P(X = 1) - P(X = 2) = 1 - e^{-\lambda} - \lambda e^{-\lambda} - \frac{\lambda^2 e^{-\lambda}}{2}$$
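This example can be checked numerically. The sketch below assumes SciPy is available and picks an arbitrary illustrative value $\lambda = 2$:

```python
# Checking the example: this pmf is the Poisson pmf once c = e^{-lam}.
import math
from scipy.stats import poisson

lam = 2.0
print(poisson.pmf(0, lam), math.exp(-lam))          # P(X = 0) = e^{-lam}
print(1 - poisson.cdf(2, lam))                      # P(X > 2)
print(1 - math.exp(-lam) * (1 + lam + lam**2 / 2))  # closed form, matches
```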
Discrete Random Variables (3)

One of the most important concepts in probability theory is that of the expectation of a random variable.

If $X$ is a discrete random variable with pmf $p(x)$, then the expectation or expected value of $X$ is given by
$$E[X] = \sum_{x:\, p(x) > 0} x\, p(x)$$

In other words, the expected value is a weighted average of the possible values that $X$ can take.

Example: Find $E[X]$ where $X$ is the outcome when we roll a fair die. Since $p(i) = 1/6$, $i = 1, 2, \ldots, 6$, we obtain
$$E[X] = 1\left(\tfrac{1}{6}\right) + 2\left(\tfrac{1}{6}\right) + 3\left(\tfrac{1}{6}\right) + 4\left(\tfrac{1}{6}\right) + 5\left(\tfrac{1}{6}\right) + 6\left(\tfrac{1}{6}\right) = \tfrac{7}{2}$$

It should be noted that if $X$ is a discrete random variable that takes on one of the values $x_i$, $i \ge 1$, with probabilities $p(x_i)$, then for any real-valued function $g$
Discrete Random Variables (4)

$$E[g(X)] = \sum_i g(x_i)\, p(x_i)$$

As one expects a random variable $X$ to take values around its mean $E[X]$, a reasonable way of measuring the variation of $X$ is to look at how far apart $X$ is, on average, from its mean.

One way to measure this is the expectation of the square of the difference between $X$ and its mean, called the variance.

If $X$ is a random variable with mean $\mu = E[X]$, then the variance of $X$ is given by
$$\mathrm{Var}(X) = E[(X - \mu)^2] = E[X^2] - E[X]^2$$
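These formulas are easy to verify directly from a pmf; here is a small sketch for the fair-die example above (standard-library Python, with exact arithmetic via fractions):

```python
# E[X], E[g(X)] with g(x) = x^2, and Var(X) for a fair die.
from fractions import Fraction

pmf = {i: Fraction(1, 6) for i in range(1, 7)}   # p(i) = 1/6

E_X  = sum(x * p for x, p in pmf.items())        # E[X] = sum of x p(x)
E_X2 = sum(x**2 * p for x, p in pmf.items())     # E[X^2]
var  = E_X2 - E_X**2                             # Var(X) = E[X^2] - E[X]^2

print(E_X)   # 7/2
print(var)   # 35/12
```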
Suppose that an experiment whose outcome can be classified as
either a success or a failure is performed.



Discrete Random Variables (5)

If we let $X = 1$ when the outcome is a success and $X = 0$ when it is a failure, then the pmf of $X$ is given by
$$p(0) = P(X = 0) = 1 - p, \qquad p(1) = P(X = 1) = p, \qquad 0 \le p \le 1$$

Here $p$ is the probability that the trial is a success. A random variable $X$ whose pmf is given by the preceding equation for some $p \in (0, 1)$ is called a Bernoulli random variable.

Suppose now that $n$ independent trials, each of which results in a success with probability $p$ and in a failure with probability $1 - p$, are to be performed.

If $X$ represents the number of successes that occur in the $n$ trials, then $X$ is said to be a Binomial random variable with parameters $(n, p)$.

Thus a Bernoulli random variable is just a Binomial random variable with parameters $(1, p)$.
Discrete Random Variables (6)

The pmf of a Binomial random variable having parameters $(n, p)$ is given by
$$p(i) = \binom{n}{i} p^i (1 - p)^{n-i}, \qquad i = 0, \ldots, n$$

If $X$ is a Binomial random variable with parameters $n$ and $p$, then
$$E[X] = np, \qquad \mathrm{Var}(X) = np(1 - p)$$

A random variable $X$ taking on the values $0, 1, 2, \ldots$ is said to be a Poisson random variable with parameter $\lambda$ if for some $\lambda > 0$
$$p(i) = P(X = i) = e^{-\lambda} \frac{\lambda^i}{i!}, \qquad i = 0, 1, 2, \ldots$$

The Poisson random variable has a tremendous range of applications and can also be used as an approximation for a Binomial random variable.
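A quick sanity check of these formulas (SciPy assumed; the values of $n$, $p$, and $\lambda$ below are illustrative):

```python
# Binomial mean/variance and the Poisson pmf, matching the slide formulas.
from scipy.stats import binom, poisson

n, p = 20, 0.3
print(binom.mean(n, p), n * p)              # E[X] = np
print(binom.var(n, p), n * p * (1 - p))     # Var(X) = np(1 - p)

lam = 4.0
print(poisson.pmf(3, lam))                  # e^{-lam} lam^3 / 3!
```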



Discrete Random Variables (7)
If $n$ is large and $p$ is small, let $\lambda = np$. Then it can be shown that for a Binomial $(n, p)$ random variable $X$
$$P(X = k) = \binom{n}{k} p^k (1 - p)^{n-k} \approx e^{-\lambda} \frac{\lambda^k}{k!}, \qquad k = 0, 1, 2, \ldots$$
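The quality of this approximation is easy to see numerically; a brief sketch (SciPy assumed, with illustrative $n = 1000$, $p = 0.005$):

```python
# Poisson approximation to the Binomial for large n and small p.
from scipy.stats import binom, poisson

n, p = 1000, 0.005
lam = n * p
for k in range(4):
    print(k, binom.pmf(k, n, p), poisson.pmf(k, lam))
# The two columns agree to several decimal places.
```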
The Poisson random variable appears naturally in many physical situations. For example, the Poisson pmf gives an accurate prediction for the relative frequencies of the number of particles emitted by a radioactive mass during a fixed time period.
Example: Requests for telephone connections arrive at a telephone switching office at a rate of $\lambda$ calls per second. It is known that the number of requests in a time period is a Poisson random variable. Find the probability that there are no call requests in $t$ seconds. Find the probability that there are $n$ or more requests.

Solution: The average number of call requests in a $t$-second period is $\alpha = \lambda t$. Therefore $N(t)$, the number of requests in $t$ seconds, is a Poisson random variable with parameter $\alpha$.
Discrete Random Variables (8)

Solution: Therefore, we get
$$P(N(t) = 0) = \frac{(\lambda t)^0}{0!}\, e^{-\lambda t} = e^{-\lambda t}$$

Similarly,
$$P(N(t) \ge n) = 1 - P(N(t) < n) = 1 - \sum_{k=0}^{n-1} \frac{(\lambda t)^k}{k!}\, e^{-\lambda t}$$

It should be noted that the expectation and variance of a Poisson random variable are both equal to its parameter $\lambda$.
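A sketch of this computation (SciPy assumed; the rate, window, and threshold values are illustrative, not from the slides):

```python
# Telephone-switching example: N(t) ~ Poisson(alpha) with alpha = lam * t.
from scipy.stats import poisson

lam, t, n = 0.5, 4.0, 5        # calls/second, seconds, threshold (assumed)
alpha = lam * t                # Poisson parameter of N(t)

print(poisson.pmf(0, alpha))                    # P(N(t) = 0) = e^{-lam t}
print(1 - poisson.cdf(n - 1, alpha))            # P(N(t) >= n)
print(poisson.mean(alpha), poisson.var(alpha))  # both equal alpha
```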



Continuous Random Variables (1)
We say that $X$ is a continuous random variable if there exists a non-negative function $f$, defined for all real $x \in (-\infty, \infty)$, having the property that for any set $B$ of real numbers
$$P(X \in B) = \int_B f(x)\, dx$$

The function $f$ is called the probability density function of the random variable $X$.

This equation states that the probability that $X$ will be in $B$ can be obtained by integrating the probability density function over the set $B$.

Since $X$ must assume some value, $f$ must satisfy
$$1 = P\big(X \in (-\infty, \infty)\big) = \int_{-\infty}^{\infty} f(x)\, dx$$

All probability statements about $X$ can be answered in terms of $f$. If $B = [a, b]$, we get
$$P(a \le X \le b) = \int_a^b f(x)\, dx$$
Continuous Random Variables (2)

It should be noted that the probability that a continuous random variable will assume any fixed value is zero. Hence for a continuous random variable $X$
$$P(X < a) = P(X \le a) = F(a) = \int_{-\infty}^{a} f(x)\, dx$$
where $F(\cdot)$ is the cumulative distribution function.

Example: Suppose that $X$ is a continuous random variable whose probability density function is given by:
$$f(x) = C(4x - 2x^2), \quad 0 < x < 2, \qquad f(x) = 0 \text{ otherwise}$$

(a) What is the value of $C$?
(b) Find $P(X > 1)$.

Since $f$ is a probability density function, we have $\int_{-\infty}^{\infty} f(x)\, dx = 1$, which implies that
Continuous Random Variables (3)
$$\int_0^2 C(4x - 2x^2)\, dx = 1 \implies C = \frac{3}{8}$$
$$P(X > 1) = \int_1^{\infty} f(x)\, dx = \frac{3}{8} \int_1^2 (4x - 2x^2)\, dx = \frac{1}{2}$$
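Both answers can be verified by numerical integration; a short sketch (SciPy assumed):

```python
# Solve for C from the normalization condition, then compute P(X > 1).
from scipy.integrate import quad

g = lambda x: 4 * x - 2 * x**2
I, _ = quad(g, 0, 2)              # integral of (4x - 2x^2) over (0, 2)
C = 1 / I
print(C)                          # 0.375 = 3/8

P, _ = quad(lambda x: C * g(x), 1, 2)
print(P)                          # 0.5
```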

We have already seen that if $X$ is a discrete random variable, then its expected value is given by
$$E[X] = \sum_x x\, P(X = x)$$

If $X$ is a continuous random variable with pdf $f(x)$, then
$$f(x)\, dx \approx P(x \le X \le x + dx) \quad \text{for small } dx$$

It is easy to see that the analogous definition is to define the expected value of $X$ by
$$E[X] = \int_{-\infty}^{\infty} x f(x)\, dx$$
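Applying this definition to the example density above gives $E[X] = 1$, which the following sketch (SciPy assumed) confirms:

```python
# E[X] = integral of x f(x) dx for the pdf f(x) = (3/8)(4x - 2x^2) on (0, 2).
from scipy.integrate import quad

f = lambda x: 0.375 * (4 * x - 2 * x**2)   # pdf from the example
E_X, _ = quad(lambda x: x * f(x), 0, 2)
print(E_X)                                  # 1.0
```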

Continuous Random Variables (4)

The variance of the random variable $X$ is given by
$$\mathrm{Var}(X) = E[(X - E[X])^2] = E[X^2] - E[X]^2$$

If $a$ and $b$ are constants, then
$$E[aX + b] = aE[X] + b, \qquad \mathrm{Var}(aX + b) = a^2\, \mathrm{Var}(X)$$

We say that the random variable $X$ is uniformly distributed over the interval $(\alpha, \beta)$ if its probability density function is given by
$$f(x) = \begin{cases} \frac{1}{\beta - \alpha} & \text{if } \alpha < x < \beta \\ 0 & \text{otherwise} \end{cases}$$

Since $F(a) = \int_{-\infty}^{a} f(x)\, dx$, we obtain that the distribution function of a uniform random variable over the interval $(\alpha, \beta)$ is given by
$$F(a) = \begin{cases} 0 & a \le \alpha \\ \frac{a - \alpha}{\beta - \alpha} & \alpha < a < \beta \\ 1 & a \ge \beta \end{cases}$$


Continuous Random Variables (5)

If $X$ is uniformly distributed over $(\alpha, \beta)$, then
$$E[X] = \frac{\alpha + \beta}{2}, \qquad \mathrm{Var}(X) = \frac{(\beta - \alpha)^2}{12}$$
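A numerical check of these formulas (SciPy assumed; note that scipy.stats.uniform is parameterized by loc = $\alpha$ and scale = $\beta - \alpha$, and the endpoints below are illustrative):

```python
# Uniform(alpha, beta) mean and variance.
from scipy.stats import uniform

alpha, beta = 2.0, 5.0
U = uniform(loc=alpha, scale=beta - alpha)
print(U.mean(), (alpha + beta) / 2)          # (alpha + beta) / 2
print(U.var(), (beta - alpha)**2 / 12)       # (beta - alpha)^2 / 12
```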
We say that $X$ is a normal (or Gaussian) random variable with parameters $\mu$ and $\sigma^2$ if the density function of $X$ is given by
$$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x - \mu)^2 / 2\sigma^2}, \qquad -\infty < x < \infty$$

This density function is a bell-shaped curve that is symmetric about $\mu$. The parameters $\mu$ and $\sigma^2$ of a normal random variable represent its mean and variance.

If $X$ is normally distributed with parameters $\mu$ and $\sigma^2$, then $Y = aX + b$, where $a$ and $b$ are constants, is also normally distributed with parameters $a\mu + b$ and $a^2\sigma^2$.
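This closure under affine maps is easy to confirm by simulation; a brief sketch (NumPy assumed, with illustrative parameter values):

```python
# If X ~ N(mu, sigma^2), then Y = aX + b has mean a*mu + b and
# variance a^2 sigma^2; check the sample moments against the formulas.
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, a, b = 3.0, 2.0, -1.5, 4.0
x = rng.normal(mu, sigma, 1_000_000)
y = a * x + b
print(y.mean(), a * mu + b)          # ~ a*mu + b
print(y.var(), a**2 * sigma**2)      # ~ a^2 sigma^2
```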



Continuous Random Variables (6)
If $X$ is a normal random variable with parameters $\mu$ and $\sigma^2$, then $Z = (X - \mu)/\sigma$ is normally distributed with parameters 0 and 1. Such a random variable is called a standard normal random variable.

It is traditional to denote the cumulative distribution function of a standard normal random variable by $\Phi(x)$:
$$\Phi(x) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} e^{-y^2/2}\, dy$$

For negative values of $x$, $\Phi(x)$ can be obtained from the equation
$$\Phi(-x) = 1 - \Phi(x), \qquad -\infty < x < \infty$$

Example: If $X$ is a normal random variable with parameters $\mu = 3$ and $\sigma^2 = 9$, find $P(2 < X < 5)$.
$$P(2 < X < 5) = P\left(\frac{2 - 3}{3} < \frac{X - 3}{3} < \frac{5 - 3}{3}\right) = P\left(-\frac{1}{3} < Z < \frac{2}{3}\right)$$
Continuous Random Variables (7)
$$P(2 < X < 5) = \Phi\left(\tfrac{2}{3}\right) - \Phi\left(-\tfrac{1}{3}\right) = \Phi\left(\tfrac{2}{3}\right) - \left[1 - \Phi\left(\tfrac{1}{3}\right)\right] \approx 0.3779$$
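The worked example can be confirmed directly with the standard normal CDF (SciPy assumed):

```python
# P(2 < X < 5) for X ~ N(3, 9), via standardization and directly.
from scipy.stats import norm

print(norm.cdf(2/3) - norm.cdf(-1/3))   # ~ 0.3781; the slide's 0.3779
                                        # reflects 4-digit table rounding
print(norm.cdf(5, loc=3, scale=3) - norm.cdf(2, loc=3, scale=3))  # same
```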

A continuous random variable whose probability density function is given, for some $\lambda > 0$, by
$$f(x) = \begin{cases} \lambda e^{-\lambda x} & \text{if } x \ge 0 \\ 0 & \text{if } x < 0 \end{cases}$$
is said to be an exponential random variable with parameter $\lambda$.

The mean and variance of the exponential random variable are $\frac{1}{\lambda}$ and $\frac{1}{\lambda^2}$, respectively.

Example: Suppose that the length of a phone call in minutes is an exponential random variable with parameter $\lambda = \frac{1}{10}$. If someone arrives immediately ahead of you at a public booth, find the probability that you have to wait (a) more than 10 minutes, (b) between 10 and 20 minutes.
Continuous Random Variables (8)

Solution: It should be noted that the cumulative distribution function $F(a)$ of an exponential random variable is given by
$$F(a) = 1 - e^{-\lambda a}, \qquad a \ge 0$$

Letting $X$ denote the length of the call made by the person in the booth, the desired probabilities are

(a) $P(X > 10) = 1 - F(10) = e^{-1}$

(b) $P(10 < X < 20) = F(20) - F(10) = e^{-1} - e^{-2}$
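A sketch of this example (SciPy assumed; note that scipy.stats.expon is parameterized by scale = $1/\lambda$):

```python
# Phone-booth waiting times for X ~ Exponential(lam) with lam = 1/10.
import math
from scipy.stats import expon

lam = 1 / 10
X = expon(scale=1 / lam)
print(X.sf(10), math.exp(-1))                              # (a) P(X > 10)
print(X.cdf(20) - X.cdf(10), math.exp(-1) - math.exp(-2))  # (b)
print(X.mean(), X.var())                                   # 1/lam, 1/lam^2
```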
A random variable $X$ is said to have a gamma distribution with parameters $(\alpha, \lambda)$, $\alpha > 0$ and $\lambda > 0$, if its density function is given by
$$f(x) = \begin{cases} \dfrac{\lambda e^{-\lambda x} (\lambda x)^{\alpha - 1}}{\Gamma(\alpha)} & x \ge 0 \\ 0 & x < 0 \end{cases}$$



Continuous Random Variables (9)

Here $\Gamma(\alpha)$ is called the Gamma function and is defined by
$$\Gamma(\alpha) = \int_0^{\infty} e^{-y} y^{\alpha - 1}\, dy$$

The expectation and variance of a Gamma $(\alpha, \lambda)$ random variable are given by
$$E[X] = \frac{\alpha}{\lambda}, \qquad \mathrm{Var}(X) = \frac{\alpha}{\lambda^2}$$
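A numerical check of these moments (SciPy assumed; scipy.stats.gamma takes shape a = $\alpha$ and scale = $1/\lambda$, and the parameter values below are illustrative):

```python
# Gamma(alpha, lambda) mean and variance.
from scipy.stats import gamma

a, lam = 3.0, 2.0
G = gamma(a, scale=1 / lam)
print(G.mean(), a / lam)          # E[X] = alpha / lambda
print(G.var(), a / lam**2)        # Var(X) = alpha / lambda^2
```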

