
Random Variable

A random variable X is a rule that assigns a numerical value to each outcome in the
sample space of an experiment.
A discrete random variable can take on specific, isolated numerical values, like the
outcome of a roll of a die, or the number of dollars in a randomly chosen bank account.
A continuous random variable can take on any values within a continuum or an interval,
like the temperature in Central Park, or the height of an athlete in centimeters.
Discrete random variables that can take on only finitely many values (like the outcome of
a roll of a die) are called finite random variables.
Examples
1. Finite Random Variable
In an experiment to simulate tossing three coins, let X be the number of heads showing
after each toss. X is a finite random variable that can assume the four values 0, 1, 2,
and 3.

[Interactive simulation on the original page: it tosses Coin 1, Coin 2, and Coin 3 and displays the resulting value of X.]
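
As a quick illustration, here is a short Python sketch (not part of the original page, and assuming fair coins) that simulates the three-coin toss and tabulates the observed values of X:

    import random
    from collections import Counter

    def toss_three_coins():
        """Toss three fair coins and return X = the number of heads."""
        return sum(random.choice([0, 1]) for _ in range(3))  # 1 = heads, 0 = tails

    # Repeat the experiment many times and record how often each value of X occurs.
    counts = Counter(toss_three_coins() for _ in range(10_000))
    for x in sorted(counts):
        print(f"X = {x}: observed relative frequency {counts[x] / 10_000:.3f}")

For fair coins the observed frequencies settle near the theoretical probabilities 1/8, 3/8, 3/8, 1/8.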

2. Infinite Discrete Random Variable


Roll a die until you get a 6; X = the number of times you roll the die.
The possible values for X are 1, 2, 3, 4, ... (If you are extremely unlucky, it might take
you a million rolls before you get a 6!)
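
A small Python sketch (an illustration, not taken from the cited page) makes the "roll until a 6" experiment concrete; X can in principle take any positive integer value:

    import random

    def rolls_until_six():
        """Roll a fair die until a 6 appears; return X = the number of rolls."""
        rolls = 0
        while True:
            rolls += 1
            if random.randint(1, 6) == 6:
                return rolls

    # A few sample values of X; occasionally X is surprisingly large.
    print([rolls_until_six() for _ in range(10)])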
3. Continuous Random Variable
Measure the length of an object; X = its length in cm.
http://people.hofstra.edu/stefan_waner/RealWorld/Summary7.html

Variance and Standard Deviation of a Random Variable


If X is a random variable with expected value μ = E(X), its variance is defined to be

σ² = E([X − μ]²).

Its standard deviation σ is defined to be the square root of the variance. An alternate
formula for the variance, useful for calculation, is

σ² = E(X²) − μ².
The variance and standard deviation of a random variable are the sample variance and
sample standard deviation we expect to get if we have a large number of X-scores.
Conversely, if all we know about X is a collection of X-scores, then the sample variance
and sample standard deviation of those scores are our best estimates of the variance and
standard deviation of X.
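
This relationship between the distribution variance and the sample variance of a large number of X-scores can be checked numerically. The following Python sketch uses an assumed example distribution (the fair three-coin X, not the one from the text below) and compares the two:

    import random

    # Assumed example distribution for X: number of heads in three fair coin tosses.
    values = [0, 1, 2, 3]
    probs  = [0.125, 0.375, 0.375, 0.125]

    # Variance directly from the distribution: sigma^2 = E(X^2) - mu^2.
    mu = sum(x * p for x, p in zip(values, probs))
    var_dist = sum(x**2 * p for x, p in zip(values, probs)) - mu**2

    # Sample variance from a large number of X-scores.
    scores = random.choices(values, weights=probs, k=100_000)
    mean = sum(scores) / len(scores)
    var_sample = sum((x - mean)**2 for x in scores) / (len(scores) - 1)

    print(var_dist, var_sample)   # the two agree closely for large samples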
Example
Let us look again at the experiment in which we toss an unfair coin, with p = P(heads) =
0.8 and q = P(tails) = 0.2, three times. (X = number of heads.) Here is the distribution
with the x² scores added.

x            0       1       2       3
x²           0       1       4       9
Probability  0.008   0.096   0.384   0.512

We saw above that μ = 2.4. Further,


E(X²) = Σ xi²·P(X = xi)
= 0(.008) + 1(.096) + 4(.384) + 9(.512)
= 6.24.
Therefore,
σ² = E(X²) − μ²
= 6.24 − 2.4² = 0.48,
and
σ = 0.48^(1/2) ≈ 0.6928.
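
As a sanity check on the arithmetic above, the same numbers can be reproduced in a few lines of Python (a sketch using the distribution from this example):

    # Distribution of X = number of heads in three tosses with P(heads) = 0.8.
    values = [0, 1, 2, 3]
    probs  = [0.008, 0.096, 0.384, 0.512]

    mu  = sum(x * p for x, p in zip(values, probs))       # 2.4
    ex2 = sum(x**2 * p for x, p in zip(values, probs))    # 6.24
    var = ex2 - mu**2                                     # 0.48
    sd  = var ** 0.5                                      # ~0.6928

    print(mu, ex2, var, round(sd, 4))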
http://people.hofstra.edu/stefan_waner/RealWorld/Summary7.html

Jointly Distributed Random Variables


Definition
Let X and Y be continuous random variables with joint pdf f(x, y). Then
the marginal probability density functions of X and Y, denoted by
fX(x) and fY(y), respectively, are given by

fX(x) = ∫ f(x, y) dy   and   fY(y) = ∫ f(x, y) dx,

where each integral is taken over all values of the other variable (from −∞ to ∞).

Definition
Let X and Y be two continuous rv's with joint pdf f(x, y) and marginal
pdf fY(y). Then for any y value for which fY(y) > 0, the conditional
probability density function of X given that Y = y is

fX|Y(x | y) = f(x, y) / fY(y).

If X and Y are discrete, then the conditional probability mass function of
X given that Y = y is defined analogously, using the joint and marginal pmf's:

pX|Y(x | y) = p(x, y) / pY(y).

http://www.math.utah.edu/~lzhang/teaching/3070summer08/DailyUpdates/jul3/sec5_1.pdf
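
To see these definitions in action, here is a brief symbolic sketch in Python (an assumed example with joint pdf f(x, y) = x + y on the unit square, not taken from the cited notes) that computes the marginal and conditional densities:

    import sympy as sp

    x, y = sp.symbols('x y', nonnegative=True)
    f = x + y   # assumed joint pdf on 0 <= x <= 1, 0 <= y <= 1 (it integrates to 1)

    # Marginal pdfs: integrate the joint pdf over the other variable.
    f_X = sp.integrate(f, (y, 0, 1))        # x + 1/2
    f_Y = sp.integrate(f, (x, 0, 1))        # y + 1/2

    # Conditional pdf of X given Y = y: joint pdf divided by the marginal of Y.
    f_X_given_Y = sp.simplify(f / f_Y)      # (x + y)/(y + 1/2)

    print(f_X, f_Y, f_X_given_Y)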
