
# Chapter 5: Random Vectors, Joint Distributions (Lectures 18-23)

In many real-life problems, one often encounters multiple random objects. For example, one may be interested in the future prices of two different stocks in a stock market. Since the price of one stock can affect the price of the other, it is not advisable to analyze them separately. To model such phenomena, we need to introduce many random variables on a single platform (i.e., a probability space). First we recall some elementary facts about the $n$-dimensional Euclidean space. Let $\mathbb{R}^n = \{(x_1, \dots, x_n) : x_i \in \mathbb{R}\}$ denote the $n$-dimensional Euclidean space.

A subset $U$ of $\mathbb{R}^n$ is said to be open if for each $x \in U$, there exists an $\varepsilon > 0$ such that $B_\varepsilon(x) \subseteq U$, where
$$B_\varepsilon(x) = \{\, y \in \mathbb{R}^n : \|y - x\| < \varepsilon \,\}.$$

Any open set can be written as a countable union of open sets of the form
$$(a_1, b_1) \times (a_2, b_2) \times \cdots \times (a_n, b_n), \quad a_i < b_i.$$

**Definition 5.1.** The $\sigma$-field generated by the open subsets of $\mathbb{R}^n$ is called the Borel $\sigma$-field of subsets of $\mathbb{R}^n$ and is denoted by $\mathcal{B}(\mathbb{R}^n)$.

In fact, $\mathcal{B}(\mathbb{R}^2)$ is generated by the collection of all rectangles of the form $(-\infty, x] \times (-\infty, y]$, $x, y \in \mathbb{R}$. To see this, let $\mathcal{G}$ denote the $\sigma$-field generated by such rectangles. Then, for $a_1 < b_1$ and $a_2 < b_2$, we have
$$(a_1, b_1] \times (a_2, b_2] = \big((-\infty, b_1] \times (-\infty, b_2]\big) \setminus \Big( \big((-\infty, a_1] \times (-\infty, b_2]\big) \cup \big((-\infty, b_1] \times (-\infty, a_2]\big) \Big) \in \mathcal{G}.$$
For each open rectangle $(a_1, b_1) \times (a_2, b_2)$, choosing $b_1^{(k)} \uparrow b_1$ and $b_2^{(k)} \uparrow b_2$ such that
$$(a_1, b_1) \times (a_2, b_2) = \bigcup_{k=1}^{\infty} (a_1, b_1^{(k)}] \times (a_2, b_2^{(k)}],$$
we have $(a_1, b_1) \times (a_2, b_2) \in \mathcal{G}$. Since every open set in $\mathbb{R}^2$ is a countable union of open rectangles, it follows that $\mathcal{B}(\mathbb{R}^2) \subseteq \mathcal{G}$. Therefore, from the definition of $\mathcal{B}(\mathbb{R}^2)$ (each generating rectangle is itself a Borel set, so $\mathcal{G} \subseteq \mathcal{B}(\mathbb{R}^2)$), we get $\mathcal{G} = \mathcal{B}(\mathbb{R}^2)$. This completes the proof. (It is advised that the student try to write down the proof for general $n$.)

**Definition 5.2.** Let $(\Omega, \mathcal{F}, P)$ be a probability space. A map $X : \Omega \to \mathbb{R}^n$ is called a random vector if $X^{-1}(B) \in \mathcal{F}$ for all $B \in \mathcal{B}(\mathbb{R}^n)$. From now on we take $n = 2$ (for simplicity).

**Theorem 5.0.17** $X = (X_1, X_2)$ is a random vector iff $X_1$ and $X_2$ are random variables, where $X_1, X_2$ denote the component maps of $X$, i.e., $X(\omega) = (X_1(\omega), X_2(\omega))$.

Proof: Let $X = (X_1, X_2)$ be a random vector. For $x \in \mathbb{R}$,
$$\{X_1 \le x\} = X^{-1}\big((-\infty, x] \times \mathbb{R}\big) \in \mathcal{F},$$
since $(-\infty, x] \times \mathbb{R} \in \mathcal{B}(\mathbb{R}^2)$. Therefore $X_1$ is a random variable. Similarly, we can show that $X_2$ is a random variable.

Suppose $X_1$ and $X_2$ are random variables. Set
$$\mathcal{A} = \{\, B \in \mathcal{B}(\mathbb{R}^2) : X^{-1}(B) \in \mathcal{F} \,\}. \tag{5.0.1}$$
For $x, y \in \mathbb{R}$, we have
$$X^{-1}\big((-\infty, x] \times (-\infty, y]\big) = \{X_1 \le x\} \cap \{X_2 \le y\} \in \mathcal{F}. \tag{5.0.2}$$
By (5.0.1) and (5.0.2), every rectangle $(-\infty, x] \times (-\infty, y]$ belongs to $\mathcal{A}$. One can check that $\mathcal{A}$ is a $\sigma$-field. Thus $\mathcal{A}$ is a $\sigma$-field containing the rectangles that generate $\mathcal{B}(\mathbb{R}^2)$. Hence $\mathcal{B}(\mathbb{R}^2) \subseteq \mathcal{A}$. Hence $X$ is a random vector.

**Theorem 5.0.18** Let $(X, Y)$ be a random vector. On $\mathcal{B}(\mathbb{R}^2)$, define $\mu$ as follows:
$$\mu(B) = P\big((X, Y)^{-1}(B)\big), \quad B \in \mathcal{B}(\mathbb{R}^2).$$
Then $\mu$ is a probability measure on $(\mathbb{R}^2, \mathcal{B}(\mathbb{R}^2))$.

Proof. Since $(X, Y)^{-1}(\mathbb{R}^2) = \Omega$, we have $\mu(\mathbb{R}^2) = P(\Omega) = 1$. Let $B_1, B_2, \dots \in \mathcal{B}(\mathbb{R}^2)$ be pairwise disjoint. Then $(X, Y)^{-1}(B_1), (X, Y)^{-1}(B_2), \dots$ are pairwise disjoint sets in $\mathcal{F}$, and
$$\mu\Big(\bigcup_{i=1}^{\infty} B_i\Big) = P\Big(\bigcup_{i=1}^{\infty} (X, Y)^{-1}(B_i)\Big) = \sum_{i=1}^{\infty} P\big((X, Y)^{-1}(B_i)\big) = \sum_{i=1}^{\infty} \mu(B_i).$$
Hence $\mu$ is a probability measure. It is called the distribution (law) of $(X, Y)$ and is denoted by $P_{(X,Y)}$.

Let $F : \mathbb{R}^2 \to [0, 1]$ be given by
$$F(x, y) = P_{(X,Y)}\big((-\infty, x] \times (-\infty, y]\big) = P(X \le x, Y \le y).$$
The function $F$ is called the joint distribution function of $(X, Y)$.

**Theorem 5.0.19** Let $F$ be the joint distribution function of a random vector $(X, Y)$. Then $F$ satisfies the following.

(i) (a) $\displaystyle \lim_{x \to \infty,\, y \to \infty} F(x, y) = 1$;

(b) $\displaystyle \lim_{x \to -\infty} F(x, y) = 0 = \lim_{y \to -\infty} F(x, y)$.

(ii) $F$ is right continuous in each argument.

(iii) $F$ is nondecreasing in each argument.

The proof of the above theorem is an easy exercise for the student.

Given a random vector $(X, Y)$ with joint distribution function $F$, the distribution function of $X$, denoted by $F_X$, is called the marginal distribution of $X$ and is defined as
$$F_X(x) = P(X \le x) = \lim_{y \to \infty} F(x, y).$$
Similarly, the marginal distribution of $Y$ is
$$F_Y(y) = P(Y \le y) = \lim_{x \to \infty} F(x, y).$$
Note that the marginal distribution functions don't contain information about the dependence between $X$ and $Y$, and in general it is impossible to construct the joint distribution function from the marginals alone. One can characterize the independence of $X$ and $Y$ in terms of its joint and marginal distributions as in the following theorem. The proof is beyond the scope of this course.

**Theorem 5.0.20** Let $(X, Y)$ be a random vector with joint distribution function $F$ and marginal distribution functions $F_X, F_Y$. Then $X$ and $Y$ are independent iff
$$F(x, y) = F_X(x)\, F_Y(y) \quad \text{for all } (x, y) \in \mathbb{R}^2.$$
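The factorization in Theorem 5.0.20 can be observed numerically: for samples of two independent random variables, the empirical joint distribution function is close to the product of the empirical marginals. The sketch below uses independent $U(0,1)$ samples and the evaluation point $(0.3, 0.7)$, both chosen arbitrarily for illustration.

```python
import random

random.seed(0)
n = 100_000
# Independent samples: X ~ U(0,1), Y ~ U(0,1) (chosen for illustration).
xs = [random.random() for _ in range(n)]
ys = [random.random() for _ in range(n)]

def emp_joint_cdf(x, y):
    # Empirical version of F(x, y) = P(X <= x, Y <= y).
    return sum(1 for a, b in zip(xs, ys) if a <= x and b <= y) / n

def emp_marginal_cdf(data, t):
    # Empirical version of F_X(x) (or F_Y(y)).
    return sum(1 for a in data if a <= t) / n

# For independent X and Y, F(x, y) = F_X(x) * F_Y(y);
# the empirical versions agree up to Monte Carlo error.
x0, y0 = 0.3, 0.7
joint = emp_joint_cdf(x0, y0)
product = emp_marginal_cdf(xs, x0) * emp_marginal_cdf(ys, y0)
print(joint, product)  # both close to 0.3 * 0.7 = 0.21
```

Repeating the experiment with dependent samples (e.g., `ys = xs`) makes the two quantities visibly disagree, which is exactly the content of the theorem.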

**Definition 5.5.** (joint pmf of a discrete random vector) Let $(X, Y)$ be a discrete random vector, i.e., $X$ and $Y$ are discrete random variables. Define $f_{X,Y} : \mathbb{R}^2 \to [0, 1]$ by
$$f_{X,Y}(x, y) = P(X = x, Y = y).$$
Then $f_{X,Y}$ is called the joint pmf of $(X, Y)$.
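The marginal pmfs follow from a joint pmf by summing over the other variable: $f_X(x) = \sum_y f_{X,Y}(x, y)$ and $f_Y(y) = \sum_x f_{X,Y}(x, y)$. A minimal sketch, using a hypothetical joint pmf table (the probabilities are made up for illustration):

```python
# Hypothetical joint pmf of (X, Y) on {0,1} x {0,1,2}, for illustration only.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

# Marginal pmfs: sum the joint pmf over the other coordinate.
f_X, f_Y = {}, {}
for (x, y), p in joint_pmf.items():
    f_X[x] = f_X.get(x, 0.0) + p
    f_Y[y] = f_Y.get(y, 0.0) + p

print(f_X)  # f_X(0) = 0.40, f_X(1) = 0.60
print(f_Y)  # f_Y(0) = 0.25, f_Y(1) = 0.45, f_Y(2) = 0.30
```

Each marginal again sums to 1, as a pmf must.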

**Definition 5.6.** (joint pdf of a continuous random vector) Let $(X, Y)$ be a continuous random vector (i.e., $X$ and $Y$ are continuous random variables) with joint distribution function $F$. If there exists a function $f_{X,Y} : \mathbb{R}^2 \to [0, \infty)$ such that
$$F(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(u, v)\, dv\, du,$$
then $f_{X,Y}$ is called the joint pdf of $(X, Y)$.

**Theorem 5.0.21** Let $(X, Y)$ be a continuous random vector with joint pdf $f_{X,Y}$. Then
$$P\big((X, Y) \in B\big) = \iint_B f_{X,Y}(x, y)\, dx\, dy \quad \text{for all } B \in \mathcal{B}(\mathbb{R}^2).$$

Proof. Note that the L.H.S. of the equality corresponds to the law of $(X, Y)$ (this is left to the student). Let $\mathcal{C}$ denote the set of all finite unions of rectangles in $\mathbb{R}^2$. Then $\mathcal{C}$ is a field and $\sigma(\mathcal{C}) = \mathcal{B}(\mathbb{R}^2)$. Define
$$\nu(B) = \iint_B f_{X,Y}(x, y)\, dx\, dy, \quad B \in \mathcal{B}(\mathbb{R}^2).$$
Then $\nu$ and $P_{(X,Y)}$ are probability measures on $\mathcal{B}(\mathbb{R}^2)$, and $\nu = P_{(X,Y)}$ on $\mathcal{C}$. Hence, by the uniqueness of extension theorem, we have $\nu = P_{(X,Y)}$ on $\mathcal{B}(\mathbb{R}^2)$, i.e.,
$$P\big((X, Y) \in B\big) = \iint_B f_{X,Y}(x, y)\, dx\, dy \quad \text{for all } B \in \mathcal{B}(\mathbb{R}^2).$$

If $f_X$ and $f_Y$ denote the marginal pdfs of $X$ and $Y$ respectively, then
$$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dy, \qquad f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dx.$$
For example, if $(X, Y)$ has the bivariate normal pdf
$$f_{X,Y}(x, y) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-\rho^2}} \exp\left( -\frac{1}{2(1-\rho^2)} \left[ \frac{(x-\mu_1)^2}{\sigma_1^2} - \frac{2\rho(x-\mu_1)(y-\mu_2)}{\sigma_1\sigma_2} + \frac{(y-\mu_2)^2}{\sigma_2^2} \right] \right),$$
then integrating over $y$ gives $f_X(x)$ equal to the $N(\mu_1, \sigma_1^2)$ density. Therefore $X \sim N(\mu_1, \sigma_1^2)$. Here $X \sim N(\mu, \sigma^2)$ means $X$ is normally distributed with mean $\mu$ and variance $\sigma^2$. Similarly, $Y \sim N(\mu_2, \sigma_2^2)$. Therefore the marginals of a bivariate normal vector are normal; for the computation, see exercise.
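The relation $f_X(x) = \int f_{X,Y}(x, y)\, dy$ can be checked numerically. The sketch below integrates a bivariate normal joint pdf over $y$ with the trapezoidal rule and compares the result with the $N(\mu_1, \sigma_1^2)$ density; the parameter values are chosen arbitrarily for illustration.

```python
import math

# Bivariate normal parameters (illustrative values).
mu1, mu2, s1, s2, rho = 0.0, 1.0, 1.0, 2.0, 0.5

def f_joint(x, y):
    # Bivariate normal joint pdf.
    q = ((x - mu1) ** 2 / s1 ** 2
         - 2 * rho * (x - mu1) * (y - mu2) / (s1 * s2)
         + (y - mu2) ** 2 / s2 ** 2)
    c = 2 * math.pi * s1 * s2 * math.sqrt(1 - rho ** 2)
    return math.exp(-q / (2 * (1 - rho ** 2))) / c

def f_X_numeric(x, lo=-20.0, hi=20.0, n=4000):
    # Marginal pdf of X: integrate the joint pdf over y (trapezoidal rule).
    h = (hi - lo) / n
    total = 0.5 * (f_joint(x, lo) + f_joint(x, hi))
    total += sum(f_joint(x, lo + i * h) for i in range(1, n))
    return total * h

def normal_pdf(x, mu, sigma):
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

x0 = 0.7
print(f_X_numeric(x0), normal_pdf(x0, mu1, s1))  # the two values agree
```

The agreement illustrates that integrating out $y$ from the bivariate normal leaves exactly the $N(\mu_1, \sigma_1^2)$ density, independently of $\rho$ and $\sigma_2$.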

**Theorem 5.0.22** Let $X$ and $Y$ be independent continuous random variables with pdfs $f_X$ and $f_Y$. Then the pdf of $X + Y$ is given by
$$f_{X+Y}(z) = (f_X * f_Y)(z),$$
where $f_X * f_Y$ denotes the convolution of $f_X$ and $f_Y$ and is defined as
$$(f_X * f_Y)(z) = \int_{-\infty}^{\infty} f_X(z - y)\, f_Y(y)\, dy.$$

Proof. Let $z \in \mathbb{R}$. Set $B = \{(x, y) \in \mathbb{R}^2 : x + y \le z\}$. Then
$$F_{X+Y}(z) = P\big((X, Y) \in B\big) = \iint_B f_X(x) f_Y(y)\, dx\, dy = \int_{-\infty}^{\infty} \left( \int_{-\infty}^{z - y} f_X(x)\, dx \right) f_Y(y)\, dy.$$
Therefore, differentiating with respect to $z$,
$$f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(z - y)\, f_Y(y)\, dy = (f_X * f_Y)(z).$$

**Example 5.0.35** Let $X$ and $Y$ be independent uniform random variables over $(0, 1)$, with pdfs $f_X$ and $f_Y$ respectively. Then
$$f_X(x) = \begin{cases} 1 & 0 < x < 1 \\ 0 & \text{otherwise}, \end{cases}$$
and $f_Y$ is given similarly. Now for $z \le 0$ or $z \ge 2$, clearly $f_{X+Y}(z) = 0$. For $0 < z < 2$,
$$f_{X+Y}(z) = \int_{-\infty}^{\infty} f_X(z - y)\, f_Y(y)\, dy = \int_{\max(0,\, z - 1)}^{\min(1,\, z)} dy = \begin{cases} z & 0 < z \le 1 \\ 2 - z & 1 < z < 2. \end{cases}$$
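The convolution formula for sums of independent random variables can be checked by simulation: a density estimate built from samples of $X + Y$, with $X, Y$ independent $U(0,1)$, should follow the triangular density $f(z) = z$ on $(0, 1]$ and $f(z) = 2 - z$ on $(1, 2)$. A minimal sketch (sample size and window width are illustrative):

```python
import random

random.seed(1)
n = 200_000
# X, Y independent U(0,1); Z = X + Y has the triangular pdf
# f(z) = z on (0, 1] and f(z) = 2 - z on (1, 2).
zs = [random.random() + random.random() for _ in range(n)]

def density_estimate(z, h=0.05):
    # Fraction of samples in (z - h, z + h), divided by the window width 2h.
    return sum(1 for v in zs if z - h < v < z + h) / (n * 2 * h)

print(density_estimate(0.5))  # close to f(0.5) = 0.5
print(density_estimate(1.0))  # close to f(1.0) = 1.0 (the peak)
print(density_estimate(1.5))  # close to f(1.5) = 0.5
```

The symmetry of the estimates around $z = 1$ reflects the symmetry of the convolution of two identical uniform densities.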

**Conditional Densities.** The notion of conditional densities is intended to quantify the dependence of one random variable on the other when the random variables are not independent.

**Definition 5.7.** Let $X, Y$ be two discrete random variables with joint pmf $f_{X,Y}$. Then the conditional density of $X$ given $Y = y$, denoted by $f_{X|Y}(\cdot \mid y)$, is defined as
$$f_{X|Y}(x \mid y) = P(X = x \mid Y = y), \quad \text{for each } y \text{ with } P(Y = y) > 0.$$
Intuitively, $f_{X|Y}(x \mid y)$ means that knowledge about the occurrence (or non-occurrence) of the event $\{Y = y\}$ gives information about $\{X = x\}$. One can rewrite $f_{X|Y}$ in terms of the pmfs as follows:
$$f_{X|Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}.$$
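The formula $f_{X|Y}(x \mid y) = f_{X,Y}(x, y) / f_Y(y)$ is easy to compute from a joint pmf table. A minimal sketch, using a hypothetical table (the probabilities are made up for illustration):

```python
# Hypothetical joint pmf of (X, Y) on {0,1} x {0,1}, for illustration only.
joint_pmf = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

# Marginal pmf of Y: sum the joint pmf over x.
f_Y = {}
for (x, y), p in joint_pmf.items():
    f_Y[y] = f_Y.get(y, 0.0) + p

def cond_pmf_X_given_Y(x, y):
    # f_{X|Y}(x | y) = f_{X,Y}(x, y) / f_Y(y), defined when f_Y(y) > 0.
    return joint_pmf.get((x, y), 0.0) / f_Y[y]

# For each fixed y, the conditional pmf is a genuine pmf: it sums to 1 over x.
print(cond_pmf_X_given_Y(0, 1), cond_pmf_X_given_Y(1, 1))
```

Note that conditioning renormalizes one row (or column) of the joint table so that it sums to 1.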

**Definition 5.8.** Let $X, Y$ be continuous random variables with joint pdf $f_{X,Y}$. The conditional distribution of $X$ given $Y = y$ is defined as
$$f_{X|Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}, \quad \text{whenever } f_Y(y) > 0.$$

**Example 5.0.36** Let $X$ be a uniform random variable over $(0, 1)$ and, given $X = x$, let $Y$ be a uniform random variable over $(x, 1)$, i.e.,
$$f_X(x) = \begin{cases} 1 & 0 < x < 1 \\ 0 & \text{otherwise}, \end{cases}$$
and the conditional density of $Y$ given $X = x$ is
$$f_{Y|X}(y \mid x) = \begin{cases} \dfrac{1}{1 - x} & x < y < 1 \\ 0 & \text{otherwise}, \end{cases}$$
i.e., $Y \mid X = x \sim U(x, 1)$. Also, the joint pdf is
$$f_{X,Y}(x, y) = f_{Y|X}(y \mid x)\, f_X(x) = \frac{1}{1 - x}, \quad 0 < x < y < 1.$$
Hence the marginal pdf of $Y$ is
$$f_Y(y) = \int_0^y \frac{1}{1 - x}\, dx = -\ln(1 - y), \quad 0 < y < 1.$$
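A two-stage construction of this kind is straightforward to simulate: first draw $X \sim U(0, 1)$, then draw $Y$ uniformly from $(X, 1)$. A density estimate of the simulated $Y$ values should then match the marginal $f_Y(y) = -\ln(1 - y)$. A minimal sketch of this check (sample size and window width are illustrative):

```python
import math
import random

random.seed(2)
n = 200_000
# Two-stage sampling: X ~ U(0,1); given X = x, Y ~ U(x, 1).
ys = []
for _ in range(n):
    x = random.random()
    ys.append(x + (1 - x) * random.random())

def density_estimate(y, h=0.02):
    # Fraction of samples in (y - h, y + h), divided by the window width 2h.
    return sum(1 for v in ys if y - h < v < y + h) / (n * 2 * h)

# The marginal pdf of Y is f_Y(y) = -ln(1 - y) on (0, 1).
y0 = 0.5
print(density_estimate(y0), -math.log(1 - y0))  # the two values agree
```

The estimate grows without bound as $y \uparrow 1$, consistent with the logarithmic singularity of $f_Y$ at $y = 1$.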