
Lecture 1

Mariana Olvera-Cravioto

Columbia University
molvera@ieor.columbia.edu

September 9th, 2015



Introduction

- A stochastic process is a collection of random variables, e.g.,

  {X_n, n ∈ N} or {Y_t, t ∈ R}.

- The index can be discrete or continuous.
- The random variables in the collection can be discrete or continuous.
- A stochastic process is usually used to describe the evolution of some random quantity, e.g., the indexing could represent time.
- The random variables in the collection are usually dependent.
- Our goal is usually to analyze the behavior of the stochastic process, either of specific subsets of random variables in the collection or of the limiting behavior of the entire process (if it exists); see the simulation sketch below.
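
As a concrete illustration, here is a minimal Python sketch (not from the lecture) that simulates one path of a simple random walk, a canonical discrete-time stochastic process {X_n, n ∈ N}; the function name and parameters are illustrative.

```python
# Minimal sketch (illustrative): one sample path of a simple random walk,
# a discrete-time stochastic process whose random variables are dependent.
import random

def random_walk(n_steps, p=0.5, seed=None):
    """Return [X_0, X_1, ..., X_n] where X_{k+1} = X_k + 1 w.p. p, else X_k - 1."""
    rng = random.Random(seed)
    path = [0]
    for _ in range(n_steps):
        path.append(path[-1] + (1 if rng.random() < p else -1))
    return path

print(random_walk(10, seed=42))  # one realization of the first 11 values
```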



Some examples

- The sequence of maximum temperatures in NY over a year.
- The number of people who board the subway at 116th Street on each train throughout a day.
- The number of red cars that cross the George Washington Bridge during a month.
- The number of Facebook users throughout a year.
- The price of a stock over a three-month period.
- A person’s monthly credit card bill from month to month.
- The number of iPads in inventory at the 59th Street Apple store from day to day.



Probabilistic models

- Definition: An experiment is an event whose outcome is not known with certainty.
- Definition: The set of possible outcomes of an experiment is called the sample space, which we will denote S. The outcomes themselves are called sample points.
- Examples:
  - Flipping a coin ⇒ S = {H, T}
  - Tossing a 6-sided die ⇒ S = {1, 2, 3, 4, 5, 6}
  - Flipping a coin 10 times ⇒ S = {all “words” of length 10 that have letters H and T only}
  - The time it will take for your call to be picked up by an airline’s call center ⇒ S = [0, ∞)
  - The score in the next Knicks game ⇒ S = {(x, y) : x, y are nonnegative integers}



Probability laws

- A probability law P assigns to each event A ⊆ S a value in [0, 1].
- Let Ω = S be the universe and Ø denote the empty set.
- Axioms: Let A, B ⊆ Ω.
  - P(A) ≥ 0
  - If A ∩ B = Ø, then P(A ∪ B) = P(A) + P(B)
  - P(Ω) = 1
- Other properties (checked numerically in the sketch below): Let A, B ⊆ Ω.
  - If A ⊆ B, then P(A) ≤ P(B)
  - P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
  - P(Ø) = 0
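
A small numerical check of these properties, using a fair 6-sided die as the sample space; the helper function P and the events A and B are assumed examples, not from the lecture.

```python
# Illustrative check of the axioms and properties on a finite sample space
# (a fair 6-sided die), where P(A) = |A| / |Omega| for equally likely outcomes.
from fractions import Fraction

omega = frozenset({1, 2, 3, 4, 5, 6})

def P(event):
    return Fraction(len(event & omega), len(omega))

A = {1, 2}        # toss is 1 or 2
B = {2, 4, 6}     # toss is even

assert P(A) >= 0 and P(omega) == 1            # axioms
assert P(A | B) == P(A) + P(B) - P(A & B)     # inclusion-exclusion
assert P({2}) <= P(B)                         # {2} ⊆ B implies P({2}) ≤ P(B)
assert P(set()) == 0                          # P(Ø) = 0
```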



Independence and conditional probabilities

- Events A and B are independent if

  P(A ∩ B) = P(A)P(B)

- For any events A and B in the sample space, with P(B) > 0, the conditional probability of A given B is defined as

  P(A|B) = P(A ∩ B) / P(B)

- Conditional probabilities specify a probability law; the sketch below computes one on a finite sample space.
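
An illustrative computation (the setup is assumed, not from the lecture): two fair dice, with A = “the sum is 7” and B = “the first die shows 3”. Here A and B happen to be independent.

```python
# Conditional probability and independence on the sample space of two dice.
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))   # all 36 equally likely outcomes

def P(pred):
    return Fraction(sum(1 for w in omega if pred(w)), len(omega))

def P_cond(pred_a, pred_b):
    """P(A | B) = P(A and B) / P(B), assuming P(B) > 0."""
    return P(lambda w: pred_a(w) and pred_b(w)) / P(pred_b)

A = lambda w: w[0] + w[1] == 7   # sum is 7
B = lambda w: w[0] == 3          # first die shows 3

print(P_cond(A, B))              # 1/6
print(P(lambda w: A(w) and B(w)) == P(A) * P(B))  # True: A, B independent
```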



Random variables and their properties

- Definition: A random variable is a function that assigns a real number to each point in the sample space:

  X : S → R

- Examples:
  - Tossing a 6-sided die: X = result of the toss, X ∈ {1, 2, 3, 4, 5, 6}
  - The weather tomorrow: let X = 1 if it rains and X = 0 if it does not rain; X ∈ {0, 1}
  - Let X be the time you will have to wait for the subway next time you take it; X ∈ [0, ∞)
  - Let X be the Knicks’ score in their next game and let Y be that of their opponent. Let Z = (X, Y) be the overall score of the game; Z ∈ {(x, y) : x, y are nonnegative integers}



Indicator random variables

- In general, for any event A, the random variable

  X = 1 if A happens, and X = 0 if A does not happen (i.e., A^c happens),

  is known as an “indicator” random variable.
- Indicator random variables are very useful in computing probabilities, since

  E[X] = 0 · P(A^c) + 1 · P(A) = P(A)

  (see the Monte Carlo sketch below).
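
Because E[X] = P(A), averaging many independent copies of the indicator estimates P(A). A minimal Monte Carlo sketch under an assumed event A = “two dice sum to 7”:

```python
# Estimate P(A) as the sample mean of indicator variables, using E[1_A] = P(A).
import random

rng = random.Random(42)

def indicator_of_A():
    d1, d2 = rng.randint(1, 6), rng.randint(1, 6)
    return 1 if d1 + d2 == 7 else 0     # 1 if A happens, 0 otherwise

n = 100_000
estimate = sum(indicator_of_A() for _ in range(n)) / n
print(estimate)  # close to 6/36 ≈ 0.1667
```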



Distribution functions

- Definition: The (cumulative) distribution function, CDF, of a random variable X is defined as

  F(x) = P(X ≤ x),  −∞ < x < ∞

- Properties:
  - 0 ≤ F(x) ≤ 1 for all x
  - F(x) is nondecreasing (i.e., if x_1 < x_2 then F(x_1) ≤ F(x_2))
  - lim_{x→∞} F(x) = 1
  - lim_{x→−∞} F(x) = 0

All random variables have a CDF!
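
For example, the CDF of a fair 6-sided die is a step function; the small helper below (hypothetical, not from the lecture) evaluates it and exhibits the listed properties.

```python
# CDF of a fair die: F(x) = P(X <= x) jumps by 1/6 at each of 1, 2, ..., 6.
import math

def die_cdf(x):
    return min(max(math.floor(x), 0), 6) / 6

for x in (-1.0, 0.5, 1.0, 2.7, 6.0, 10.0):
    print(x, die_cdf(x))   # 0, 0, 1/6, 2/6, 1, 1 — nondecreasing, limits 0 and 1
```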



Discrete random variables

- A random variable X is said to be discrete if it can take on at most a countable number of values, say x_1, x_2, x_3, . . .
- Examples:
  - the outcome of the toss of a die
  - the outcome of the flip of a coin
  - the number of “heads” in the first 10 tosses of a coin
  - the number of cars that pass in front of Campus Walk and Broadway between 8:00 am and 6:00 pm
  - the number of points that the Giants can score in their next game



Probability mass function
- The probability that a discrete random variable X takes on the value x_i is given by

  p(x_i) = P(X = x_i),  i = 1, 2, . . .

- It follows that each p(x_i) is a probability, and the values must satisfy

  ∑_{i=1}^∞ p(x_i) = 1

- The function p(x) is called the probability mass function, PMF.
- For any A ⊆ {x_1, x_2, . . . } we have

  P(X ∈ A) = ∑_{x_i ∈ A} p(x_i)

  and in particular,

  F(x) = P(X ≤ x) = ∑_{x_i ≤ x} p(x_i)
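
A minimal sketch (the loaded-die PMF below is an assumed example, not from the lecture) that stores a PMF as a dictionary and computes P(X ∈ A) and F(x) by summing it:

```python
# A PMF as a dict; P(X in A) and F(x) are sums of p(x_i) over the support.
from fractions import Fraction

pmf = {1: Fraction(1, 4), 2: Fraction(1, 4), 3: Fraction(1, 8),
       4: Fraction(1, 8), 5: Fraction(1, 8), 6: Fraction(1, 8)}

assert sum(pmf.values()) == 1          # the p(x_i) must sum to 1

def prob_in(A):
    """P(X in A) = sum of p(x_i) over x_i in A."""
    return sum(p for x, p in pmf.items() if x in A)

def F(x):
    """CDF from the PMF: F(x) = sum of p(x_i) over x_i <= x."""
    return sum(p for xi, p in pmf.items() if xi <= x)

print(prob_in({2, 4, 6}))   # P(X even) = 1/2
print(F(3))                 # 1/4 + 1/4 + 1/8 = 5/8
```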




Using the PMF to compute probabilities

Experiment 1: Toss a fair coin 10 times. What is the probability that there will be at most 2 heads?

Let Y = number of heads in 10 tosses of the coin, Y ∈ {0, 1, 2, . . . , 10}. We want to compute P(Y ≤ 2):

P(Y ≤ 2) = p(0) + p(1) + p(2)
         = (1/2)^10 + C(10,1) (1/2)^9 (1/2) + C(10,2) (1/2)^8 (1/2)^2
         = (1/2)^10 [1 + 10 + 45]
         = 56/2^10 = 7/2^7

where C(n, k) denotes the binomial coefficient.
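
A quick check of this arithmetic (illustrative, standard library only):

```python
# Verify P(Y <= 2) = 7/2^7 for Y ~ Binomial(10, 1/2).
from math import comb
from fractions import Fraction

p_at_most_2 = sum(comb(10, k) * Fraction(1, 2)**10 for k in range(3))
print(p_at_most_2)                    # 7/128
assert p_at_most_2 == Fraction(7, 2**7)
```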




Experiment 2: Toss a fair 6-sided die until the first 6 shows up. What is
the probability that we will need to toss the die at least 4 times?
Let X = number of tosses required to get the first 6, X ∈ {1, 2, 3, . . . }. We want to compute P(X ≥ 4):

P(X ≥ 4) = 1 − P(X < 4)
         = 1 − P(X ≤ 3)
         = 1 − p(1) − p(2) − p(3)
         = 1 − 1/6 − (5/6)(1/6) − (5/6)^2 (1/6)
         = (5/6)^3 = 5^3/6^3
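
Again a quick check (illustrative): for X geometric with p = 1/6, P(X ≥ 4) is also the probability of seeing no 6 in the first three tosses.

```python
# Verify P(X >= 4) = (5/6)^3 for X geometric with success probability 1/6.
from fractions import Fraction

p = Fraction(1, 6)                               # P(6 on a single toss)
p_at_least_4 = 1 - sum((1 - p)**(k - 1) * p for k in (1, 2, 3))
print(p_at_least_4)                              # 125/216
assert p_at_least_4 == Fraction(5, 6)**3
```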



Brand name discrete random variables
- Discrete uniform (parameter n): n equally likely values, x ∈ {1, 2, . . . , n}.

  p(x) = 1/n

- Binomial (parameters n, p): number of successes in n independent repetitions of an experiment, x ∈ {0, 1, . . . , n}.

  p(x) = C(n, x) p^x (1 − p)^{n−x}

- Geometric (parameter p): number of trials required to obtain the first success, x ∈ {1, 2, . . . }.

  p(x) = (1 − p)^{x−1} p

- Poisson (parameter λ): x ∈ {0, 1, 2, . . . }.

  p(x) = e^{−λ} λ^x / x!
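
The four PMFs, transcribed directly into Python (an illustrative reference sketch, not course code):

```python
# The four "brand name" discrete PMFs, with simple sanity checks.
from math import comb, exp, factorial

def uniform_pmf(x, n):
    return 1 / n if 1 <= x <= n else 0.0

def binomial_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

def geometric_pmf(x, p):
    return (1 - p)**(x - 1) * p

def poisson_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

# each PMF sums to (approximately) 1 over its support
assert abs(sum(binomial_pmf(x, 10, 0.3) for x in range(11)) - 1) < 1e-12
assert abs(sum(poisson_pmf(x, 2.5) for x in range(100)) - 1) < 1e-12
```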

