
Generating Random Numbers

Motivation

• Our simulator needs random variables
• Examples: arrival times, service times
• Random variables are simulated using random numbers
• We must therefore be able to generate random numbers
• We use a deterministic algorithm
• In fact, we generate pseudo-random numbers

Introduction to Simulation WS05/06 - L 06, Sanja Lazarova-Molnar

Properties of RNG

• A sequence of computer-generated RN must be
  • uniformly distributed on (0..1)
  • independent

• Counterexamples:

  0.11, 0.88, 0.23, 0.79, 0.15, 0.33, 0.93, 0.14
  0.11, 0.19, 0.23, 0.31, 0.45, 0.52, 0.66, 0.75

• Properties of a pseudo-random number generator:
  • Independence
  • Uniformity
  • Speed
  • Portability
  • Long cycle
  • Replicability

Linear Congruential Method

• The most important type of RNG is the Linear Congruential Method (LCM)
• The LCM is used in almost all simulators
• It (can) fulfil all the desired properties
• It is very easy to implement

• The LCM needs three integer parameters a, c and m
• It generates a sequence of integers 0 ≤ Xi < m:

  Xi = (a · Xi-1 + c) mod m

• The random numbers 0 ≤ Ri < 1 that we need are then

  Ri = Xi / m

• The starting value X0 is called the seed
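The recurrence above is small enough to sketch directly. This is a minimal generator following the slide's Xi = (a·Xi-1 + c) mod m and Ri = Xi/m; the parameters a = 13, c = 0, m = 64 are the deck's toy example, not production-quality constants.

```python
def lcg(a, c, m, seed):
    """Yield an endless stream of pseudo-random numbers R_i in [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m   # X_i = (a * X_{i-1} + c) mod m
        yield x / m           # R_i = X_i / m

gen = lcg(a=13, c=0, m=64, seed=1)
first = [next(gen) for _ in range(4)]
# X: 13, 41, 21, 17  ->  R: 13/64, 41/64, 21/64, 17/64
```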


Linear Congruential Method

• The sequence of RN is periodic
• The quality of the RN produced depends on a, c and m
• How to best choose a, c and m?
• The choice is made in order to achieve the maximum period P
• Different choices yield strongly differing P

• Example: a = 13, m = 64, c = 0, X0 = 1, 2, 3, and 4

  i           0   1   2   3   4   5   6   7   8   9  10  11  12  13  14  15  16
  Xi (X0=1)   1  13  41  21  17  29  57  37  33  45   9  53  49  61  25   5   1
  Xi (X0=2)   2  26  18  42  34  58  50  10   2
  Xi (X0=3)   3  39  59  63  51  23  43  47  35   7  27  31  19  55  11  15   3
  Xi (X0=4)   4  52  36  20   4
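A short sketch reproduces the table's point: for a = 13, c = 0, m = 64, the period depends strongly on the seed (16 for X0 = 1 and 3, but only 8 for X0 = 2 and 4 for X0 = 4).

```python
def period(a, c, m, seed):
    """Number of steps until X_i = (a*X_{i-1} + c) mod m returns to the seed."""
    x = (a * seed + c) % m
    steps = 1
    while x != seed:
        x = (a * x + c) % m
        steps += 1
    return steps

periods = {x0: period(13, 0, 64, x0) for x0 in (1, 2, 3, 4)}
# -> {1: 16, 2: 8, 3: 16, 4: 4}, matching the rows of the table
```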

Linear Congruential Method

• Examples:

  m = 2^b  ∧  c ≠ 0  ∧  a = 4k+1  ∧  c relatively prime to m
    ⇒  P = m

  m = 2^b  ∧  c = 0  ∧  X0 is odd  ∧  (a = 8k+3 or a = 8k+5)
    ⇒  P = m/4

• (There is lots of theory on random number generation!)

The Combined LCM

• Sometimes the period of an LCM is not long enough
• The combined LCM yields longer periods

• The CLCM is based on the following observation:
  • Given independent, discrete-valued RVs Wi,1, Wi,2 ... Wi,k
    with Wi,1 ~ U(0, m1-2),
  • then Wi = (Σj Wi,j) mod (m1-1) is ~ U(0, m1-2)


The Combined LCM

• Example:
  • Consider two dice W1, W2 ~ U[0..5]
  • The result for the sum is:

    xi      0  1  2  3  4  5  6  7  8  9  10
    p(xi)   1  2  3  4  5  6  5  4  3  2  1    (in 36ths)

  • Set p(yi) = Σ p(xj) over all xj with xj mod 6 = i, and compute p(yi):

    yi      0  1  2  3  4  5
    p(yi)   6  6  6  6  6  6    (in 36ths)

  • The result is ~ U[0..5]

• Definition of the CLCM:
  • Given Xi,1, Xi,2 ... Xi,k from different LCMs with c = 0,
    with mj prime and Pj = mj - 1
  • Compute Xi = (Σj (-1)^(j-1) · Xi,j) mod (m1 - 1)
  • Then the following period P can be achieved:

    P = (m1-1)(m2-1) ... (mk-1) / 2^(k-1)
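The two-dice observation is easy to verify exhaustively: (W1 + W2) mod 6 is exactly uniform on {0,...,5}, even though the raw sum W1 + W2 is not.

```python
from itertools import product

counts = [0] * 6
for w1, w2 in product(range(6), repeat=2):   # all 36 equally likely outcomes
    counts[(w1 + w2) % 6] += 1
# -> [6, 6, 6, 6, 6, 6], i.e. uniform on [0..5], as in the p(yi) table
```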

Tests for Random Numbers

• It is important that the RN generated are
  • uniformly distributed on (0..1)
  • independent

• There are many tests for checking these properties, including
  • Frequency test
  • Runs test
  • Autocorrelation test
  • Poker test

The Null Hypothesis

• The null hypothesis is an important concept in statistics
• We say for example: H0: Ri ~ U[0,1]
  "The numbers Ri are distributed according to U[0,1]"
• Our tests attempt to reject this null hypothesis
• Failure to reject H0 means that no evidence of non-uniformity could be found
• This does not imply that the Ri are uniformly distributed!


Frequency Tests

• Frequency tests test for uniformity
• They compare the RNs with the theoretical distribution
• There are two important tests:
  • Kolmogorov–Smirnov test
  • Chi-square test

Kolmogorov–Smirnov Test

• Given
  • sample random numbers to be tested: Ri, 1 ≤ i ≤ N
  • a proposed distribution F(x)

• Compute the sample distribution SN(x):

  SN(x) = (# Ri ≤ x) / N

• SN(x) should approximate F(x)
• Null hypothesis: H0: Ri ~ F

Kolmogorov–Smirnov Test

• Compute

  D = max | SN(x) - F(x) |

• The distribution of D is known when Ri ~ F
• Its values depend on N and α
• Choose the level of significance α
• Compare D with the critical value Dα from that distribution
• If D > Dα then reject H0

  (Figure: for N = 4, the step function SN(x) plotted against F(x)
  at the sample points R1 ... R4; D is the largest vertical gap.)
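The procedure above can be sketched for H0: Ri ~ U(0,1), where F(x) = x. The critical value 0.565 used below is the tabulated Dα for N = 4 and α = 0.10 (a standard K-S table lookup, assumed here rather than computed); the sample values are made up for illustration.

```python
def ks_statistic_uniform(sample):
    """D = max |S_N(x) - F(x)| for F = U(0,1), i.e. F(x) = x."""
    r = sorted(sample)
    n = len(r)
    d = 0.0
    for i, x in enumerate(r):
        # The empirical cdf jumps at x: compare F(x) with both step levels.
        d = max(d, abs((i + 1) / n - x), abs(i / n - x))
    return d

d = ks_statistic_uniform([0.44, 0.81, 0.14, 0.05])   # here D = 0.36
reject = d > 0.565   # compare with the critical value D_alpha
```

Since 0.36 < 0.565, H0 is not rejected for this small sample.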


Chi-Squared Test

• The Chi-squared test is very important
• It divides the sample data into k classes
• It uses the sample statistic

  χ0² = Σi (Oi - Ei)² / Ei

• where
  • Oi is the observed number of samples in class i
  • Ei is the expected number of samples in class i

• The value of χ0² is described by the χ² distribution
• The χ² distribution is a standard distribution
• It has a parameter f = k - 1, the degrees of freedom (d.o.f.)

• The Chi-squared test:
  • Compute χ0²
  • Choose α
  • Compare χ0² with the tabulated value χ²α,f
  • If χ0² > χ²α,f then reject H0
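The statistic is a one-liner once the classes are counted. A minimal sketch for uniformity testing with k equal-width classes on (0,1), where every Ei = N/k:

```python
def chi_squared_uniform(sample, k):
    """Return chi0^2 = sum (Oi - Ei)^2 / Ei for k equal classes on (0,1)."""
    observed = [0] * k
    for r in sample:
        observed[min(int(r * k), k - 1)] += 1   # class index of r
    expected = len(sample) / k                  # Ei is the same for every class
    return sum((o - expected) ** 2 / expected for o in observed)

# Perfectly balanced data gives chi0^2 = 0; in general, compare the result
# against the tabulated chi^2 value with f = k - 1 degrees of freedom.
stat = chi_squared_uniform([0.05, 0.15, 0.30, 0.45, 0.55, 0.70, 0.80, 0.95], 4)
```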

Chi-Squared Test

• Meaning of the chi-squared test:

  (Figure: the density of the χ²f distribution; the tabulated value
  χ²α,f cuts off an upper tail of probability α.)

Runs Test

• A run is a sequence of ascending or descending values

  0.1  0.2  0.4  0.6  0.5  0.3  0.2  0.1
     <    <    <    >    >    >    >

  One run of length 3 and one run of length 4

  0.1  0.5  0.4  0.6  0.2  0.9  0.7  0.8
     <    >    <    >    <    >    <

  Seven runs of length 1


Runs Test

• We can count
  • the number of runs
  • the lengths of runs
  • the number of runs above/below 0.5
  • ...

• In all cases, the procedure is:
  • Make the count
  • Choose the level of significance
  • Compare the count to the expected value using a known distribution

• Hypothesis: the set of numbers is independent

• Example: lengths of runs
• Yi = # of runs of length i in a sequence of length N:

  E(Yi) = 2/(i+3)! · [ N (i² + 3i + 1) - (i³ + 3i² - i - 4) ]

• Choose N = 100:

  E(Y1) = 33.42,  E(Y2) = 18.10,  E(Y3) = 5.30
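The counting step above can be sketched directly: classify each adjacent pair as ascending or descending, then tally the lengths of the maximal runs.

```python
def run_lengths(seq):
    """Return {length: count} of maximal ascending/descending runs."""
    signs = ['<' if b > a else '>' for a, b in zip(seq, seq[1:])]
    counts = {}
    i = 0
    while i < len(signs):
        j = i
        while j + 1 < len(signs) and signs[j + 1] == signs[i]:
            j += 1                              # extend the current run
        length = j - i + 1
        counts[length] = counts.get(length, 0) + 1
        i = j + 1
    return counts

# First example from the deck: one run of length 3 and one of length 4.
lengths = run_lengths([0.1, 0.2, 0.4, 0.6, 0.5, 0.3, 0.2, 0.1])
# -> {3: 1, 4: 1}
```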

Runs Test

• Compute:

  i   Ei      Oi   (Oi - Ei)² / Ei
  1   33.42   30   0.35
  2   18.10   14   0.93
  3    5.30    7   0.66
              Σ:   1.94

• Choose α = 0.1 and set f = k - 1 = 2:

  χ0² = 1.94 < χ²α,f = 4.61

• No reason to reject the hypothesis of independence

Autocorrelation

• Autocorrelation measures dependencies within a sequence

• Example:
  0.1  0.4  0.9  0.7  0.2  0.8  0.5  0.6  0.9
  (every third value is large)

• Positive autocorrelation: (Ri - 0.5) · (Ri+m - 0.5) > 0 on average
• Negative autocorrelation: (Ri - 0.5) · (Ri+m - 0.5) < 0 on average

• Method:
  • Compute (Ri - 0.5) · (Ri+m - 0.5) for different lags m
  • Compare with the known distribution
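The method above can be sketched as an average of the lag products; a clearly positive average suggests positive autocorrelation at that lag. Here it is applied to the deck's "every third value is large" example.

```python
def lag_products_mean(seq, m):
    """Mean of (Ri - 0.5) * (R_{i+m} - 0.5) over the sequence, at lag m."""
    prods = [(seq[i] - 0.5) * (seq[i + m] - 0.5) for i in range(len(seq) - m)]
    return sum(prods) / len(prods)

r = [0.1, 0.4, 0.9, 0.7, 0.2, 0.8, 0.5, 0.6, 0.9]   # every third value is large
at_lag_3 = lag_products_mean(r, 3)   # lag 3 pairs up the "large" positions
# positive, hinting at the period-3 dependency
```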


No Test is Perfect!

• No test for random numbers is perfect!
• Consider the sequence

  0.2  0.8  0.7  0.3  0.4  0.6  0.9  0.1

• This passes all our tests, but it is not independent!
• Take the numbers as (x,y) pairs:

  (0.2, 0.8)  (0.7, 0.3)  (0.4, 0.6)  (0.9, 0.1)

  (Figure: the four pairs plotted in the unit square; every pair lies
  on the diagonal y = 1 - x.)

• Plot the results for the LCM a = 37, c = 1, m = 64 as (x,y) pairs:

  (Figure: scatter plot of successive pairs in the unit square; the
  points form a regular lattice instead of filling the square.)

No Test is Perfect!

• The IBM generator RANDU: a = 2^16 + 3, c = 0, m = 2^31
• Plot the results in (x, y, z) triplets:

  (Figure: successive triplets plotted in the unit cube; all points
  fall on a small number of parallel planes — RANDU fails badly in
  three dimensions.)

Generating Random Variates

• How to generate random numbers of any distribution:

• Given:
  • a U(0,1) random number generator
  • a cdf F(x) that describes the desired distribution

• Then:
  1. Generate y ~ U(0,1)
  2. Compute x = F⁻¹(y)

• This is required when your simulator doesn't provide the
  distribution you need
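The two-step recipe is generic: any distribution can be sampled once its inverse cdf is available. A minimal sketch, where `inverse_cdf` is a caller-supplied callable (an assumption for illustration, not something named in the deck):

```python
import random

def random_variate(inverse_cdf):
    """Step 1: y ~ U(0,1).  Step 2: x = F^-1(y)."""
    y = random.random()
    return inverse_cdf(y)

# Toy example: U(0, 10) has cdf F(x) = x/10, so F^-1(y) = 10*y.
x = random_variate(lambda y: 10 * y)   # a sample in [0, 10)
```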


Generating Random Variates

• How to generate random numbers of any distribution:

  (Figure: a value y ~ U(0,1) on the vertical axis is mapped through
  the cdf F to x = F⁻¹(y) on the horizontal axis; the resulting x is
  distributed according to F.)

The Exponential Distribution

• The cdf of the exponential distribution:

  F(x) = 1 - e^(-λx)

• Inversion:

  y = 1 - e^(-λx)
  x = -(1/λ) · log(1 - y)
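The inversion just derived translates directly into code; the optional `u` argument (an addition for testability, not from the deck) lets a fixed U(0,1) value be supplied instead of a fresh draw.

```python
import math
import random

def exponential_variate(lam, u=None):
    """Draw x ~ Exp(lam) via inversion: x = -(1/lam) * log(1 - y)."""
    y = random.random() if u is None else u
    return -math.log(1.0 - y) / lam

# The median of Exp(lam) is log(2)/lam, so inverting y = 0.5 recovers it.
x = exponential_variate(lam=2.0, u=0.5)
# -> log(2)/2 ≈ 0.3466
```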

The Normal Distribution

• There is no closed form for the Normal cdf
• Solutions:
  • Approximate numerically
  • Approximate using alternative formulae
• The same is true for the lognormal distribution

The Weibull Distribution

• The cdf of the Weibull distribution:

  F(x) = 1 - e^(-(x/α)^β)

• Inversion:

  y = 1 - e^(-(x/α)^β)
  x = α · (-log(1 - y))^(1/β)
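The Weibull inversion is a direct analogue of the exponential case; as before, the optional `u` argument is an addition for testability, not part of the deck.

```python
import math
import random

def weibull_variate(alpha, beta, u=None):
    """Draw x ~ Weibull(scale=alpha, shape=beta) via inversion."""
    y = random.random() if u is None else u
    return alpha * (-math.log(1.0 - y)) ** (1.0 / beta)

# With beta = 1 the Weibull reduces to an exponential with rate 1/alpha,
# so inverting y = 0.5 with alpha = 1 gives the exponential median log(2).
x = weibull_variate(alpha=1.0, beta=1.0, u=0.5)
```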


RN in the Simulation

• How does the simulator use the RN?
• Consider this simple model, starting in state A:

  (Figure: a small state model with state A and transitions driven by
  the random variables X and Y.)

• RV X is "fast", RV Y is "slow":

  (Figure: the cdf of X rises much earlier than the cdf of Y.)

• Sample each RV and choose the smaller value:

  (Figure: one sample drawn from each cdf; the sample of X is almost
  always the smaller one.)

• In this case, almost always, X will be chosen
• When will the simulator choose Y?
• Many replications will be needed!

