Jul 21, 2008 · Attribution Non-Commercial (BY-NC)

Prof C S P Rao

Dept. of Mechanical Engg

N I T, Warangal

Who was Markov?

Andrei A. Markov graduated from Saint Petersburg University in 1878 and subsequently became a professor there. His early work was in number theory and analysis; he is best remembered today for the study of Markov chains.

Markov Process

A Markov process is a stochastic process with the Markov property: the distribution of future states depends only on the present state, and not on how the process arrived in the present state.

Terminology

A Markov chain is irreducible if every state can be reached from every other state.

Each state of a Markov chain is either transient or recurrent.

A state is called absorbing if, once the chain reaches that state, it stays there forever.

A Markov chain is acyclic if, once it leaves any state, it never returns to that state.

Markov Property

Many systems have the property that, given the present state, the past states have no influence on the future. This property is called the Markov property. A Markov model is any model or simulation that satisfies the Markov property.

Markov Chain

Let {Xt : t in T} be a stochastic process with discrete state space S and discrete time index set T. If it satisfies

P(Xn+1 = j | Xn = i, Xn-1 = in-1, · · ·, X0 = i0) = P(Xn+1 = j | Xn = i)

for any states i0, i1, · · ·, in-1, i, j in S and any n ≥ 0, it is called a Markov chain.

Markov Model

The term Markov model sometimes restricts attention to Markov chains with stationary transition probabilities, but some people avoid this usage to prevent confusion. Markov model is also used to refer to any Markov process, i.e., any process satisfying the Markov property.

Components of Stochastic Processes

State space: the set of all values that the Xt's can take (we will be concerned with stochastic processes with a finite number of states).

Time: t = 0, 1, 2, . . .

State: an m-dimensional vector, s = (s1, s2, . . . , sm) or (s0, s1, . . . , sm-1).

In the sequence Xt, each Xt takes one of the m values, so Xt corresponds to a state s.

Markov Chain Definition

The future behavior of the system depends only on the current state i and not on any of the previous states.

Discrete-Time Markov Chains

Discrete-Time Markov Chain

A stochastic process { Xn }, where n ∈ N = { 0, 1, 2, . . . }, is called a discrete-time Markov chain if

Pr{ Xn+1 = j | X0 = k0, . . . , Xn-1 = kn-1, Xn = i } = Pr{ Xn+1 = j | Xn = i }.

The future behavior of the system depends only on the current state i and not on any of the previous states.

Stationary Transition Probabilities

Pr{ Xn+1 = j | Xn = i } = Pr{ X1 = j | X0 = i } for all n

(the transition probabilities don't change over time). We will only consider stationary Markov chains. The transition probability matrix of a chain with states S = { 0, 1, 2 } is

        p00 p01 p02
P  =    p10 p11 p12
        p20 p21 p22
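As a small sketch of how such a matrix can be represented and simulated (the 3×3 matrix and helper names below are illustrative assumptions, not the slides' example):

```python
import random

# A row-stochastic matrix: every row is a probability distribution.
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2]]

for row in P:
    assert abs(sum(row) - 1.0) < 1e-9 and all(p >= 0 for p in row)

def simulate(P, start, steps, rng=random.Random(0)):
    """Sample X0, X1, ..., X_steps; the next state depends only on the
    current one (Markov property), via the current row of P."""
    path, state = [start], start
    for _ in range(steps):
        state = rng.choices(range(len(P)), weights=P[state])[0]
        path.append(state)
    return path

path = simulate(P, start=0, steps=10)
```

Because the same matrix P is used at every step, this simulation is a stationary chain by construction.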

Example

Consider a single machine that is prone to failures. At any instant of time the machine is either working properly (state '0') or failed (state '1'), and it is examined once every hour.

Let 'a' be the probability that the machine fails in a given hour, and 'b' the probability that the failed machine gets repaired in a given hour.

More specifically, 'a' is the probability that the machine is in the failed condition by the next observation given that it is working in the current observation, and 'b' is the probability that the machine gets repaired by the next observation given that it is in the failed state in the current observation.

These probabilities remain the same irrespective of the history of the working of the machine, and they are the same at any observation epoch.

Formulation of the system

The above system can be formulated as a homogeneous DTMC whose observation instants t0, t1, t2, . . . correspond to 0, 1h, 2h, . . . respectively.

        1-a   a
P  =                 0 < a, b < 1
         b   1-b

DTMC model for a single machine system
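A quick check of this two-state chain: its steady-state distribution has the closed form pi = (b/(a+b), a/(a+b)). The particular values of a and b below are illustrative assumptions, not from the slides.

```python
a, b = 0.1, 0.6   # P(fail in an hour), P(repair in an hour) -- assumed values
P = [[1 - a, a], [b, 1 - b]]

# Closed-form stationary distribution of the two-state chain.
pi = (b / (a + b), a / (a + b))

# Verify the defining property pi * P = pi.
for j in range(2):
    flow = sum(pi[i] * P[i][j] for i in range(2))
    assert abs(flow - pi[j]) < 1e-12
```

In the long run the machine is working a fraction b/(a+b) of the hours, regardless of its initial state.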

Example

Consider an NC machine that processes parts belonging to one of three part types. Assume that raw parts of each type are always available and wait in separate queues.

We consider the system evolution as constituting a homogeneous Markov chain, with state space S = {1, 2, 3}, where each state corresponds to the type of part getting processed.

The observation epochs t0, t1, t2, . . . are given by the time instants at which the ith part (i = 1, 2, 3, . . .) is taken up for processing.

An NC machine processing three types of parts

Suppose that the NC machine processes the parts in a cyclic way, i.e. a type 1 part followed by a type 2 part, followed by a type 3 part, with the same sequence repeated again and again. The state transition diagram is shown below. The TPM of the system is

        0 1 0
P  =    0 0 1
        1 0 0

Suppose instead that the type of the next part to be processed is selected probabilistically: part type i is chosen next with probability qi (i = 1, 2, 3), where 0 < qi < 1 and q1 + q2 + q3 = 1. In this case, the state transition diagram and TPM are shown below.

        q1 q2 q3
P  =    q1 q2 q3
        q1 q2 q3
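Because every row of this TPM equals (q1, q2, q3), the chain is in its stationary distribution q after a single step: q P = q. A quick numerical check (the q values below are illustrative assumptions):

```python
q = [0.5, 0.3, 0.2]        # assumed part-selection probabilities, sum to 1
P = [q, q, q]              # identical rows, as in the TPM above

# Stationarity check: q * P = q, since sum_i q_i * q_j = q_j.
for j in range(3):
    assert abs(sum(q[i] * P[i][j] for i in range(3)) - q[j]) < 1e-12
```

Contrast this with the cyclic TPM, which is periodic: it has the same long-run fractions (1/3 each) but the state distribution never settles down from a fixed start.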

Example 3

Consider the queuing network shown in the figure below. After transportation by the AGV, a part leaves the system with probability q0 or undergoes one more operation on one of the machines with probabilities q1, q2, . . . , qm.

Let there be a single job in this system and let us look at the progress of this job in the system. It is easy to note that the job can be in (m+1) states:

0 (if getting serviced by the AGV)
i (if getting served by Mi, i = 1, 2, . . . , m)

The flow of the job can be described by a DTMC model. The TPM of this model is

        q0 q1 q2 .... qm
         1  0  0 ....  0
P  =     1  0  0 ....  0
        .................
         1  0  0 ....  0

DTMC model for central server network

An Example Problem -- Gambler's Ruin

At time zero, I have X0 = $2, and each day I make a $1 bet. I win with probability p and lose with probability 1 - p. I'll quit if I ever obtain $4 or if I lose all my money.

Let Xt = amount of money I have after the bet on day t.

         3 with probability p
X1 =
         1 with probability 1 - p

If Xt = 4 then Xt+1 = Xt+2 = ··· = 4, and if Xt = 0 then Xt+1 = Xt+2 = ··· = 0.

Property of Transition Matrix

Each entry pij is a probability (the chain must go somewhere, so pij ≥ 0 and each row sums to 1), and the matrix does not change with time.

Transition Matrix of the Gambler’s

problem

At time zero I have X0 = $2, and each day I make a $1 bet.

I win with probability p and lose with probability 1– p.

I’ll quit if I ever obtain $4 or if I lose all my money.

Xt = amount of money I have after the bet on day t.

0 1 2 3 4

0 1 0 0 0 0

1 1-p 0 p 0 0

2 0 1-p 0 p 0

3 0 0 1-p 0 p
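A sketch (not on the slides) of computing the chance of reaching $4 before ruin: iterate the relation h(i) = p·h(i+1) + (1-p)·h(i-1) with boundary values h(0) = 0 and h(4) = 1 until it converges.

```python
def win_probability(p, goal=4, tol=1e-12):
    """Probability of hitting `goal` before 0, starting from each state."""
    h = [0.0] * (goal + 1)
    h[goal] = 1.0
    while True:                      # Gauss-Seidel sweeps until convergence
        delta = 0.0
        for i in range(1, goal):
            new = p * h[i + 1] + (1 - p) * h[i - 1]
            delta = max(delta, abs(new - h[i]))
            h[i] = new
        if delta < tol:
            return h

h = win_probability(p=0.5)
# For a fair bet, the chance of reaching $4 from $2 is exactly 1/2.
assert abs(h[2] - 0.5) < 1e-9
```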

State Transition Diagram

Node for each state, arc from node i to node j if pij > 0: states 0 through 4, with an arc i → i+1 of probability p and an arc i → i-1 of probability 1-p for i = 1, 2, 3, and self-loops of probability 1 at the absorbing states 0 and 4.

Printer Repair Problem

• Two printers are used for Russ Center.
• If both work in the morning, there is a 30% chance that one will fail by evening and a 10% chance that both will fail.
• If only one printer is working at the start of the day, there is a 20% chance that it will fail by the close of business.
• If neither printer is working, the office sends its work to a printing service.
• A failed printer is repaired overnight and is returned early the next day.

States for Printer Repair Example

0: s0 = (0). No printers have failed; the office starts the day with both printers functioning properly.
1: s1 = (1). The office starts the day with one working printer; the other is in the shop until the next morning.
2: s2 = (2). Both printers have failed; work must be sent out for the day.

Events and Probabilities for Printer Repair Example

Current state 0: neither printer fails (probability 0.6, next state 0); one printer fails (0.3, next state 1); both printers fail (0.1, next state 2).
Current state 1: the working printer does not fail and the other is returned (0.8, next state 0); the working printer fails and the other is returned (0.2, next state 1).
Current state 2: both printers are returned (1.0, next state 0).

State-Transition Matrix and Network

State-Transition Matrix: the major properties of a Markov chain can be described by the m × m matrix P = (pij). For the printer repair example:

        0.6 0.3 0.1
P  =    0.8 0.2  0
         1   0   0

State-Transition Network: a node for each state and an arc from node i to node j if pij > 0. For the printer repair example: 0 → 0 (0.6), 0 → 1 (0.3), 0 → 2 (0.1), 1 → 0 (0.8), 1 → 1 (0.2), 2 → 0 (1).
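A small sketch (not from the slides) of finding this chain's steady-state distribution by power iteration, i.e. repeatedly applying pi ← pi P:

```python
P = [[0.6, 0.3, 0.1],
     [0.8, 0.2, 0.0],
     [1.0, 0.0, 0.0]]

pi = [1.0, 0.0, 0.0]          # any starting distribution works
for _ in range(200):          # iterate pi <- pi * P until it settles
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

# The exact stationary distribution is (40/59, 15/59, 4/59).
```

So in the long run both printers are up about 68% of mornings, and all work is sent out about 7% of the time.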

Market Share/Brand Switching Problem

Market Share Problem: you are given the original market of three companies. The following table gives the number of consumers that switched from brand i to brand j in two consecutive weeks.

Brand (i)    1    2    3   Total
1           90    7    3    100
2            5  205   40    250
3           30   18  102    150
Total      125  230  145    500

How do we model the problem as a stochastic process?

Empirical Transition Probabilities for Brand Switching, pij

Brand (i)        1                2                3
1          90/100 = 0.90    7/100 = 0.07     3/100 = 0.03
2           5/250 = 0.02  205/250 = 0.82    40/250 = 0.16
3          30/150 = 0.20   18/150 = 0.12   102/150 = 0.68
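The empirical transition matrix is simply each row of the count table divided by its row total, which is worth automating:

```python
counts = [[90, 7, 3],        # brand-switching counts from the table above
          [5, 205, 40],
          [30, 18, 102]]

# Divide each row by its total to get empirical transition probabilities.
P = [[c / sum(row) for c in row] for row in counts]

assert [round(p, 2) for p in P[0]] == [0.90, 0.07, 0.03]
assert [round(p, 2) for p in P[1]] == [0.02, 0.82, 0.16]
assert [round(p, 2) for p in P[2]] == [0.20, 0.12, 0.68]
```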

Assumption Revisited

• Markov Property
Pr{ Xt+1 = j | X0 = k0, . . . , Xt-1 = kt-1, Xt = i } = Pr{ Xt+1 = j | Xt = i }

• Stationary Property
Pr{ Xt+1 = j | Xt = i } = Pr{ X1 = j | X0 = i } for all t. We will only consider stationary Markov chains.

Transform a Process to a Markov Chain

A process whose future depends on more than just the present state can be transformed into a Markov chain by expanding the state space.

Example: Suppose that the chance of rain tomorrow depends on the weather conditions for the previous two days (yesterday and today). Specifically,

P{ rain tomorrow | rain last 2 days (RR) } = .7
P{ rain tomorrow | rain today but not yesterday (NR) } = .5
P{ rain tomorrow | rain yesterday but not today (RN) } = .4
P{ rain tomorrow | no rain in last 2 days (NN) } = .2

The Weather Prediction Problem

         0 (RR)  1 (NR)  2 (RN)  3 (NN)
0 (RR)     .7      0       .3      0
1 (NR)     .5      0       .5      0
2 (RN)     0       .4      0       .6
3 (NN)     0       .2      0       .8

Choosing Balls from an Urn

An urn contains two unpainted balls at present. We choose a ball at random and flip a coin. If the coin comes up heads, we paint the chosen unpainted ball red; if it comes up tails, we paint the chosen unpainted ball blue. If the ball has already been painted, then (whether heads or tails has been tossed) we change the color of the ball (from red to blue or from blue to red).

(Represent it using a state diagram and transition matrix.)

Insurance Company Example

An insurance company charges customers annual premiums based on their accident history in the following fashion:

• No accidents in the last 2 years: $250 annual premium
• Accident in only 1 of the last 2 years: $400 annual premium
• Accidents in each of the last 2 years: $800 annual premium

Historical statistics:

1. If a customer had an accident last year, then they have a 10% chance of having one this year;
2. If they had no accident last year, then they have a 3% chance of having one this year.

Find the steady-state probability and the long-run average annual premium paid by the customer.

The states are (N, N), (N, Y), (Y, N), (Y, Y), where these indicate (accident last year, accident this year).

State-Transition Network for Insurance Company

From (N,N): to (N,N) with .97, to (N,Y) with .03. From (N,Y): to (Y,N) with .90, to (Y,Y) with .10. From (Y,N): to (N,N) with .97, to (N,Y) with .03. From (Y,Y): to (Y,N) with .90, to (Y,Y) with .10.

All states communicate (irreducible); each state is recurrent (you will return, eventually); each state is aperiodic.

Solving the steady-state equations:

π(N,N) = 0.97 π(N,N) + 0.97 π(Y,N)
π(N,Y) = 0.03 π(N,N) + 0.03 π(Y,N)
π(Y,N) = 0.9 π(N,Y) + 0.9 π(Y,Y)
π(Y,Y) = 0.1 π(N,Y) + 0.1 π(Y,Y)
π(N,N) + π(N,Y) + π(Y,N) + π(Y,Y) = 1

Solution:
π(N,N) = 0.939, π(N,Y) = 0.029, π(Y,N) = 0.029, π(Y,Y) = 0.003

Long-run average annual premium:
0.939×250 + 0.029×400 + 0.029×400 + 0.003×800 ≈ $260.5
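The steady state and the premium above can be checked numerically; a sketch using power iteration (the solution method here is an assumption, the premiums are the slide's):

```python
P = {  # rows indexed by (accident last year, accident this year)
    'NN': {'NN': 0.97, 'NY': 0.03},
    'NY': {'YN': 0.90, 'YY': 0.10},
    'YN': {'NN': 0.97, 'NY': 0.03},
    'YY': {'YN': 0.90, 'YY': 0.10},
}
states = ['NN', 'NY', 'YN', 'YY']
pi = {s: 0.25 for s in states}
for _ in range(500):          # power iteration: pi <- pi * P
    pi = {j: sum(pi[i] * P[i].get(j, 0.0) for i in states) for j in states}

premium = {'NN': 250, 'NY': 400, 'YN': 400, 'YY': 800}
average_premium = sum(pi[s] * premium[s] for s in states)   # about $260.48
```

The exact solution is pi = (291, 9, 9, 1)/310, which gives an average premium of $80,750/310 ≈ $260.48 per year.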

First Passage Times

Let µij = expected number of steps to transition from state i to state j. If the probability of ever visiting state j given that we start in i is less than one, then we will have µij = +∞.

In the gambler's ruin example, µ20 = +∞ because there is a positive probability that we will be absorbed in state 4 given that we start in state 2 (and hence never visit state 0).

Computations for All States Recurrent

If the probability of eventually visiting state j given that we start in i is 1, then the expected number of steps until we first visit j is given by

µij = 1 + Σr≠j pir µrj

We always take at least one step; with probability pir we move to state r, and it then takes µrj more steps from r to j. For fixed j this gives m equations in the m unknowns µij, i = 0, 1, . . . , m-1.

First-Passage Analysis for Insurance Company

Suppose that we start in state (N,N) and want to find the expected number of years until we have accidents in two consecutive years (Y,Y). For convenience, number the states 0 = (N,N), 1 = (N,Y), 2 = (Y,N), 3 = (Y,Y), and use

              (N,N) (N,Y) (Y,N) (Y,Y)
      (N,N)    .97   .03    0     0
P  =  (N,Y)     0     0    .90   .10
      (Y,N)    .97   .03    0     0
      (Y,Y)     0     0    .90   .10

µ03 = 1 + 0.97µ03 + 0.03µ13
µ13 = 1 + 0.9µ23
µ23 = 1 + 0.97µ03 + 0.03µ13

Solving these equations gives the expected number of years from (N,N) to (Y,Y).
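The three first-passage equations form a linear system A x = b with x = (µ03, µ13, µ23); a sketch solving it with plain Gauss-Jordan elimination (the solver itself is an assumed helper, not from the slides):

```python
# Rearranged equations: (1-0.97)u03 - 0.03u13 = 1; u13 - 0.9u23 = 1;
# -0.97u03 - 0.03u13 + u23 = 1.
A = [[0.03, -0.03, 0.0],
     [0.0, 1.0, -0.9],
     [-0.97, -0.03, 1.0]]
b = [1.0, 1.0, 1.0]

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

mu03, mu13, mu23 = solve(A, b)
# Starting from (N,N), two consecutive accident years take about 343 years
# on average (mu03 = 1030/3).
```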

Continuous Time Markov Chains

A stochastic process is a collection of random variables indexed by an ordered set T; generally, T records time. A discrete-time stochastic process is a sequence of random variables X0, X1, X2, . . ., typically denoted by { Xn }, where the index T or n takes countable discrete values.

1) discrete: we do not care how long each step takes
2) Markov Property: what happens next depends only on the previous state, not on how you got there.

Continuous Time Markov Chain

A continuous time Markov chain is a collection of random variables Xt, t ≥ 0, which satisfies the following.

Markovian Property
Pr{ Xt+s = j | Xu, 0 ≤ u ≤ s } = Pr{ Xt+s = j | Xs }
The future depends only on the current state Xs, not on the past Xu, 0 < u < s.

Stationary Property
Pr{ Xt+s = j | Xs } = Pr{ Xt = j | X0 }

Property of Continuous Time Markov Chain

Suppose we start at state i and have been in state i for about s minutes. What is the probability that a transition will not occur in the next t time units? By the Markovian property, looking at the inter-arrival time Ti:

Pr{ Ti > t + s | Ti > s } = Pr{ Ti > t }

There is only one random variable, the exponential, that satisfies this memoryless property.

Continuous Time Markov Chain Definition

Examples:
• Poisson process
• The birth/death processes in general
• The M/M/s queue
• Brownian motion

Illustration: the M/M/s queue.

An ATM Example (M/M/1/5 Queue)

Consider an ATM in a bank foyer. Only one person can use the machine, so a queue forms when two or more customers are present. The foyer is limited in size and can hold only five people; arriving customers balk when the foyer is full.

The average time between arrivals is 30 seconds, or 2 customers per minute, whereas the time for service averages 24 seconds, or 2.5 customers per minute. Both times follow exponential distributions. We want to answer the following questions.

An ATM Example (M/M/1/5 Queue)

Manager's questions:
a) The proportion of time that the ATM is idle?
b) The efficiency of the ATM?
c) The throughput rate of the system?
d) The average number of customers in the system?

Customer's questions:
d) The proportion of time a customer obtains immediate service?
e) The proportion of customers who find the system full?
f) The average time in the system?

CTMC Model for the ATM Example

What is the state space of the system? The number of customers in the system: {0}, {1}, {2}, {3}, {4}, {5}. (BTW, can you model this problem as a DTMC?)

States 0 through 5 form a birth-death chain: an arrival moves the system from n to n+1 at rate λ, and a service completion moves it from n+1 to n at rate µ.

Steady State Probability

What is the long-term steady-state probability? We can compute it for a CTMC based on the same Flow Balance Principle (Rate in = Rate out).

Balance Equations

For the chain 0 ↔ 1 ↔ 2 ↔ 3 ↔ 4 ↔ 5 with arrival rate λ and service rate µ:

Flow into 1 → λπ0 + µπ2 = (λ + µ)π1 ← flow out of 1
Flow into 2 → λπ1 + µπ3 = (λ + µ)π2 ← flow out of 2

Rate Matrix & Solution with M/M/1/5

Rate Matrix

States  0    1    2    3    4    5

0 0 2 0 0 0 0

1 2.5 0 2 0 0 0

2 0 2.5 0 2 0 0

3 0 0 2.5 0 2 0

4 0 0 0 2.5 0 2

5 0 0 0 0 2.5 0

Solution

State 0 State 1 State 2 State 3 State 4 State 5

0.271 0.217 0.173 0.139 0.111 0.0888
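These numbers follow from the birth-death structure: for M/M/1/5, πn is proportional to (λ/µ)^n, normalized over states 0..5. A quick check:

```python
lam, mu, K = 2.0, 2.5, 5          # arrival rate, service rate, capacity
rho = lam / mu                    # traffic intensity = 0.8

# pi_n proportional to rho**n for n = 0..K, then normalize.
weights = [rho ** n for n in range(K + 1)]
total = sum(weights)
pi = [w / total for w in weights]

assert round(pi[0], 3) == 0.271   # matches the table above
assert round(pi[5], 4) == 0.0888
```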

Solution Analysis

Manager's questions:
a) The proportion of time that the ATM is idle? π0 = 27%
b) The efficiency of the ATM? 1 − π0 = 73%
c) The throughput rate of the system? λ(1 − π5) = 1.822
d) The average number of customers in the system? 1π1 + 2π2 + 3π3 + 4π4 + 5π5 = 1.868

Customer's questions:
d) The proportion of time a customer obtains immediate service? π0 = 27%
e) The proportion of customers who find the system full? π5 = 9%
f) The average time in the system? Little's Law; see queuing.

Additional Questions

What if we add a new ATM machine; how will the system perform? M/M/2/5
What if we add two new ATM machines; how will the system perform? M/M/3/5
What if we expand the foyer so that 8 people can wait? M/M/1/8
What if we expand the foyer so that 12 people can wait? M/M/1/12
What if we add a teller with a service time exponentially distributed at 1 minute per customer? This is not a standard queue.

M/M/2/5

If we add a new ATM machine, how will the system perform? What will the state-transition network look like?

    λ    λ    λ    λ    λ
0  ↔  1  ↔  2  ↔  3  ↔  4  ↔  5
    µ   2µ   2µ   2µ   2µ

Rate Matrix & Solution with M/M/2/5

Rate Matrix

States  0    1    2    3    4    5

0 0 2 0 0 0 0

1 2.5 0 2 0 0 0

2 0 5 0 2 0 0

3 0 0 5 0 2 0

4 0 0 0 5 0 2

5 0 0 0 0 5 0

Solution

State 0 State 1 State 2 State 3 State 4 State 5

0.431 0.345 0.137 0.055 0.022 0.009

M/M/3/5

If we add two new ATM machines, how will the system perform?

    λ    λ    λ    λ    λ
0  ↔  1  ↔  2  ↔  3  ↔  4  ↔  5
    µ   2µ   3µ   3µ   3µ

Rate Matrix & Solution with M/M/3/5

Rate Matrix

States  0    1    2    3    4    5

0 0 2 0 0 0 0

1 2.5 0 2 0 0 0

2 0 5 0 2 0 0

3 0 0 7.5 0 2 0

4 0 0 0 7.5 0 2

5 0 0 0 0 7.5 0

Solution

State 0 State 1 State 2 State 3 State 4 State 5

0.448 0.359 0.143 0.038 0.010 0.0027
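All of these multi-server variants are birth-death chains, so one small solver covers them; a sketch checked against the M/M/3/5 table above (the helper name is an assumption):

```python
def birth_death_pi(births, deaths):
    """births[n] = rate n -> n+1, deaths[n] = rate n+1 -> n."""
    c, cs = 1.0, [1.0]
    for b, d in zip(births, deaths):
        c *= b / d                   # C_n = (lam_{n-1}...lam_0)/(mu_n...mu_1)
        cs.append(c)
    total = sum(cs)
    return [x / total for x in cs]

lam, mu, s, K = 2.0, 2.5, 3, 5       # M/M/3/5
births = [lam] * K
deaths = [min(n + 1, s) * mu for n in range(K)]   # 2.5, 5, 7.5, 7.5, 7.5
pi = birth_death_pi(births, deaths)

assert round(pi[0], 3) == 0.448      # matches the solution row above
```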

Comparison of Different Alternatives

                              M/M/1/5   M/M/2/5   M/M/3/5
Average time in queue          0.625     0.064     0.008
Proportion of customers
who wait                       0.64      0.2152    0.0484
Proportion of customers lost   0.0888    0.0088    0.0027

The Addition of Spaces to the Foyer

If three more spaces are added so that 8 people can wait, the system becomes M/M/1/8:

    λ    λ    λ    λ    λ
0  ↔  1  ↔  2  ↔  3  ↔  ….  ↔  8
    µ    µ    µ    µ    µ

If 12 people can wait, the system becomes M/M/1/12:

    λ    λ    λ    λ    λ
0  ↔  1  ↔  2  ↔  3  ↔  ….  ↔  12
    µ    µ    µ    µ    µ

Comparison of Different Alternatives

                              M/M/1/5   M/M/1/8   M/M/1/12
Average time in queue          0.625     0.955     1.246
Proportion of customers
who wait                       0.64      0.73      0.77
Proportion of customers lost   0.0888    0.038     0.01565

Adding a Human Server

The manager decides to add a human teller with a service time exponentially distributed at 1 minute per customer. Assume that an arriving customer prefers to go to the human server first if that server is available. In this case, how will the system perform?

Approach

Notice that you have to differentiate the two servers now. Let us use (HS, MS, # waiting) to represent the system, and suppose the foyer can hold at most 5 people:

(0,0,0), (0,1,0), (1,0,0), (1,1,0), (1,1,1), (1,1,2), (1,1,3)

Addition of a Human Server

State: (HS, MS, # waiting), with µh = 1 (human service rate) and µm = 2.5 (machine service rate). Arrivals at rate λ take the human server first: (0,0,0) → (1,0,0); with the human busy they take the machine: (1,0,0) → (1,1,0) and (0,1,0) → (1,1,0); further arrivals join the queue: (1,1,0) → (1,1,1) → (1,1,2) → (1,1,3). Service completions move the system back: at rate µm for the machine, µh for the human, and at rate µm + µh from the queueing states, where either completion admits the next waiting customer.

Addition of Human Server

Rate Matrix

States   0,0,0  0,1,0  1,0,0  1,1,0  1,1,1  1,1,2  1,1,3
0,0,0      0      0      2      0      0      0      0
0,1,0     2.5     0      0      2      0      0      0
1,0,0      1      0      0      2      0      0      0
1,1,0      0      1     2.5     0      2      0      0
1,1,1      0      0      0     3.5     0      2      0
1,1,2      0      0      0      0     3.5     0      2
1,1,3      0      0      0      0      0     3.5     0

Solution

States   0,0,0  0,1,0  1,0,0  1,1,0  1,1,1  1,1,2  1,1,3
π        0.214  0.046  0.313  0.205  0.117  0.067  0.038
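This chain is no longer a simple birth-death line, but the steady state still comes from the global balance equations π Q = 0 with Σπ = 1. A sketch (state names and the elimination solver are assumed conventions):

```python
states = ['000', '010', '100', '110', '111', '112', '113']  # (HS, MS, waiting)
rates = {  # off-diagonal transition rates: lam = 2, mu_m = 2.5, mu_h = 1
    ('000', '100'): 2.0,
    ('010', '000'): 2.5, ('010', '110'): 2.0,
    ('100', '000'): 1.0, ('100', '110'): 2.0,
    ('110', '010'): 1.0, ('110', '100'): 2.5, ('110', '111'): 2.0,
    ('111', '110'): 3.5, ('111', '112'): 2.0,
    ('112', '111'): 3.5, ('112', '113'): 2.0,
    ('113', '112'): 3.5,
}
n = len(states)
idx = {s: k for k, s in enumerate(states)}

# Generator matrix: off-diagonal entries are rates, diagonal = -(row sum).
Q = [[0.0] * n for _ in range(n)]
for (i, j), r in rates.items():
    Q[idx[i]][idx[j]] += r
    Q[idx[i]][idx[i]] -= r

# Balance: sum_i pi_i Q[i][j] = 0 for every j; replace the last equation
# with the normalization sum_i pi_i = 1, then solve by Gauss-Jordan.
A = [[Q[i][j] for i in range(n)] for j in range(n)]
A[-1] = [1.0] * n
b = [0.0] * (n - 1) + [1.0]
M = [row[:] + [bi] for row, bi in zip(A, b)]
for col in range(n):
    piv = max(range(col, n), key=lambda r: abs(M[r][col]))
    M[col], M[piv] = M[piv], M[col]
    for r in range(n):
        if r != col:
            f = M[r][col] / M[col][col]
            M[r] = [x - f * y for x, y in zip(M[r], M[col])]
pi = {s: M[idx[s]][n] / M[idx[s]][idx[s]] for s in states}
```

The result reproduces the solution row above to three decimals.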

A Queue With Finite Input Sources

A company has 6 machines and uses service bays to handle breakdowns. The time to breakdown of each machine is exponentially distributed with a breakdown rate of 1/3 per month. The company is thinking of setting up several service bays, and the estimated repair time is exponentially distributed with a rate of 4 per month. The company wants to decide how many service bays to set up.

A Queue With Finite Input Sources

One Bay:
   6×1/3  5×1/3  4×1/3  3×1/3   ...   1×1/3
0   ↔   1   ↔   2   ↔   3   ↔   ….   ↔   6
     4      4      4      4       4

Two Bays: the same breakdown rates, with repair rates 4, 8, 8, 8, 8, 8 (two bays work in parallel once two or more machines are down).

A Queue With Finite Input Sources (1 Bay)

Rate Matrix

States    0     1     2     3     4     5     6
0         0     2     0     0     0     0     0
1         4     0    1.67   0     0     0     0
2         0     4     0    1.33   0     0     0
3         0     0     4     0     1     0     0
4         0     0     0     4     0    0.67   0
5         0     0     0     0     4     0    0.33
6         0     0     0     0     0     4     0

Solution

States   0      1      2      3      4      5      6
π      0.556  0.278  0.116  0.039  0.010  0.002  0.000

A Queue With Finite Input Sources (2 Bays)

Rate Matrix

States    0     1     2     3     4     5     6
0         0     2     0     0     0     0     0
1         4     0    1.67   0     0     0     0
2         0     8     0    1.33   0     0     0
3         0     0     8     0     1     0     0
4         0     0     0     8     0    0.67   0
5         0     0     0     0     8     0    0.33
6         0     0     0     0     0     8     0

Solution

States   0      1      2      3      4      5      6
π      0.616  0.308  0.064  0.011  0.001  0.000  0.000
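Both configurations are finite-source birth-death chains, so they can be checked with the same product formula (the helper name is an assumption):

```python
def repair_pi(machines, bays, fail_rate, repair_rate):
    """Steady state of the machine-repair chain; state n = failed machines."""
    c, cs = 1.0, [1.0]
    for n in range(machines):
        birth = (machines - n) * fail_rate          # failures from state n
        death = min(n + 1, bays) * repair_rate      # repairs from state n+1
        c *= birth / death
        cs.append(c)
    total = sum(cs)
    return [x / total for x in cs]

one_bay = repair_pi(6, 1, 1 / 3, 4)
two_bay = repair_pi(6, 2, 1 / 3, 4)

assert round(one_bay[0], 3) == 0.556 and round(one_bay[1], 3) == 0.278
assert round(two_bay[0], 3) == 0.616 and round(two_bay[1], 3) == 0.308
```

Note how the birth rate shrinks with the state: with n machines already down, only 6−n machines remain that can fail.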

Economics With These Results

If each working machine generates $1200 a day, what would be the expected revenue for each configuration? In state n, 6−n machines are working, so the expected daily revenue is

7200×π0 + 6000×π1 + 4800×π2 + 3600×π3 + 2400×π4 + 1200×π5 + 0×π6 = $6630 (two bays)
7200×π0 + 6000×π1 + 4800×π2 + 3600×π3 + 2400×π4 + 1200×π5 + 0×π6 = $6392 (one bay)

If it costs $300 a day to operate a bay, would it be beneficial to the company?
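A sketch of the revenue comparison (the $1200-per-working-machine figure is the slide's; the steady-state helper below is an assumed name that recomputes the probabilities):

```python
def repair_pi(machines, bays, fail_rate, repair_rate):
    c, cs = 1.0, [1.0]
    for n in range(machines):
        c *= (machines - n) * fail_rate / (min(n + 1, bays) * repair_rate)
        cs.append(c)
    total = sum(cs)
    return [x / total for x in cs]

def daily_revenue(pi, rate=1200):
    # In state n, 6 - n machines are working, each earning $1200 per day.
    return rate * sum((6 - n) * p for n, p in enumerate(pi))

rev_one = daily_revenue(repair_pi(6, 1, 1 / 3, 4))   # about $6392
rev_two = daily_revenue(repair_pi(6, 2, 1 / 3, 4))   # about $6631
# The extra bay adds roughly $239/day of revenue, less than its $300/day cost.
```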

Probability Transitions: Service with Rework

Consider a machine operation in which there is a 0.4 probability that, on completion, a processed part will not be within tolerance. If the part is unacceptable, the operation is repeated immediately; this is called rework. Assume that the second try is always successful.

What will the system look like if
a) arrivals can occur only when the machine is idle?
b) arrivals can occur at any time?

Probability Transitions: Service with Rework

a) Arrivals can occur only when the machine is idle: three states i (idle), w (working), and rw (rework), with an arrival arc i → w at rate a, a rework arc w → rw labeled 0.4d1, and completion arcs w → i labeled 0.6d1 and rw → i labeled 0.6d2.

b) Arrivals can occur at any time: states (0,i), (n,w), and (n,r) track the number of parts together with the machine's phase. Arrival arcs at rate a move the system rightward ((0,i) → (0,w) → (1,w) → (2,w) → …… and (0,r) → (1,r) → (2,r)), arcs labeled 0.4d1 move each (n,w) up to (n,r), and completion arcs labeled 0.6d1 and 0.6d2 move the system back down.

An ATM with a Human Server

Consider an ATM located together with a human server at the foyer of a bank. The foyer is limited in size, and when there are more than five people, arriving customers will balk.

Statistics indicate that the time between arrivals is exponentially distributed with an average of 30 seconds, or 2 customers per minute. The service time of the ATM is exponentially distributed with an average of 24 seconds, or 2.5 customers per minute. The service time of the human server is exponentially distributed with an average of 1 minute per customer.

It is further assumed that when a customer enters the system, he/she prefers to go to the human server first if that server is available.


Birth & Death Process

Pure birth process (e.g., hurricanes):
0 → 1 → 2 → 3 → 4 → … with birth rates a0, a1, a2, a3, …

Pure death process:
0 ← 1 ← 2 ← 3 ← 4 ← … with death rates d1, d2, d3, d4, …

Birth & death process, as in queuing theory:
0 ↔ 1 ↔ 2 ↔ 3 ↔ 4 ↔ … with birth rates a0, a1, a2, a3, … and death rates d1, d2, d3, d4, …

Pure Birth Process – Poisson Process

0 → 1 → 2 → 3 → 4 → … with every birth rate equal to λ.

States   0   1   2   3   …   …
0        0   λ   0   0   0   0
1        0   0   λ   0   0   0
2        0   0   0   λ   0   0
3        0   0   0   0   λ   0
…        0   0   0   0   0   λ
…        0   0   0   0   0   0

Poisson Process

There is NO steady-state probability: the number in the system increases with time.

Pure Death Process

0 ← 1 ← 2 ← 3 ← 4 ← … with death rates d1, d2, d3, d4, …

The embedded Markov chain is not ergodic; the number in the system decreases with time.

An Example

You enter a bank and find five other customers in the bank, one being served and the other four waiting in line. You join the end of the line. The service times are all exponentially distributed with a mean of 5 minutes.

What is the probability that you will be served in 10 minutes?
What is the expected waiting time before you are served?
What is the probability that you will be served in 20 minutes?

Assumption Revisited

Markov Property:
Inter-arrival and service times have to be exponentially distributed.

Steady-State Probability:
Flow balance (Rate in = Rate out); solve a set of linear equations.

Why exponential arrivals? Consider a large population of n people, each with a small probability p of entering the store. When n is large and p is small, the exponential is a good approximation.

Assumption Revisited

Service time: exponential? A haircut might not be exponential; then the Markovian property no longer holds, and we might not be able to use the rate in = rate out principle. It becomes much more difficult to get analytical results, and a lot of the time simulation will have to be used.

A Machine Repair Example

A factory contains two major machines which fail independently according to an exponential distribution with a mean time to failure of 10 hours. There is a single repairman, and the repair time is exponentially distributed with a mean of 8 hours. Model the problem as a CTMC or a queuing model and give analytic results.

State-Transition Diagram

λ = rate at which a single machine breaks down = 1/10 per hr
µ = rate at which a machine is repaired = 1/8 per hr

State-transition diagram (states = number of failed machines):

    2λ      λ
0   ↔   1   ↔   2
    µ       µ

Balance Equations for Repair Example

µπ1 = 2λπ0
λπ1 = µπ2

We can solve these balance equations for π0, π1, and π2, but in this case we can simply use the formulas that solve general birth-and-death equations:

Cn = (λn-1 ··· λ0) / (µn ··· µ1),    πn = Cnπ0,    π0 = 1 / Σn Cn

Here, λ0 = 2λ, λ1 = λ, µ1 = µ, µ2 = µ.

C1 = λ0/µ1 = 2λ/µ        C2 = λ1λ0/(µ2µ1) = 2λ²/µ²

π0 = 1 / (1 + 2λ/µ + 2λ²/µ²) = 0.258
π1 = (2λ/µ) π0 = 0.412
π2 = (2λ²/µ²) π0 = 0.330

With L = 1.072 (expected number of failed machines), Lq = 0.33 (expected number waiting for repair), and effective breakdown rate λ̄ = 0.0928:

W  = L / λ̄  = (1.072) / 0.0928 = 11.55 hours
Wq = Lq / λ̄ = (0.33) / 0.0928  = 3.56 hours

W is the average amount of time that a machine has to wait to be repaired, including the time until the repairman initiates the work. Wq is the average amount of time that a machine has to wait until the repairman initiates the work.

Proportion of time the repairman is busy = π1 + π2 = 0.742

Proportion of time that machine #1 is working = π0 + (1/2)π1 = 0.258 + (1/2)(0.412) = 0.464
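The whole repair example can be recomputed from λ = 1/10 and µ = 1/8 in a few lines (a check, assuming λ̄ = Σ λn πn for the effective breakdown rate):

```python
lam, mu = 0.1, 0.125                      # breakdown and repair rates per hour
C1 = 2 * lam / mu                         # = 1.6
C2 = 2 * lam**2 / mu**2                   # = 1.28
total = 1 + C1 + C2
pi = [1 / total, C1 / total, C2 / total]  # (0.258, 0.412, 0.330)

L = pi[1] + 2 * pi[2]                     # mean number of failed machines
Lq = pi[2]                                # mean number waiting for repair
lam_bar = 2 * lam * pi[0] + lam * pi[1]   # effective breakdown rate = 0.0928
W, Wq = L / lam_bar, Lq / lam_bar         # Little's law

# W is about 11.56 hours (the slide rounds to 11.55); Wq is about 3.56 hours.
```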
