
Operational Research

Prabhat Mittal
profmittal@yahoo.co.in


Markov Processes


Introduction
A Markov process (chain) is a stochastic (probabilistic) process with the property that the probability of a transition from a given state to any future state depends only on the present state and not on the manner in which it was reached. Markov processes have become a versatile tool for solving management problems, especially in the area of marketing.

Management Applications

It is widely used in examining and predicting the behavior of consumers in terms of their brand loyalty and their switching patterns to other brands. Markov processes have also been employed in the study of equipment maintenance and failure problems, and in analyzing which accounts receivable will ultimately become bad debts. They are also useful in the study of stock market price movements.


Basic concepts of Markov Process

A Markov process is a sequence of n experiments in which each experiment has m possible outcomes a1, a2, ..., am. Each individual outcome is called a state, and the probability that a particular outcome occurs depends only on the outcome of the preceding experiment. The probability of moving from one state to another, or remaining in the same state, in a single time period is called a transition probability. Because the probability of moving to a state depends on what happened in the preceding state, the transition probability is a conditional probability.
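The defining property above — the next state depends only on the current state, not on earlier history — can be illustrated with a small simulation. This sketch is not from the slides; the two-state chain and its probabilities are hypothetical:

```python
import random

# Hypothetical two-state chain: each row of transition probabilities
# depends only on the current state, never on the earlier path.
P = {"A": {"A": 0.8, "B": 0.2},
     "B": {"A": 0.6, "B": 0.4}}

def next_state(state, rng):
    """Sample the next state from the current state's row of P."""
    r = rng.random()
    cum = 0.0
    for candidate, p in P[state].items():
        cum += p
        if r < cum:
            return candidate
    return candidate  # guard against floating-point round-off

rng = random.Random(42)
chain = ["A"]
for _ in range(10):
    chain.append(next_state(chain[-1], rng))
print(chain)  # a length-11 sequence of "A"s and "B"s
```

Running the chain for many steps and counting visits to each state gives an empirical estimate of the steady-state probabilities discussed later in the deck.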


Transition Matrix

The transition probabilities between the states a1, a2, ..., am can be arranged in matrix form; such a matrix is called a one-step transition probability matrix, denoted by P:

             a1    a2   ...   am
       a1 [ p11   p12   ...  p1m ]
P =    a2 [ p21   p22   ...  p2m ]
        .     .     .          .
       am [ pm1   pm2   ...  pmm ]

The matrix P is a square matrix whose elements are non-negative probabilities, 0 <= pij <= 1, and the sum of the elements of each row is unity:

       sum over j = 1, ..., m of pij = 1,   for i = 1, 2, ..., m


Transition Matrix ctd.

In general, any matrix P whose elements are non-negative and whose elements sum to one in each row (or, in some conventions, in each column) is called a transition matrix, a stochastic matrix, or a probability matrix. A transition matrix is a square stochastic matrix (number of rows = number of columns), and it gives a complete description of the Markov process.
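These two conditions — non-negative entries and unit row sums — are easy to verify numerically. A minimal sketch (not from the slides) using NumPy:

```python
import numpy as np

def is_stochastic(P):
    """True if P is square, all entries lie in [0, 1],
    and every row sums to one."""
    P = np.asarray(P, dtype=float)
    return bool(P.ndim == 2 and P.shape[0] == P.shape[1]
                and np.all(P >= 0) and np.all(P <= 1)
                and np.allclose(P.sum(axis=1), 1.0))

# The 3-state matrix T used on the transition-diagram slide:
T = [[0.7, 0.3, 0.0],
     [0.3, 0.4, 0.3],
     [0.0, 0.3, 0.7]]
print(is_stochastic(T))             # True
print(is_stochastic([[0.5, 0.6],    # rows do not sum to one,
                     [0.2, 0.8]]))  # so this prints False
```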


Transition Diagram

The transition probabilities can also be represented by two types of diagrams. Consider the transition matrix

            x1   x2   x3
      x1 [ 0.7  0.3  0   ]
T =   x2 [ 0.3  0.4  0.3 ]
      x3 [ 0    0.3  0.7 ]

A transition diagram shows the transition probabilities, or shifts, that can occur in any particular situation: the states x1, x2, x3 are drawn as nodes, and each arrow is labelled with a transition probability (x1 to x1: 0.7, x1 to x2: 0.3; x2 to x1: 0.3, x2 to x2: 0.4, x2 to x3: 0.3; x3 to x2: 0.3, x3 to x3: 0.7). The arrows from each state indicate the possible states to which the process can move from the given state. A zero element in the matrix indicates that the transition is impossible, so there is no arrow between x1 and x3.

Probability tree diagram


A probability tree diagram shows the same transitions as branches from each starting state. For the matrix T above:

from x1: to x1 (0.7), to x2 (0.3), to x3 (0)
from x2: to x1 (0.3), to x2 (0.4), to x3 (0.3)
from x3: to x1 (0), to x2 (0.3), to x3 (0.7)


Example-I
Two manufacturers, A and B, are competing with each other in a restricted market. Over the years, A's customers have exhibited a high degree of loyalty, as measured by the fact that customers using A's product buy it again 80 percent of the time. Also, former customers purchasing the product from B have switched back to A's product 60 percent of the time.

(a) Construct and interpret the state transition matrix in terms of retention and loss, and retention and gain.
(b) Calculate the probability of a customer purchasing A's product at the end of the second period.

Example 1 Solution
Transition matrix (rows: present purchase, n = 0; columns: next purchase, n = 1):

           A      B
     A [ 0.80   0.20 ]
     B [ 0.60   0.40 ]

P(A1|A0) = 0.80 = probability that a customer using A at present will use A in the future (retention).
P(B1|A0) = 0.20 = probability that a customer using A at present will use B in the future (loss).
Read along a row, the entries show retention and loss; read down a column, retention and gain.

Transition diagram: A to A: 0.8, A to B: 0.2, B to A: 0.6, B to B: 0.4.

Probability tree diagram:
from A: to A (0.8), to B (0.2)
from B: to A (0.6), to B (0.4)

Example 1 Solution
Transition matrix after the 2nd stage (rows: present purchase, n = 0; columns: next purchase, n = 2):

                  A      B
P² = P·P =  A [ 0.76   0.24 ]
            B [ 0.72   0.28 ]

Transition diagram: A to A: 0.76, A to B: 0.24, B to A: 0.72, B to B: 0.28.

Probability tree diagram: from A, the first-stage branches go to A (0.8) and B (0.2); each of these branches again with A's row (0.8, 0.2) or B's row (0.6, 0.4). Multiplying along branches and adding gives P(A2|A0) = 0.8 × 0.8 + 0.2 × 0.6 = 0.76, the required probability that a customer purchasing A's product now will purchase it at the end of the second period.
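The second-stage matrix can be checked with a few lines of NumPy (a sketch, not part of the original deck):

```python
import numpy as np

# Brand-switching matrix from Example I (rows/columns ordered A, B).
P = np.array([[0.80, 0.20],
              [0.60, 0.40]])

# Two-step transition probabilities: the matrix product P.P
P2 = P @ P
print(np.round(P2, 2))
# [[0.76 0.24]
#  [0.72 0.28]]
```

The entry in row A, column A (0.76) is the answer to part (b) of the example.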

Steady State (Equilibrium) Conditions

Given that the process being modeled as a Markov process has certain properties, it is possible to analyze its long-run behavior and determine the probabilities of outcomes after steady-state conditions have been reached. For example, after the Markov process has been in operation for a long time, a given outcome will tend to occur a fixed percentage of the time. However, for a Markov chain to reach steady-state conditions, the chain must be ergodic and regular.


Steady State (Equilibrium) Conditions

An ergodic Markov chain has the property that it is possible to go from any state to any other state in a finite number of steps, regardless of the present state. A regular chain is a special type of ergodic chain: it is defined as a chain whose transition matrix P, for some power of P, has only non-zero (positive) probabilities. Thus all regular chains are ergodic, but not all ergodic chains are regular. The easiest way to check whether an ergodic chain is regular is to continue squaring the transition matrix P until all zeros disappear; if they never do, the chain is not regular.
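The squaring test described above can be sketched in NumPy. This is an illustrative sketch, not from the slides; in particular the cut-off of ten squarings is an assumption:

```python
import numpy as np

def is_regular(P, max_squarings=10):
    """Square P repeatedly; the chain is regular if some power of P
    has only positive entries. The cut-off is a practical assumption:
    if zeros survive that many squarings, report not regular."""
    Q = np.asarray(P, dtype=float)
    for _ in range(max_squarings):
        if np.all(Q > 0):
            return True
        Q = Q @ Q
    return bool(np.all(Q > 0))

# A chain whose first power is already all-positive is regular:
print(is_regular([[0.5, 0.5],
                  [0.3, 0.7]]))   # True

# A periodic chain keeps zeros at every power, so it is not regular:
print(is_regular([[0.0, 1.0],
                  [1.0, 0.0]]))   # False
```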


Example-II
Determine whether the following transition matrix represents an ergodic Markov chain (rows: present state; columns: future state):

             1     2     3     4
       1 [ 1/3   1/3    0    1/3 ]
P =    2 [  0     0    1/3   2/3 ]
       3 [  1     0     0     0  ]
       4 [  0     0     1     0  ]


Example-II (Solution)
Check whether it is possible to go from each state to every other state and back. From state 1, it is possible to go directly to every other state except state 3; state 3 can be reached via state 2 (1 to 2 to 3). Therefore, it is possible to go from state 1 to any other state. Similarly, from state 2 it is possible to go to states 3 and 4 but not directly to 1; state 1 is reached via state 3 (2 to 3 to 1). From state 3, state 1 can be approached directly. Finally, from state 4 it is possible to go to state 3, and then from state 3 to state 1. Hence the above transition matrix is an ergodic Markov chain, since it is possible to go from state 1 to every other state, and from every other state back to state 1.
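This reachability argument can be automated with a transitive-closure check on the zero pattern of P. The sketch below is not from the slides; note that only which entries are positive matters, not their exact values:

```python
import numpy as np

def is_ergodic(P):
    """True if every state can reach every other state (the chain is
    irreducible). Works on the zero pattern of P: accumulate
    reachability over 1..n steps."""
    A = (np.asarray(P) > 0).astype(int)   # adjacency: pij > 0
    n = len(A)
    reach = A.copy()
    power = A.copy()
    for _ in range(n - 1):
        power = (power @ A > 0).astype(int)       # reachable in k+1 steps
        reach = ((reach + power) > 0).astype(int)  # union over step counts
    return bool(reach.min() == 1)

# Zero pattern consistent with the argument in Example II:
P = [[1/3, 1/3, 0, 1/3],
     [0,   0, 1/3, 2/3],
     [1,   0,   0,   0],
     [0,   0,   1,   0]]
print(is_ergodic(P))             # True

# An absorbing state breaks ergodicity:
print(is_ergodic([[1.0, 0.0],
                  [0.5, 0.5]]))  # False
```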

Example-III
Determine whether the following transition matrix represents an ergodic and a regular Markov chain; x denotes some positive probability (rows: present state; columns: future state):

             1   2   3   4
       1 [   0   x   x   0 ]
P =    2 [   x   0   0   x ]
       3 [   x   0   0   x ]
       4 [   0   x   x   0 ]


Example-III (Solution)
Squaring the matrix repeatedly gives:

        0  x  x  0                x  0  0  x
P =     x  0  0  x        P² =    0  x  x  0
        x  0  0  x                0  x  x  0
        0  x  x  0                x  0  0  x

P⁴ and P⁸ have the same zero pattern as P². Note that P raised to an even-numbered power gives the pattern shown for P², while P raised to an odd-numbered power gives the pattern of the original matrix. Since the elements never all become non-zero positive elements, the given matrix is not regular. But it is ergodic, since it is possible to go from state 1 to state 2 or state 3, from state 2 to state 1 or state 4, from state 3 to state 1, and from state 4 to state 2 (and from state 2 to state 1).

Example-IV
The number of units of an item withdrawn from inventory on a day-to-day basis is a Markov chain process in which requirements for tomorrow depend on today's requirements.
               Tomorrow
               5     10    12
         5 [ 0.6   0.4   0.0 ]
Today   10 [ 0.3   0.3   0.4 ]
        12 [ 0.1   0.3   0.6 ]

a. Develop a two-day transition matrix.
b. Comment on how a two-day transition matrix might be helpful to a manager who is responsible for inventory management.

Example-IV (Solution)
Two-day transition matrix:

                        Tomorrow
                      5      10     12
               5 [ 0.48   0.36   0.16 ]
P(2) = P·P =  10 [ 0.31   0.33   0.36 ]
              12 [ 0.21   0.31   0.48 ]

Imagine that each morning a manager must place an order for inventory replenishment. As a result of delivery-time requirements, an order placed today arrives two days later. The two-day transition matrix can therefore be used to guide ordering decisions. For example, if today the manager experiences a demand for 5 units, then two days later (when replenishment stock arrives in response to today's order), the probability of needing five units is 0.48, ten units is 0.36, and twelve units is 0.16.
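The two-day matrix and the demand probabilities read off from it can be reproduced with NumPy (a sketch, not part of the deck):

```python
import numpy as np

# Daily demand transition matrix from Example IV
# (states ordered 5, 10, 12 units).
P = np.array([[0.6, 0.4, 0.0],
              [0.3, 0.3, 0.4],
              [0.1, 0.3, 0.6]])

# Two-day transition probabilities.
P2 = np.linalg.matrix_power(P, 2)
print(np.round(P2, 2))

# Demand distribution two days after a 5-unit day (first row of P2):
states = [5, 10, 12]
for s, p in zip(states, P2[0]):
    print(f"P(demand = {s:2d}) = {p:.2f}")
```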

Example-V
A manufacturing company has a certain piece of equipment that is inspected at the end of each day and classified as just overhauled, good, fair, or inoperative. If the item is inoperative it is overhauled, a procedure that takes one day. Denote the four classifications as states 1, 2, 3 and 4 respectively, and assume that the working condition of the equipment follows a Markov process with the following transition matrix.
                   Tomorrow
                1     2     3     4
          1 [   0   3/4   1/4    0  ]
Today     2 [   0   1/2   1/2    0  ]
          3 [   0    0    1/2   1/2 ]
          4 [   1    0     0     0  ]

It costs Rs.125, on average, to overhaul a machine, and Rs.75 in production is lost if a machine is found inoperative. Using the steady-state probabilities, compute the expected per-day cost of maintenance.

Example-V (Solution)
The given matrix P represents an ergodic, regular Markov process, so it will reach a steady-state equilibrium. Let p1, p2, p3 and p4 be the steady-state probabilities, representing the proportions of time that the machine will be in states 1, 2, 3 and 4 respectively. Using the steady-state equation R = RP:

                                       0   3/4   1/4    0
(p1, p2, p3, p4) = (p1, p2, p3, p4)    0   1/2   1/2    0
                                       0    0    1/2   1/2
                                       1    0     0     0


Example-V (Solution)
Finding the steady-state probabilities requires solving the simultaneous equations
p1 = p4
p2 = (3/4)p1 + (1/2)p2
p3 = (1/4)p1 + (1/2)p2 + (1/2)p3
p4 = (1/2)p3
together with p1 + p2 + p3 + p4 = 1. Solving these equations, we obtain the steady-state probabilities p1 = 2/11, p2 = 3/11, p3 = 4/11 and p4 = 2/11. Thus, on average, two out of every 11 days the machine will be overhauled, three out of every 11 days it will be in good condition, four out of every 11 days it will be in fair condition, and two out of every 11 days it will be found inoperative at the end of the day. Hence the average per-day cost of maintenance will be (2/11)(125) + (2/11)(75) = Rs.36.36.
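The same steady-state vector can be obtained numerically. The sketch below is not part of the deck, and it uses a transition matrix consistent with the steady-state equations in the solution; treat the fractional entries as an assumption:

```python
import numpy as np

# States: 1 just overhauled, 2 good, 3 fair, 4 inoperative.
# Entries are assumptions consistent with the worked solution.
P = np.array([[0, 3/4, 1/4, 0],
              [0, 1/2, 1/2, 0],
              [0, 0,   1/2, 1/2],
              [1, 0,   0,   0]])

# Solve pi = pi.P together with sum(pi) = 1, written as an
# over-determined linear system: stack (P^T - I) with a row of ones.
n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.round(pi, 4))           # ~ [2/11, 3/11, 4/11, 2/11]
cost = pi[0] * 125 + pi[3] * 75  # overhaul cost + lost production
print(round(cost, 2))            # expected per-day cost, ~ 36.36
```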

Excel Worksheet


Example-VI
There are three dairies, A, B and C, in a small town which supply all the milk consumed in the town. Assume that the initial consumer sample is composed of 1000 respondents distributed over the three dairies A, B and C. It is known by all the dairies that consumers switch from one dairy to another due to advertising, price and dissatisfaction. All these dairies maintain records of the number of their customers and the dairy from which each new customer was obtained. The following table illustrates the flow of customers over an observation period of one month. Assume that the matrix of transition probabilities remains fairly stable and that at the beginning of period one the market shares are A = 25%, B = 45% and C = 30%. Construct the state transition probability matrix to analyze the problem.
Dairy   Period one          Changes during the period    Period two
        No. of customers      Gain         Loss          No. of customers
A            250               62           50                262
B            450               53           60                443
C            300               50           55                295

Example-VI (Solution)
To determine the retention probabilities (Probability to remain in the same state), the customers retained for the period under review are divided by the number of customers at the beginning of the period.
Dairy   Period one          Number    Number      Probability of
        No. of customers     lost     retained    retention
A            250              50        200       200/250 = 0.80
B            450              60        390       390/450 = 0.866
C            300              55        245       245/300 = 0.816

Example-VI (Solution) ctd.

To determine the gain and loss probabilities, it is necessary to show the gains and losses among the dairies in order to complete the matrix of transition probabilities.

Dairy   Period one             Gain from           Loss to        Period two
        No. of customers     A    B    C        A    B    C       No. of customers
A            250             0   35   27        0   25   25            262
B            450            25    0   28       35    0   25            443
C            300            25   25    0       27   28    0            295

Example-VI (Solution) ctd.

The first row shows that dairy A retains 80% of its customers, loses 10% (25 customers) to dairy B and loses 10% (25) to dairy C. The second row shows that dairy B loses 8% to dairy A, retains 86.6% of its own customers and loses 5.5% to dairy C. The third row shows that dairy C loses 9% to dairy A, 9.3% to dairy B and retains 81.6% of its own customers.

Matrix of transition probabilities (rows: retention and loss; columns: retention and gain):

                 A                   B                   C
     A   200/250 = 0.80      25/250 = 0.10       25/250 = 0.10
     B    35/450 = 0.08     390/450 = 0.866      25/450 = 0.055
     C    27/300 = 0.09      28/300 = 0.093     245/300 = 0.816
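Dividing each row of the customer-flow counts by that dairy's period-one total reproduces the matrix above (a NumPy sketch, not part of the deck):

```python
import numpy as np

# Customer flows from Example VI: rows = dairy in period one,
# columns = dairy in period two. Diagonal entries are retained
# customers; off-diagonal entries are customers lost to the
# column dairy.
flows = np.array([[200, 25, 25],    # A: retained 200, lost 25 to B, 25 to C
                  [35, 390, 25],    # B
                  [27, 28, 245]])   # C

# Divide each row by its period-one total (250, 450, 300).
P = flows / flows.sum(axis=1, keepdims=True)
print(np.round(P, 3))
```

Each row of the resulting matrix sums to one, as a transition matrix requires.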
