
EECS 501 F11 Discussion 12 12/12,13

Markov Chains

A discrete-time random process (a set of RVs X_0, X_1, X_2, ...) with the Markov Property is a Markov Chain. The Markov Property is:

P(X_{n+1} = j | X_n = i, X_{n-1} = l_{n-1}, X_{n-2} = l_{n-2}, ..., X_0 = l_0) = P(X_{n+1} = j | X_n = i) = p_{ij}.

These p_{ij} elements define a state transition matrix, often denoted P. Also,

P(X_{n+m} = j | X_n = i) = (P^m)_{ij}.

Definitions on States

A state k is absorbing if p_{kk} = 1 and p_{kj} = 0 for j ≠ k. This means that once in this state, the process stays in this state forever. Let q_i denote the probability that, if now in state i, the process will later revisit state i. A state k is recurrent if q_k = 1 and transient if q_k < 1.

Definitions on Groups of States

State j is accessible from state i if there is a nonzero probability of eventually reaching state j from i. Two states i, j communicate if i is accessible from j and vice-versa. Communication partitions a Markov Chain into disjoint sets (equivalence classes) in which all members communicate with each other, but not with any state outside the set. An equivalence class is closed if no state outside the class is accessible from any state in the class. A Markov Chain is irreducible if there is only one equivalence class, meaning every state is accessible from every other state. A Markov Chain is indecomposable if there is no more than one closed equivalence class.

Definitions on Probabilities

An M-state chain has a steady state probability distribution (w_1, w_2, ..., w_M) if

lim_{n→∞} (P^n)_{ij} = w_j,  for all i, j.
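The relation P(X_{n+m} = j | X_n = i) = (P^m)_{ij} can be checked with a small sketch. The two-state matrix P below is an assumed example, not one from the handout:

```python
# Sketch: m-step transition probabilities via matrix powers.
# P is a hypothetical two-state transition matrix used only for illustration.

def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(P, m):
    """Compute P^m by repeated multiplication, starting from the identity."""
    n = len(P)
    R = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(m):
        R = matmul(R, P)
    return R

P = [[0.9, 0.1],
     [0.5, 0.5]]

# P(X_{n+2} = j | X_n = i) = (P^2)_{ij}; entry (0,0) is 0.9*0.9 + 0.1*0.5.
P2 = matpow(P, 2)
print(P2[0][0])  # 0.86
```

Note that each row of P^m still sums to 1, since P^m is itself a valid transition matrix.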

A distribution (v_1, v_2, ..., v_M) is a stationary probability distribution if

v_i ≥ 0 for all i,  sum_{i=1}^{M} v_i = 1,  P^T v = v.
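For an ergodic chain, a stationary distribution can be found by iterating v ← vP from any starting distribution until it stops changing; a minimal sketch, using the same kind of assumed two-state matrix as above:

```python
# Sketch: finding a stationary distribution by fixed-point iteration v <- vP.
# P is a hypothetical example matrix; for this chain the exact answer is
# v = (5/6, 1/6), which satisfies P^T v = v with sum(v) = 1.

def step(v, P):
    """One update of the row vector v by the transition matrix P."""
    n = len(P)
    return [sum(v[i] * P[i][j] for i in range(n)) for j in range(n)]

P = [[0.9, 0.1],
     [0.5, 0.5]]

v = [1.0, 0.0]          # any initial distribution works for an ergodic chain
for _ in range(200):    # iterate until (numerically) stationary
    v = step(v, P)

print(v)                # close to [5/6, 1/6]
```

The iteration converges because the influence of the initial distribution decays geometrically for an ergodic chain.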

If P^N has an all-positive column for some N, the Markov Chain with state transition matrix P is ergodic. If a chain has a steady state probability distribution w, then w is its unique stationary probability distribution, and the chain is indecomposable. A chain has a steady state probability distribution iff it is ergodic. A finite chain will always have a stationary probability distribution, but it is unique iff the chain is indecomposable.
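The positive-column criterion can be checked directly. In the sketch below (the 3-state matrix P is an assumed example, not from the handout), P itself has no all-positive column, but P^2 does, so the chain is ergodic:

```python
# Sketch of the ergodicity criterion: look for a strictly positive column
# in some power of P. The 3-state P below is a hypothetical example.

def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def has_positive_column(M):
    """True if some column of M is strictly positive."""
    n = len(M)
    return any(all(M[i][j] > 0 for i in range(n)) for j in range(n))

P = [[0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0],
     [1/3, 1/3, 1/3]]

print(has_positive_column(P))             # False
print(has_positive_column(matmul(P, P)))  # True -> chain is ergodic
```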

Problem 1
A gambler playing roulette makes a series of one dollar bets. He has respective probabilities 9/19 and 10/19 of winning and losing each bet. The gambler decides to quit playing as soon as his net winnings reach 25 dollars or his net losses reach 10 dollars.
a. Find the probability that when he quits playing he will have won 25 dollars.
b. Find his expected loss.

Problem 2
Consider a rat in a maze with four cells indexed 1, 2, 3, 4, and an exit cell indexed by 0. The rat starts initially in a given cell and then takes a move to another cell, continuing to do so until it reaches freedom. Assume at each move, the rat is equally likely to choose from the neighboring cells, independently of the past.
[Figure: the maze, a 2-by-2 grid of cells arranged 1 3 over 2 4, with the exit below.]

a. Construct a Markov chain for this process, and give the matrix of transition probabilities.
b. Given that the rat starts initially in cell i, determine the expected number of moves required until the rat reaches freedom.

Problem 3
Consider a Markov chain provided by the Ehrenfest model of diffusion. Two containers of the same volume contain a total of d = 3 molecules of gas. At each time n = 0, 1, ... a molecule is selected at random from the d molecules and then transferred to the other container.
a. Find the matrix of transition probabilities.
b. Find the stationary distribution.

Problem 4
A queueing system at the door of a club works as follows. At each time n the doorman lets a customer into the club only if at least 3 customers are waiting in line. However, due to noise regulations in the neighborhood, the line cannot be longer than 6 people, so customers are (kindly) asked not to enter the line if this is the case and are sent away. We assume that at each time n either no customer arrives, with probability 1 - p, or one customer arrives, with probability p, and arrivals are independent from one time to the next.
a. Let X_n denote the queue length at time n. Construct a Markov Chain for this process and give the matrix of transition probabilities.
b. Partition the state space into equivalence classes of communicating states and identify the transient states and the recurrent states.

Problem 5
Given a Markov Chain {X_n}_{n=0}^∞, define T_1(j) = min{n ≥ 1 : X_n = j} and T_k(j) = min{n > T_{k-1}(j) : X_n = j} for k > 1. That is, T_k(j) is the k-th entrance time of state j. Show that

P(T_2(j) = 5 | T_1(j) = 2, X_0 = i) = P(T_1(j) = 3 | X_0 = j).
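As a numerical sanity check for Problem 1, the chain is a biased random walk with two absorbing barriers, so the standard gambler's-ruin closed form applies (shifting the walk to start at 10 on {0, ..., 35}). This is a check sketch, not the derivation intended for discussion:

```python
# Sketch: gambler's-ruin absorption probability for Problem 1.
# Exact rational arithmetic via the standard library's fractions module.
from fractions import Fraction

p = Fraction(9, 19)    # probability of winning a bet (from the problem)
q = Fraction(10, 19)   # probability of losing a bet
r = q / p              # = 10/9

# Net winnings start at 0, stop at +25 or -10; shift so the walk starts
# at 10 on {0, ..., 35}, absorbing at 0 (ruin) and 35 (win 25 dollars).
start, top = 10, 35
p_win = (1 - r**start) / (1 - r**top)

print(float(p_win))                           # roughly 0.048
print(float(25 * p_win - 10 * (1 - p_win)))   # expected net winnings (negative)
```

The expected net winnings come out negative, consistent with part b asking for an expected loss.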
