Question

A player plays a game in which, during each round, he has probability 0.45 of winning $1 and probability 0.55 of losing $1. These probabilities do not change from round to round, and the outcomes of rounds are independent. The game stops when the player either loses all his money or amasses a fortune of $M. Assume M = 4 and that the player starts the game with $2.

(a) Model the player's wealth as a Markov chain and construct the probability transition matrix.

(b) What is the probability that the player goes broke after 2 rounds of play?
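As a sanity check on the setup, here is a minimal Python sketch (assuming NumPy is available; the state labeling and variable names are my own choices) that builds the transition matrix for the chain described above and reads off the two-step probability of ruin:

```python
import numpy as np

# States are the player's wealth: $0, $1, $2, $3, $4 (M = 4).
# $0 (broke) and $4 (fortune reached) are absorbing states.
p, q, M = 0.45, 0.55, 4

P = np.zeros((M + 1, M + 1))
P[0, 0] = 1.0          # broke stays broke
P[M, M] = 1.0          # fortune reached, game over
for i in range(1, M):
    P[i, i + 1] = p    # win $1
    P[i, i - 1] = q    # lose $1

# (b) Probability of being broke after 2 rounds, starting from $2:
# the (2, 0) entry of P squared. Starting from $2, the only way to
# go broke in 2 rounds is to lose twice, so this equals q**2.
P2 = np.linalg.matrix_power(P, 2)
print(P2[2, 0])        # 0.3025 = 0.55 ** 2
```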
