CHAPTER 16
Markov Analysis

TRUE/FALSE

16.1 The matrix of transition probabilities shows the likelihood that the system will change from one time period to the next.

16.2 In the matrix of transition probabilities, Pij is the conditional probability of being in state i in the future, given the current state j.

16.3 π(3) = π(1)PP, where P is a matrix of transition probabilities.

16.4 The probabilities in any column of the matrix of transition probabilities will always sum to one.

16.5 The vector of state probabilities for any period is equal to the vector of state probabilities for the preceding period multiplied by the matrix of transition probabilities.

16.6 An equilibrium condition exists if the state probabilities for a future period are the same as the state probabilities for a previous period.

16.7 Equilibrium state probabilities may be estimated by using Markov analysis for a large number of periods.

16.8 The fundamental matrix is a partition of the matrix of transition probabilities.

16.9 When absorbing states exist, the fundamental matrix is used to compute equilibrium conditions.

16.10 For any absorbing state, the probability that a state will remain unchanged in the future is one.

16.11 The four basic assumptions of Markov analysis are:
1. There is a limited or finite number of possible states.
2. The probability of changing states remains the same over time.
3. A future state is predictable from the previous state and the matrix of transition probabilities.
4. The size and makeup of the system remain constant during the analysis.

16.12 In Markov analysis, states must be collectively exhaustive and mutually exclusive.

16.13 π(n+1) = π(n)P

16.14 In Markov analysis, the row elements of the transition matrix must sum to 1.

16.15 Once in an absorbing state, always in an absorbing state.

16.16 π(i) is called the vector of change probabilities for period i.

16.17 π(n+1) = Pπ(n)

16.18 In Markov analysis, if we know the present state vector and the transition matrix, we can determine previous states.
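Several of the statements above (e.g., 16.5 and 16.13) turn on the identity π(n+1) = π(n)P. As an illustration only (not part of the original test bank), a minimal sketch with a hypothetical two-state transition matrix, showing that multiplying the state vector by P advances it one period:

```python
# Hypothetical two-state chain (say, states "working" and "not working").
# Row i of P gives the probabilities of moving from state i to each
# state in the next period; each row sums to 1.
P = [[0.8, 0.2],
     [0.1, 0.9]]

def next_period(pi, P):
    """Return pi(n+1) = pi(n) P (vector-matrix product)."""
    return [sum(pi[i] * P[i][j] for i in range(len(pi)))
            for j in range(len(P[0]))]

pi0 = [1.0, 0.0]           # in state 0 with certainty today
pi1 = next_period(pi0, P)  # state probabilities one period ahead
pi2 = next_period(pi1, P)  # two periods ahead, i.e. pi(0) P P
```

Applying `next_period` twice is exactly the π(2) = π(0)PP relationship of item 16.3.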
16.19 Once a Markov process is in equilibrium, it stays in equilibrium.

16.20 In Markov analysis, initial-state probability values determine equilibrium conditions.

*16.21 Markov analysis assumes that there is a limited number of states in the system.

*16.22 Markov analysis assumes that while a member of one state may move to a different state over time, the overall makeup of the system will remain the same.

*16.23 The vector of state probabilities gives the probability of being in a particular state at a particular point in time.

*16.24 The matrix of transition probabilities gives the probability of moving from one state to another.

*16.25 Markov analysis helps us determine the likelihood of moving from state i to state j, but provides no information as to how we arrived in state i.

*16.26 A Markov process could be used as a model of how a disease progresses from one set of symptoms to another.

*16.27 A Markov process could be used as a model within which to view the progress of students from one grade level to another in a college system.

*16.28 A Markov model could be used to help one understand the reasons for the population shifts taking place in the world today.

*16.29 One of the problems with using a Markov model to study population shifts is that we must assume that the reasons for moving from one state to another remain the same over time.

*16.30 All Markov models have an equilibrium state.

*16.31 For most problems, the state probabilities at equilibrium are 0.333 and 0.667.

MULTIPLE CHOICE

16.32 Markov analysis is a technique that deals with the probabilities of future occurrences by
(a) using the simplex solution method.
(b) analyzing presently known probabilities.
(c) statistical sampling.
(d) the minimal spanning tree.
(e) none of the above

16.33 Markov analysis might be effectively used for
(a) market share analysis.
(b) university enrollment predictions.
(c) machine breakdowns.
(d) all of the above

16.34 The following is an assumption of Markov analysis:
(a) there is a finite number of possible states
(b) the probability of changing states remains the same
(c) we can predict any future state from the previous state and the matrix of transition probabilities
(d) the size and composition of the system remain constant
(e) all of the above

16.35 In Markov analysis, the likelihood that any system will change from one period to the next is revealed by the
(a) cross-elasticities.
(b) fundamental matrix.
(c) matrix of transition probabilities.
(d) vector of state probabilities.
(e) state of technology.

16.36 Markov analysis assumes that conditions are both
(a) complementary and collectively exhaustive.
(b) collectively dependent and complementary.
(c) collectively dependent and mutually exclusive.
(d) collectively exhaustive and mutually exclusive.
(e) complementary and mutually exclusive.

16.37 Occasionally, a state is entered that will not allow going to any other state in the future. This is called
(a) status quo.
(b) stability dependency.
(c) market saturation.
(d) incidental mobility.
(e) an absorbing state.

16.38 A collection of all state probabilities for a given system at any given period of time is called the
(a) transition probabilities.
(b) vector of state probabilities.
(c) fundamental matrix.
(d) equilibrium condition.
(e) none of the above

16.39 In a matrix of transition probabilities (where i equals the row number and j equals the column number),
(a) each number represents the conditional probability of being in state j in the next period given that it is currently in state i.
(b) each number represents the probability that if something is in state i, it will go to state j in the next period.
(c) the number in row 3, column 3 represents the probability that something will remain in state 3 from one period to the next.
(d) the probabilities are usually determined empirically.
(e) all of the above

16.40 In a matrix of transition probabilities,
(a) the probabilities for any row will sum to one.
(b) the probabilities for any column will sum to one.
(c) the probabilities for any column are mutually exclusive and collectively exhaustive.
(d) none of the above

16.41 In Markov analysis, to find the vector of state probabilities for any period,
(a) one should find them empirically.
(b) subtract the product of the numbers on the primary diagonal from the product of the numbers on the secondary diagonal.
(c) find the product of the vector of state probabilities for the preceding period and the matrix of transition probabilities.
(d) find the product of the vectors of state probabilities for the two preceding periods.
(e) take the inverse of the fundamental matrix.

16.42 In the long run, in Markov analysis,
(a) all state probabilities will eventually become zeros or ones.
(b) the matrix of transition probabilities will change to an equilibrium state.
(c) generally, the vector of state probabilities, when multiplied by the matrix of transition probabilities, will yield the same vector of state probabilities.
(d) all of the above

16.43 In order to find the equilibrium state in Markov analysis,
(a) it is necessary to know both the vector of state probabilities and the matrix of transition probabilities.
(b) it is necessary only to know the matrix of transition probabilities.
(c) it is necessary only to know the vector of state probabilities for the initial period.
(d) one should develop a table of state probabilities over time and then determine the equilibrium conditions empirically.
(e) none of the above

16.44 In Markov analysis, the absorbing state
(a) refers to the condition whereby something in some state cannot go to any other state in the future.
(b) refers to the condition whereby something in some state cannot go to one particular other state in the future.
(c) means that, for some state, the probability of remaining in that state in the next period is zero.
(d) means that, for some state, the probability of leaving that state for the next period is one.

16.45 In Markov analysis, the fundamental matrix
(a) is necessary to find the equilibrium condition when there are absorbing states.
(b) can be found but requires, in part, partitioning of the matrix of transition probabilities.
(c) is equal to the inverse of the I minus B matrix.
(d) is multiplied by the A matrix in order to find the probabilities that amounts in nonabsorbing states will end up in absorbing states.
(e) all of the above

16.46 If we want to use Markov analysis to study market shares for competitive businesses,
(a) it is an inappropriate study.
(b) simply replace the probabilities with market shares.
(c) it can only accommodate one new business each period.
(d) only constant changes in the matrix of transition probabilities can be handled in the simple model.
(e) none of the above

16.47 Where P is the matrix of transition probabilities, π(4) =
(a) π(3)PPP.
(b) π(3)PP.
(c) π(2)PPP.
(d) π(1)PPP.
(e) none of the above

16.48 The copy machine in an office is very unreliable. If it was working yesterday, there is an 80% chance it will work today. If it was not working yesterday, there is a 10% chance it will work today. What is the probability that it is not working today, if it was not working yesterday?
(a) 0.1
(b) 0.2
(c) 0.8
(d) 0.9
(e) none of the above

16.49 The copy machine in an office is very unreliable. If it was working yesterday, there is an 80% chance it will work today. If it was not working yesterday, there is a 10% chance it will work today. What is the probability it will not work today, if it was working yesterday?
(a) 0.1
(b) 0.2
(c) 0.8
(d) 0.9
(e) none of the above

16.50 The copy machine in an office is very unreliable. If it was working yesterday, there is an 80% chance it will work today.
If it was not working yesterday, there is a 10% chance it will work today. If it is not working today, what is the probability that it will be working 2 days from now?
(a) 0.16
(b) 0.17
(c) 0.34
(d) 0.66
(e) none of the above

16.51 The copy machine in an office is very unreliable. If it was working yesterday, there is an 80% chance it will work today. If it was not working yesterday, there is a 10% chance it will work today. If it is working today, what is the probability that it will be working 2 days from now?
(a) 0.16
(b) 0.64
(c) 0.66
(d) 0.80
(e) none of the above

16.52 Using the data in Table 16-1, determine Company 1's estimated market share in the next period.
(a) 0.10
(b) 0.20
(c) 0.42
(d) 0.47
(e) none of the above

16.53 Using the data in Table 16-1, determine Company 2's estimated market share in the next period.
(a) 0.26
(b) 0.27
(c) 0.28
(d) 0.29
(e) none of the above

16.54 Using the data in Table 16-1, determine Company 3's estimated market share in the next period.
(a) 0.26
(b) 0.27
(c) 0.28
(d) 0.29
(e) none of the above

16.55 Using the data in Table 16-1, and assuming the transition probabilities do not change, in the long run what market share would Company 2 expect to reach? (Rounded to two places.)
(a) 0.30
(b) 0.32
(c) 0.39
(d) 0.60
(e) none of the above

*16.56 The weather is becoming important to you since you would like to go on a picnic today. If it was sunny yesterday, there is a 70% chance it will be sunny today. If it was raining yesterday, there is a 30% chance it will be sunny today. What is the probability it will be sunny today, if it was sunny yesterday?
(a) 0.1
(b) 0.3
(c) 0.7
(d) 0.8
(e) none of the above

*16.57 The weather is becoming important to you since you would like to go on a picnic today. If it was sunny yesterday, there is a 70% chance it will be sunny today. If it was raining yesterday, there is a 30% chance it will be sunny today.
If the probability that it was raining yesterday is 0.25, what is the probability that it will rain today?
(a) 0.1
(b) 0.3
(c) 0.4
(d) 0.7
(e) none of the above

*16.58 The weather is becoming important to you since you would like to go on a picnic today. If it was sunny yesterday, there is a 70% chance it will be sunny today. If it was raining yesterday, there is a 30% chance it will be sunny today. What is the probability it will be sunny today, if it was raining yesterday?
(a) 0.1
(b) 0.2
(c) 0.7
(d) 0.8
(e) none of the above

*16.59 The weather is becoming important to you since you would like to go on a picnic today. If it was sunny yesterday, there is a 65% chance it will be sunny today. If it was raining yesterday, there is a 30% chance it will be sunny today. If the probability that it was raining yesterday is 0.4, what is the probability that it will be sunny today?
(a) 0.650
(b) 0.390
(c) 0.510
(d) 0.490
(e) none of the above

*16.60 Using the data given in Table 16-2, find the market shares for the three retailers in month 2.
(a) π(2) = (0.09, 0.42, 0.49)
(b) π(2) = (0.55, 0.33, 0.12)
(c) π(2) = (0.18, 0.12, 0.70)
(d) π(2) = (0.55, 0.12, 0.33)
(e) none of the above

*16.61 Using the data given in Table 16-2, what will be the market share of the third retailer 5 years from now?
(a) 0.6267
(b) 0.2729
(c) 0.1504
(d) 0.2229
(e) none of the above

The following data are to be used for problems 16.62–16.66: Cuthbert Wylinghauser is a scheduler of transportation for the state of Delirium. This state contains three cities: Chaos (C1), Frenzy (C2), and Tremor (C3). A transition matrix, indicating the probability that a resident of one city will travel to another, is given below. Cuthbert's job is to schedule the required number of seats, one to each person making the trip (transition), on a daily basis.
Transition matrix:

         C    F    T
    C   .8   .1   .1
    F   .1   .7   .2
    T   .2   .2   .6

π(1) = [100, 100, 100]

*16.62 How many seats should Cuthbert schedule for travel from Chaos to Tremor for tomorrow?
(a) 80
(b) 70
(c) 20
(d) 60
(e) none of the above

*16.63 Tomorrow evening, how many people can we expect to find in each city?
(a) Chaos = 90, Frenzy = 110, Tremor = 100
(b) Chaos = 110, Frenzy = 100, Tremor = 90
(c) Chaos = 80, Frenzy = 90, Tremor = 130
(d) Chaos = 100, Frenzy = 130, Tremor = 70
(e) none of the above

*16.64 Find the equilibrium population for Frenzy (round to the nearest whole person).
(a) 126
(b) 95
(c) 79
(d) 100
(e) none of the above

*16.65 During the tenth time period, what percent of the people in Frenzy travel to Chaos?
(a) 0.8
(b) 0.1
(c) 0.6
(d) 0.2
(e) none of the above

*16.66 What is the equilibrium population of Chaos (rounded to the nearest whole person)?
(a) 79
(b) 95
(c) 126
(d) 100
(e) none of the above

PROBLEMS

16.67 A certain utility firm has noticed that a residential customer's bill for one month is dependent upon the previous month's bill. The observations are summarized in the following transition matrix:

    This Month's Change         Next Month's Change
    over Previous Month's     Increase   Same   Decrease
    Increase                     0.1      0.2     0.7
    Same                         0.3      0.4     0.3
    Decrease                     0.5      0.3     0.2

The utility company would like to know the long-run probability that a customer's bill will increase, the probability the bill will stay the same, and the probability the bill will decrease.

16.68 A certain firm has noticed that employees' salaries from year to year can be modeled by Markov analysis. The matrix of transition probabilities follows:

    Salary in                         Salary in Next Year
    Current Year        Remains Unchanged   Receives Raise   Quits   Fired
    Remains Unchanged          0.2               0.4          0.3     0.1
    Receives Raise             0.5               0.0          0.2     0.3

(a) Set up the matrix of transition probabilities in the form:

    | I  0 |
    | A  B |

(b) Determine the fundamental matrix for this problem.
(c) What is the probability that an employee who has received a raise will eventually quit?
(d) What is the probability that an employee who has received a raise will eventually be fired?

16.69 The vector of state probabilities for period n is (0.3, 0.7). The accompanying matrix of transition probabilities is:

    | 0.8  0.2 |
    | 0.1  0.9 |

Calculate the vector of state probabilities for period n+1.

16.70 Given the following matrix of transition probabilities, find the equilibrium state:

    | 0.8  0.2 |
    | 0.4  0.6 |

16.71 Given the following vector of state probabilities and the accompanying matrix of transition probabilities, find the next-period vector of state probabilities:

    (0.2  0.3  0.5)

    | 0.6  0.2  0.2 |
    | 0.1  0.7  0.2 |
    | 0.2  0.3  0.5 |

16.72 There is a 20% chance that any current client of company A will switch to company B this year. There is a 40% chance that any client of company B will switch to company A this year. If these probabilities are stable over the years, and if company A has 400 clients and company B has 300 clients,
(a) How many clients will each company have next year?
(b) How many clients will each company have in two years?

16.73 Over any given month, Hammond Market loses 10% of its customers to Otro Plaza and 20% to Tres Place. Otro Plaza loses 5% to Hammond and 10% to Tres Place. Tres Place loses 5% of its customers to each of the two competitors. At the present time, Hammond Market has 40% of the market, while the others have 30% each.
(a) Next month, what will the market shares be for the three firms?
(b) In two months, what will the market shares be for the three firms?

16.74 The fax machine in an office is very unreliable. If it was working yesterday, there is a 90% chance it will work today. If it was not working yesterday, there is a 5% chance it will work today.
(a) What is the probability that it is not working today, if it was not working yesterday?
(b) If it was working yesterday, what is the probability that it is working today?
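Absorbing-state problems like 16.68 use the standard computation: partition the transition matrix into [[I, 0], [A, B]], form the fundamental matrix N = (I − B)⁻¹, and multiply NA to get the probabilities that something starting in each nonabsorbing state ends up in each absorbing state. A sketch with illustrative 2×2 values (the B and A entries below are assumptions for demonstration only, not the answer key to any problem):

```python
# Illustrative partition of an absorbing chain:
# B = transitions among the nonabsorbing states,
# A = transitions from nonabsorbing states into absorbing states.
B = [[0.2, 0.4],
     [0.5, 0.0]]
A = [[0.3, 0.1],
     [0.2, 0.3]]

def inv2(m):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det],
            [-c / det, a / det]]

def matmul(x, y):
    return [[sum(x[i][k] * y[k][j] for k in range(len(y)))
             for j in range(len(y[0]))] for i in range(len(x))]

I_minus_B = [[1 - B[0][0],    -B[0][1]],
             [   -B[1][0], 1 - B[1][1]]]
N = inv2(I_minus_B)   # fundamental matrix N = (I - B)^-1
F = matmul(N, A)      # F[i][j]: P(start nonabsorbing state i, absorb in j)
```

Each row of F sums to one, since everything eventually reaches some absorbing state.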
16.75 There is a 30% chance that any current client of company A will switch to company B this year. There is a 20% chance that any client of company B will switch to company A this year. If these probabilities are stable over the years, and if company A has 1000 clients and company B has 1000 clients, in the long run (assuming the probabilities do not change), what will the market shares be?

16.76 Three fast-food hamburger restaurants are competing for the college lunch crowd. Burger Bills has 40% of the market, while Hungry Heifer and Salty Sams each have 30% of the market. Burger Bills loses 10% of its customers to Hungry Heifer and 10% to Salty Sams each month. Hungry Heifer loses 5% of its customers to Burger Bills and 10% to Salty Sams each month. Salty Sams loses 10% of its customers to Burger Bills, while 20% go to Hungry Heifer. What will the market shares be for the three businesses next month?

SHORT ANSWER/ESSAY

16.77 What does the matrix of transition probabilities show with respect to a system being studied?

16.78 Define what is meant by a state probability.

16.79 Describe the situation of the existence of an equilibrium condition in a Markov analysis.

16.80 Given the following matrix of transition probabilities, write three equations that, when solved, will give the equilibrium state values:

    P = | a  b |
        | c  d |

SHORT ANSWER/FILL IN THE BLANK

16.81 List the six assumptions of Markov analysis.
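For equilibrium questions such as 16.70 and 16.80, the long-run vector satisfies π = πP together with the state probabilities summing to one. As an illustrative sketch (using the 2×2 matrix from problem 16.70), equilibrium can also be approximated by repeated multiplication, since the state vector stops changing once equilibrium is reached:

```python
# Transition matrix from problem 16.70 (each row sums to 1).
P = [[0.8, 0.2],
     [0.4, 0.6]]

def next_period(pi, P):
    """Return pi(n+1) = pi(n) P."""
    return [sum(pi[i] * P[i][j] for i in range(len(pi)))
            for j in range(len(P[0]))]

# Start from any valid state vector and iterate until it settles.
pi = [0.5, 0.5]
for _ in range(200):
    pi = next_period(pi, P)
# pi now approximates the equilibrium vector satisfying pi = pi P.
```

The same answer follows algebraically from π1 = 0.8π1 + 0.4π2 and π1 + π2 = 1, which is the system of equations problem 16.80 asks for in symbolic form.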