Consider the Markov chain in Example 3.10.2 with initial probability vector v = (1/2, 1/2).
a. Find the probability vector specifying the probabilities of the states at time n = 2.
b. Find the two-step transition matrix.
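The two quantities are related: the two-step transition matrix is P², and the probability vector at time n = 2 is v P². Example 3.10.2's transition matrix is not reproduced here, so the sketch below uses a hypothetical two-state matrix P purely to illustrate the computation:

```python
import numpy as np

# Hypothetical one-step transition matrix (NOT the one from Example 3.10.2,
# which is not shown here); each row must sum to 1.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Initial probability vector from the problem statement.
v = np.array([0.5, 0.5])

# b. Two-step transition matrix: P squared.
P2 = P @ P

# a. Probability vector at time n = 2: v P^2.
v2 = v @ P2

print(P2)
print(v2)
```

With the actual matrix from Example 3.10.2 substituted for P, the same two lines (`P @ P` and `v @ P2`) give both answers.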