Question: Markov chain

In many applications, some variables evolve over time in a random (stochastic) way: even if you know everything up to time t, it is impossible to know for sure which values the variables will take in the future. Stochastic processes are mathematical models that describe such phenomena when the randomness is driven by something outside the control of the relevant decision makers. For example, stock prices can be (and usually are) modeled as stochastic processes, since it is difficult for an individual investor to affect them. In a chess game, by contrast, the uncertainty in your opponent's future moves is not modeled as a stochastic process: those moves are made by a decision maker whose goal is to defeat you, and they may be adjusted in response to your own moves.

One simple and widely used class of stochastic processes is the class of Markov chains. In this question, we study Markov chains on a finite state space. There is a sequence of random variables x_0, x_1, ..., x_t, ..., each taking values in a finite set S called the state space. The subscripts are interpreted as time: given an integer time t, the values x_0, ..., x_t are assumed to be known at that time, while x_{t+1}, x_{t+2}, ... remain random. For convenience, we label states with positive integers: S = {1, ..., n}, where n is the number of possible states.
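As a concrete illustration of this setup, here is a minimal simulation sketch. The 3-state chain below is entirely made up for illustration; its transition mechanism (each next state drawn from a distribution that depends only on the current state) anticipates the Markov assumption introduced in Part (b).

```python
import numpy as np

rng = np.random.default_rng(0)

# A made-up 3-state example (S = {1, 2, 3} in the text's labeling; we use
# indices 0, 1, 2 here). Column i holds the distribution of the next state
# when the current state is i, so the next state depends only on the
# current one.
P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.1],
              [0.2, 0.2, 0.6]])

def simulate(P, x0, steps, rng):
    """Draw a trajectory x_0, x_1, ..., x_steps of the chain."""
    xs = [x0]
    for _ in range(steps):
        # Sample the next state from the column of P indexed by the
        # current state.
        xs.append(int(rng.choice(len(P), p=P[:, xs[-1]])))
    return xs

trajectory = simulate(P, x0=0, steps=10, rng=rng)
```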

(a) Is S a good choice of sample space for describing this stochastic process? Whatever your answer is, for the rest of the question we assume that a good sample space Ω (which might be S if your answer is "yes") has been chosen to carry the random variables.

(b) The fundamental assumption of Markov chains is that given x_0, ..., x_t, the probability that x_{t+1} = j is p_{ji}, where i is the value of x_t. This holds for every t = 0, 1, ..., and the numbers p_{ji} are independent of t. More precisely, for any fixed t and the event A that x_0 = a_0, ..., x_t = a_t (where a_0, a_1, ..., a_t are integers in S), P(x_{t+1}^{-1}(j) ∩ A) = P(A) p_{j,a_t}; this holds for all a_0, a_1, ..., a_t. (The so-called Markov property means that the value of x_{t+1} depends only on x_t and not on the history further back.)

Let P be the n × n matrix whose (j, i) entry is p_{ji} for all i, j ∈ S; it is called the transition matrix of the Markov chain. Show that, for the probabilities to be well defined for all laws of x_0, each column of P must sum up to one.
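The column-sum condition can be checked numerically. This is only a sanity-check sketch on a hypothetical 3-state matrix, not a substitute for the proof the question asks for:

```python
import numpy as np

# Hypothetical transition matrix: the (j, i) entry is p_{ji}, the
# probability of moving from state i to state j, so column i is the
# conditional law of x_{t+1} given x_t = i.
P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.1],
              [0.2, 0.2, 0.6]])

# From any current state the chain must land somewhere in S, so the
# entries of each column (a conditional probability distribution)
# sum to one.
column_sums = P.sum(axis=0)
assert np.allclose(column_sums, 1.0)
```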

(c) A law on the state space S can be represented by an n × 1 matrix (column vector) whose (i, 1) entry is the probability of {i}. A function f : S → R can be represented by a 1 × n matrix whose (1, i) entry is the number f(i). (Notice that f is not a random variable unless your answer to Part (a) is "yes".) If the law of x_0 is v_0, then what is the interpretation of f v_0 (matrix product)?

(d) In what follows, fix a law v_0 of x_0 and a function f : S → R. What is the law of x_1?

(e) Notice that fP is a 1 × n matrix, so it represents a function on S. What does that function mean intuitively? (As we can s
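To build intuition for how these matrix products fit together before answering (c)-(e), one can compute f P v_0 and cross-check it against a direct simulation of the chain. Everything below (the matrix, the function f, the initial law v_0) is a made-up illustration, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up 3-state chain: column i of P is the law of x_{t+1} given x_t = i.
P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.1],
              [0.2, 0.2, 0.6]])
f = np.array([[1.0, 4.0, 9.0]])       # a function f : S -> R, as a 1 x n row
v0 = np.array([[0.2], [0.5], [0.3]])  # a law of x_0, as an n x 1 column

# The shapes compose: P @ v0 is n x 1, f @ P is 1 x n, and
# f @ P @ v0 is a single number.
exact = (f @ P @ v0).item()

# Monte Carlo cross-check: draw x_0 from v0, take one step of the chain,
# and average f(x_1) over many runs.
n = 100_000
x0 = rng.choice(3, size=n, p=v0.ravel())
x1 = np.array([rng.choice(3, p=P[:, i]) for i in x0])
estimate = f.ravel()[x1].mean()

assert abs(exact - estimate) < 0.05
```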
