
In the name of GOD.
Sharif University of Technology
Stochastic Processes CE 695 Fall 2015 Dr. H.R. Rabiee
Homework 6 (Markov Chains and Hidden Markov Models)
(100+10 points)
1. (10 pts) Consider the Markov chain given in Figure 1.
Figure 1:
(a) Find the transition probability matrix P.
(b) Under which conditions is this chain irreducible and aperiodic?
(c) Compute the limiting probability vector π.
(d) Find the average recurrence time of state 2.
(e) Find values of p and α for which the limiting vector is uniform (π1 = π2 = π3 ).
(f) Give an interpretation of the obtained results.
2. (10 pts) We have a stochastic process Y_n = ∑_{i=0}^{n} X_i , built from independent coin-flipping experiments X_i . The probability that the coin lands on heads is Pr[X = +1] = 0.6. Compute the following:
(a) Show that Y_n is a Markov chain (given n ≥ 0 and initial condition Y_0 = 0).
(b) The n-step transition probability matrices P_{i,j}^{(n)} for all n.
(c) n-step state probabilities π(n).
(d) steady state vector π.
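As a sanity check for parts (b) and (c): if tails is taken to mean X = −1 (an assumption; the text only specifies the heads case), then the number of heads H in n flips is Binomial(n, 0.6) and Y_n = 2H − n, which gives the n-step state probabilities directly. A minimal Python sketch:

```python
from math import comb

def state_probs(n, p=0.6):
    # Distribution of Y_n when heads means X = +1 and tails means X = -1
    # (the tails value is an assumption; the text only specifies heads).
    # With H ~ Binomial(n, p) heads in n flips, Y_n = H - (n - H) = 2H - n.
    return {2 * k - n: comb(n, k) * p ** k * (1 - p) ** (n - k)
            for k in range(n + 1)}

pi2 = state_probs(2)   # two-step state probabilities starting from Y_0 = 0
```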
3. (5 pts) Three tanks fight a three-way duel. Tank A has probability 1/2 of destroying the tank at which it fires, tank B has probability 1/3, and tank C has probability 1/6. The tanks fire simultaneously, and each tank fires at the
strongest opponent not yet destroyed. Form a Markov chain by taking as
states the subsets of the set of tanks. Find the expected number of steps
before the chain is absorbed.
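For any finite absorbing chain, the expected number of steps to absorption can be read off the fundamental matrix N = (I − Q)^{-1}, where Q is the transient-to-transient block of the transition matrix: the vector of expected absorption times is t = N·1. A minimal Python sketch on a toy 2-transient-state Q (illustrative numbers only, not the duel's actual chain, whose states are subsets of tanks):

```python
import numpy as np

# Expected steps to absorption: t = (I - Q)^(-1) * 1, where Q is the
# transient-to-transient block of the transition matrix (fundamental
# matrix method). The Q below is a toy example with made-up numbers.
Q = np.array([[0.5, 0.2],
              [0.3, 0.4]])
t = np.linalg.solve(np.eye(2) - Q, np.ones(2))   # expected absorption times
```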
4. (5 pts) Consider a complete graph with n vertices (a graph in which there
is an edge between each pair of vertices). A person starts his random walk
from vertex 1. At each step, he moves to one of the adjacent vertices, each with probability 1/(n − 1). If S is the random variable representing the time it takes to visit all vertices, find E[S].
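One way to approach E[S] is the coupon-collector argument: when k of the other n − 1 vertices are still unvisited, each step lands on a fresh vertex with probability k/(n − 1), so the waits are geometric. A Python sketch that compares the resulting closed form with a seeded simulation (the simulation is only a sanity check, not part of the required derivation):

```python
import random

def expected_cover_time(n):
    # When k of the n - 1 other vertices are still unvisited, each step hits
    # a fresh one with probability k / (n - 1), so the wait is geometric with
    # mean (n - 1) / k; summing over k = n - 1, ..., 1 gives (n - 1) * H_{n-1}.
    return sum((n - 1) / k for k in range(1, n))

def simulate_cover_time(n, trials=20000, seed=0):
    # Seeded Monte Carlo sanity check of the closed form above.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        visited, cur, steps = {0}, 0, 0
        while len(visited) < n:
            cur = rng.choice([v for v in range(n) if v != cur])
            visited.add(cur)
            steps += 1
        total += steps
    return total / trials
```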
5. (6 pts) Prove the following theorems:
(a) In an irreducible Markov chain, either all states are transient, all
states are recurrent null, or all states are recurrent positive.
(b) In a finite-state Markov chain, all recurrent states are positive, and
it is impossible that all states are transient. If the Markov chain is
also irreducible, then it has no transient states.
(c) In a Markov chain, the states can be divided, in a unique manner,
into irreducible sets of recurrent states and a set of transient states.
6. (10 pts) We define a threshold queue with parameter T as follows: When
the number of jobs is < T , then the number of jobs decreases by 1 with
probability 0.4 and increases by 1 with probability 0.6 at each time step.
However, when the number of jobs is > T , the reverse is true: the number of jobs increases by 1 with probability 0.4 and
decreases by 1 with probability 0.6 at each time step, as shown in Figure 2.
Figure 2:
(a) Assume that the limiting probabilities exist. Use the stationary equations to derive the limiting probability distribution as a function of T , for an arbitrary threshold T .
(b) Compute the mean number of jobs, E[N ], in a threshold queue as a
function of T .
(c) What happens to E[N ] when T = 0? Does this answer make sense?
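Since the chain is birth-death, the stationary distribution can be checked numerically via detailed balance, π_i p_up(i) = π_{i+1} p_down(i+1). The boundary rule at state T itself is not fixed by the text above, so the Python sketch below assumes states i ≤ T use "up with 0.6" and states i > T use "up with 0.4"; adjust to match Figure 2:

```python
# Birth-death detailed balance: pi[i] * p_up(i) = pi[i+1] * p_down(i+1).
# Assumed convention (the text leaves state T ambiguous): states i <= T use
# "up with 0.6", states i > T use "up with 0.4". Adjust to match Figure 2.

def stationary(T, cutoff=200):
    pi = [1.0]
    for i in range(cutoff):
        p_up = 0.6 if i <= T else 0.4
        p_down = 0.4 if i + 1 <= T else 0.6
        pi.append(pi[-1] * p_up / p_down)
    Z = sum(pi)               # tail beyond the cutoff is geometric, negligible
    return [x / Z for x in pi]

def mean_jobs(T):
    # E[N] under the truncated stationary distribution
    return sum(i * p for i, p in enumerate(stationary(T)))
```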
7. (10 pts) A child jumps around the three vertices of a triangular playground. At each vertex i, he jumps clockwise with probability pi and counterclockwise with probability qi = 1 − pi .
(a) Find the proportion of time that he is at each vertex.
(b) How often does he make a counterclockwise move followed by 5 consecutive clockwise moves?
8. (Extra point: 10 pts) In chess, a rook can move either horizontally within its row (left or right) or vertically within its column (up or down) any number of squares. On an 8 × 8 chessboard, imagine a rook that starts at the lower left corner of the board. At each move, a bored child decides
to move the rook to a random legal location (assume that the move cannot
involve staying still). Let T denote the time until the rook first lands in
the upper right corner of the board. Compute E[T] and Var(T).
9. (4 pts) In each of the following items, is an HMM a suitable tool for modeling the data? If it is suitable, determine the observations and the probable latent variables in each case:
(a) Weather conditions data (temperature, ...) in days of a year.
(b) Instances of handwritten digits observed by a light pen.
(c) Stock prices of companies during a day.
(d) The stream of transactions of a bank which may be fraudulent or
not.
10. (10 pts) Consider the HMM shown in Figure 3.
ω0 is an absorbing state and v0 is a unique symbol emitted at this state.
(a) Suppose that the initial state at t = 0 is ω1 . Starting from t = 1,
what is the probability that the HMM generates the sequence V =
v2 , v1 , v0 ?
(b) Given the above sequence V , what is the most probable sequence of
states?
11. (10 pts) We have m baskets and M balls distributed among them. Repeatedly, we choose one of the balls at random and place it in one of the other m − 1 baskets, chosen at random. This gives a Markov chain whose states are the numbers of balls in each of the m baskets.
(a) Find the limiting probabilities for this Markov chain.
(b) Show that the Markov chain is time reversible.
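For small m and M, both parts can be verified numerically: enumerate the occupancy vectors, build the transition matrix, solve for π, and check detailed balance π_i P_{ij} = π_j P_{ji}. A Python sketch of such a check (the small instance m = 2, M = 3 is chosen for illustration):

```python
import itertools
import numpy as np

def ball_chain(m, M):
    # States are occupancy vectors (n_1, ..., n_m) with n_1 + ... + n_m = M.
    states = [s for s in itertools.product(range(M + 1), repeat=m) if sum(s) == M]
    idx = {s: i for i, s in enumerate(states)}
    P = np.zeros((len(states), len(states)))
    for s in states:
        for j, nj in enumerate(s):
            if nj == 0:
                continue
            for k in range(m):
                if k != j:
                    t = list(s)
                    t[j] -= 1
                    t[k] += 1
                    # pick one of the M balls (prob nj/M), then one of the
                    # other m - 1 baskets uniformly at random
                    P[idx[s], idx[tuple(t)]] += (nj / M) / (m - 1)
    return states, P

states, P = ball_chain(2, 3)          # small instance: 2 baskets, 3 balls
n = len(states)
# Stationary distribution: solve pi P = pi together with sum(pi) = 1.
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.zeros(n + 1)
b[-1] = 1.0
pi = np.linalg.lstsq(A, b, rcond=None)[0]
```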
Figure 3:
12. (10 pts) Suppose you were locked in a room for several days, and you were
asked about the weather outside. The only piece of evidence you have is
whether the person who comes into the room carrying your daily meal is
carrying an umbrella or not. Suppose the probabilities given in Figures 4 and 5. On the day you were locked in, it was sunny. The next day,
the caretaker carried an umbrella into the room. Assuming that the prior
probability of the caretaker carrying an umbrella on any day is 0.5, what’s
the probability that the second day was rainy?
Figure 4: Transition probabilities
Figure 5: Observation probabilities
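The calculation is a single Bayesian filtering step. The numeric entries of Figures 4 and 5 are not reproduced here, so every value in the Python sketch below is a hypothetical placeholder to be replaced by the ones in the figures:

```python
# Placeholder numbers -- every value below is hypothetical; substitute the
# real transition and observation probabilities from Figures 4 and 5.
p_rain_given_sunny = 0.3    # hypothetical transition probability
p_sunny_given_sunny = 0.7   # hypothetical (row must sum to 1)
p_umb_given_rain = 0.9      # hypothetical observation probability
p_umb_given_sunny = 0.2     # hypothetical

# Day 1 is sunny with certainty; one Bayes step on the day-2 observation:
# P(rain_2 | umbrella_2) = P(umb | rain) P(rain | sunny) / P(umbrella_2)
num = p_umb_given_rain * p_rain_given_sunny
den = num + p_umb_given_sunny * p_sunny_given_sunny
p_rain2 = num / den
```

If the problem's flat 0.5 umbrella prior is meant to serve as the evidence term, replace the computed `den` with 0.5.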
13. (10 pts) MATLAB Programming: It is well known that a DNA sequence is a series of components from {A, C, G, T}. Now let's assume there is one hidden variable S that controls the generation of the DNA sequence. S takes two possible states, S1 and S2. For this HMM, named M, we have:
* Transition probabilities:
P (S1 |S1 ) = 0.8, P (S2 |S1 ) = 0.2, P (S1 |S2 ) = 0.2, P (S2 |S2 ) = 0.8
* Emission probabilities:
P (A|S1 ) = 0.4, P (C|S1 ) = 0.1, P (G|S1 ) = 0.4, P (T |S1 ) = 0.1,
P (A|S2 ) = 0.1, P (C|S2 ) = 0.4, P (G|S2 ) = 0.1, P (T |S2 ) = 0.4
* Start probabilities:
P (S1 ) = 0.5, P (S2 ) = 0.5
We observed the sequence x = CGT CAG. Answer the following questions by coding in MATLAB and report the results in your assignment file. You should submit both your code and your report files.
(a) Find P (x|M ) using Forward algorithm.
(b) Find P (πi = S1 |x, M ) for i = 1, . . . , 6. (Hint: use the Backward algorithm.)
(c) Find the most likely path of hidden states using Viterbi algorithm.
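Since all parameters of M are given above, results can be cross-checked against a brute-force enumeration over all 2^6 hidden paths. A compact Python reference sketch of parts (a) and (c) (the assignment itself asks for MATLAB):

```python
import itertools

states = ["S1", "S2"]
start = {"S1": 0.5, "S2": 0.5}
trans = {"S1": {"S1": 0.8, "S2": 0.2}, "S2": {"S1": 0.2, "S2": 0.8}}
emit = {"S1": {"A": 0.4, "C": 0.1, "G": 0.4, "T": 0.1},
        "S2": {"A": 0.1, "C": 0.4, "G": 0.1, "T": 0.4}}
x = "CGTCAG"

def forward(x):
    # alpha recursion; returns P(x | M)
    a = {s: start[s] * emit[s][x[0]] for s in states}
    for o in x[1:]:
        a = {s: sum(a[r] * trans[r][s] for r in states) * emit[s][o]
             for s in states}
    return sum(a.values())

def viterbi(x):
    # delta recursion with path bookkeeping; returns (best prob, best path)
    d = {s: (start[s] * emit[s][x[0]], [s]) for s in states}
    for o in x[1:]:
        d = {s: max(((d[r][0] * trans[r][s] * emit[s][o], d[r][1] + [s])
                     for r in states), key=lambda t: t[0])
             for s in states}
    return max(d.values(), key=lambda t: t[0])

def brute_force(x):
    # Sum of joint probabilities over all |states|^len(x) hidden paths.
    total = 0.0
    for path in itertools.product(states, repeat=len(x)):
        p = start[path[0]] * emit[path[0]][x[0]]
        for t in range(1, len(x)):
            p *= trans[path[t - 1]][path[t]] * emit[path[t]][x[t]]
        total += p
    return total

likelihood = forward(x)
best_prob, best_path = viterbi(x)
```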