Week 10 - HMM

CS188 Discussion Note Week 10: Hidden Markov Model
By Nuttapong Chentanez
Goal: Want to reason about a sequence of observations
- Speech recognition, Robot localization, Medical monitoring,
Human motion synthesis
- An HMM is defined by
- Initial distribution: P(X1)
- Transitions: P(Xt | Xt-1)
- Emissions: P(Et | Xt)
- Independences: 1. The future depends only on the present
2. The observation depends only on the current state
Example
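A minimal concrete instantiation of these three pieces, using the rain/umbrella setup that the filtering example below refers to (P(R2 | U1, U2)). The specific probabilities are assumed for illustration, not values given in this note:

    # A minimal HMM for the rain/umbrella example: state Xt = "is it raining on
    # day t?", evidence Et = "was an umbrella seen on day t?".
    # All numbers below are illustrative assumptions, not values from the note.

    # Initial distribution P(X1)
    prior = {True: 0.5, False: 0.5}

    # Transitions P(Xt | Xt-1), indexed as transition[yesterday][today]
    transition = {
        True:  {True: 0.7, False: 0.3},
        False: {True: 0.3, False: 0.7},
    }

    # Emissions P(Et | Xt), indexed as emission[rain][umbrella]
    emission = {
        True:  {True: 0.9, False: 0.1},
        False: {True: 0.2, False: 0.8},
    }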
Markov model
- A chain-structured BN
- Each node has the same CPT (stationarity)
- The value of a node is called the state
- Need P(X1) and P(Xt | Xt-1)
- The future and the past are independent given the present
Mini-forward algorithm
- What is the probability of being in state x at time t?
- Brute force: sum over all state sequences that end in x at time t (exponentially many)
- Better way: cache the values P(Xt) and push them forward one step at a time:
P(Xt = x) = sum over x' of P(Xt = x | Xt-1 = x') P(Xt-1 = x')
Example: say P(R0) = <0.5, 0.5>
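A short sketch of the mini-forward update for this example. Only P(R0) = <0.5, 0.5> comes from the note; the transition table is an assumed placeholder:

    # Mini-forward algorithm: instead of summing over all state sequences,
    # cache P(Xt) and push it forward one step at a time.
    # Only P(R0) = <0.5, 0.5> is from the note; the transition table is assumed.

    transition = {
        True:  {True: 0.7, False: 0.3},   # assumed P(rain today | rain yesterday)
        False: {True: 0.3, False: 0.7},
    }

    def mini_forward(prior, transition, t):
        """Return P(Xt) by applying the one-step transition update t times."""
        belief = dict(prior)
        for _ in range(t):
            belief = {
                curr: sum(transition[prev][curr] * belief[prev] for prev in belief)
                for curr in belief
            }
        return belief

    print(mini_forward({True: 0.5, False: 0.5}, transition, 2))   # P(R2), no evidence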
Filtering/monitoring – computing belief state
- Given the current belief state, how do we update it when new evidence arrives?
Example
P(R2 | U1, U2)
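A sketch of the filtering update for P(R2 | U1, U2): a time-elapse (transition) step, then an observation step, then renormalization. The transition and emission tables are the same assumed ones as above, not values from the note:

    # Filtering sketch for P(R2 | U1, U2), with assumed model parameters.
    transition = {True: {True: 0.7, False: 0.3}, False: {True: 0.3, False: 0.7}}
    emission   = {True: {True: 0.9, False: 0.1}, False: {True: 0.2, False: 0.8}}

    def filter_step(belief, umbrella_seen):
        # Time elapse: B'(xt) = sum over x(t-1) of P(xt | x(t-1)) B(x(t-1))
        predicted = {
            curr: sum(transition[prev][curr] * belief[prev] for prev in belief)
            for curr in belief
        }
        # Observe: B(xt) is proportional to P(et | xt) B'(xt)
        unnormalized = {x: emission[x][umbrella_seen] * predicted[x] for x in predicted}
        z = sum(unnormalized.values())
        return {x: p / z for x, p in unnormalized.items()}

    belief = {True: 0.5, False: 0.5}      # P(R0)
    for u in [True, True]:                # umbrella seen on days 1 and 2
        belief = filter_step(belief, u)
    print(belief)                         # P(R2 | U1, U2)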
- Starting from an initial observation of sun, the longer we simulate forward,
the more the uncertainty accumulates
- For most chains, we end up with a stationary distribution
regardless of the initial distribution
- Therefore, a Markov chain is only useful for predicting a short time ahead
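A quick way to see the stationary-distribution claim numerically: repeatedly apply the transition update starting from "definitely sun" and watch the belief settle. The sun/rain transition probabilities are again only illustrative:

    # Repeatedly applying the (assumed) transition update washes out the
    # initial distribution and converges to the stationary distribution.
    transition = {
        "sun":  {"sun": 0.9, "rain": 0.1},
        "rain": {"sun": 0.3, "rain": 0.7},
    }

    def step(belief):
        return {c: sum(transition[p][c] * belief[p] for p in belief) for c in belief}

    belief = {"sun": 1.0, "rain": 0.0}    # initial observation of sun
    for _ in range(30):
        belief = step(belief)
    print(belief)   # close to the stationary distribution, here <0.75 sun, 0.25 rain>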
Hidden Markov Model (HMM)
- A Markov chain alone is not so useful in practice, because there are no
observations
- An HMM is a Markov chain with an observed output (effect) at each
time step
Viterbi Algorithm
- What is the most likely state sequence given observations?
- Could brute-force over all state sequences, but that is exponentially slow; Viterbi uses dynamic programming
Example
- Observation sequence: [true, true, false, true, true]
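A sketch of Viterbi on this observation sequence: it is the same recursion as filtering, but with a max over predecessors instead of a sum, plus backpointers to recover the best sequence. Only the observation list comes from the note; the model parameters are assumed:

    # Viterbi sketch for the umbrella observations above (assumed parameters).
    states = [True, False]                    # rain / no rain
    prior = {True: 0.5, False: 0.5}
    transition = {True: {True: 0.7, False: 0.3}, False: {True: 0.3, False: 0.7}}
    emission = {True: {True: 0.9, False: 0.1}, False: {True: 0.2, False: 0.8}}

    def viterbi(observations):
        # best[x] = probability of the best state sequence ending in x so far
        best = {x: prior[x] * emission[x][observations[0]] for x in states}
        backpointers = []
        for e in observations[1:]:
            back, new_best = {}, {}
            for curr in states:
                # Take the max (not the sum, as in filtering) over predecessors.
                prev_star = max(states, key=lambda p: best[p] * transition[p][curr])
                back[curr] = prev_star
                new_best[curr] = best[prev_star] * transition[prev_star][curr] * emission[curr][e]
            best = new_best
            backpointers.append(back)
        # Follow backpointers from the best final state to recover the sequence.
        path = [max(states, key=lambda x: best[x])]
        for back in reversed(backpointers):
            path.append(back[path[-1]])
        return list(reversed(path))

    print(viterbi([True, True, False, True, True]))   # most likely rain sequence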