Probabilistic Graphical Model (PGM), HMM, and Kalman Filter

by allenlu2007

The previous post introduced the Hidden Markov Model (HMM) and showed that the Kalman filter is a special case of it (the linear dynamic state-space Gaussian model). This post reviews the HMM and the Kalman filter from the PGM point of view.


PGMs are generally divided into directed graphs and undirected graphs.

A Bayesian Network (BN) is a directed acyclic graph (DAG), i.e. a network whose edges are directed but which contains no cycles.

Of particular interest is the Dynamic Bayesian Network (DBN), of which the Hidden Markov Model is a special case.

PGM –> BN (DAG) –> { DBN (2TBN: 2-time-slice BN) –> HMM }


The structures of the DBN and the HMM enable recursive Bayes filtering; the two differ only in details. See the previous post or Wikipedia.

From this structure one can directly derive the Kalman filter, the particle filter, etc.
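As a concrete illustration of that recursive structure, here is a minimal predict/update sketch of the Kalman filter in Python (NumPy). The toy 1-D random-walk model and all noise parameters below are hypothetical, chosen only to show the recursion.

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of the Kalman filter.

    x, P : prior state mean and covariance
    z    : new measurement
    F, Q : state-transition matrix and process-noise covariance
    H, R : observation matrix and measurement-noise covariance
    """
    # Predict: propagate the state through the linear dynamics.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: fold in the measurement via the Kalman gain.
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Hypothetical 1-D random walk: noisy observations of a roughly constant state.
F = np.array([[1.0]]); Q = np.array([[1e-4]])
H = np.array([[1.0]]); R = np.array([[0.5]])
x, P = np.array([0.0]), np.array([[1.0]])
for z in [0.9, 1.1, 1.0, 0.95]:
    x, P = kalman_step(x, P, np.array([z]), F, Q, H, R)
```

Each iteration is exactly one step of the recursive Bayes filter: the predict step applies the transition model, and the update step conditions on the new observation.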


So what are BNs, or other PGMs such as Markov Random Fields, actually good for? They do not seem to yield more general results.

Answer: a DBN or HMM admits recursive formulas only because of its special time-unrolled structure. In general, algorithms on a BN do not produce recursive formulas.

So what algorithms does a BN provide, and what computations can they perform?

1. Independence checks

2. Conditional probability inference or MAP inference, by variable elimination, dynamic programming, or optimization (exact)

3. Belief propagation (approximate in general)

4. Sampling-based inference such as MCMC (statistical)
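To make item 2 concrete, here is a minimal sketch of variable elimination on a hypothetical three-node chain BN X1 -> X2 -> X3 with binary variables; the CPT numbers are made up for illustration, and the result is checked against brute-force enumeration.

```python
import numpy as np

# Hypothetical chain BN: X1 -> X2 -> X3, all binary.
p_x1 = np.array([0.6, 0.4])        # P(X1)
p_x2_x1 = np.array([[0.7, 0.3],    # P(X2 | X1=0)
                    [0.2, 0.8]])   # P(X2 | X1=1)
p_x3_x2 = np.array([[0.9, 0.1],    # P(X3 | X2=0)
                    [0.4, 0.6]])   # P(X3 | X2=1)

# Variable elimination: sum out X1 first, then X2.
tau1 = p_x1 @ p_x2_x1      # tau1(x2) = sum_x1 P(x1) P(x2|x1)
p_x3 = tau1 @ p_x3_x2      # P(x3)    = sum_x2 tau1(x2) P(x3|x2)

# Brute-force check: full joint, then marginalize.
joint = p_x1[:, None, None] * p_x2_x1[:, :, None] * p_x3_x2[None, :, :]
```

Elimination replaces the exponential-size joint with a sequence of small factor products, which is why it is exact yet tractable on chains and trees.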


A correction to the above: belief propagation on a clique tree (junction tree) recovers the HMM and Kalman filter formulas.

Belief propagation on a chain, with forward and backward message passing, yields the HMM formulas.
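A minimal sketch of this forward-backward message passing on a toy two-state HMM (all parameters hypothetical): the forward messages alpha and the backward messages beta combine into the posterior marginals, which are exactly the classical HMM smoothing formulas.

```python
import numpy as np

# Hypothetical two-state HMM.
A  = np.array([[0.9, 0.1], [0.2, 0.8]])   # transition P(z_t | z_{t-1})
B  = np.array([[0.8, 0.2], [0.3, 0.7]])   # emission   P(x_t | z_t)
pi = np.array([0.5, 0.5])                  # initial distribution
obs = [0, 0, 1]                            # observed symbols

# Forward messages: alpha_t(z) = P(x_1..x_t, z_t).
alpha = [pi * B[:, obs[0]]]
for x in obs[1:]:
    alpha.append((alpha[-1] @ A) * B[:, x])

# Backward messages: beta_t(z) = P(x_{t+1}..x_T | z_t).
beta = [np.ones(2)]
for x in reversed(obs[1:]):
    beta.insert(0, A @ (B[:, x] * beta[0]))

# Posterior marginals: gamma_t(z) = P(z_t | x_1..x_T).
gammas = [a * b / (a * b).sum() for a, b in zip(alpha, beta)]
```

The alpha recursion is the forward pass of sum-product BP along the chain, and the beta recursion is the backward pass; their product at any node gives the same marginal, as the consistency of the total likelihood across time steps shows.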

Variable elimination == belief propagation on a clique tree!!


More concretely:

Shafer-Shenoy message passing for the HMM





Belief propagation with MAP (max-product) inference: the Viterbi algorithm?


BP provides the exact solution when there are no loops in the graph (e.g. a chain or a tree), and is equivalent to dynamic programming / Viterbi in these cases. Otherwise, “loopy” BP provides an approximate (but often good) solution.


In the chain case, the following algorithms are equivalent:

Belief propagation (in its max-product form)

The Viterbi algorithm
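The equivalence is easy to see in code: replacing the sums of the forward recursion with maxima (plus backpointers) turns sum-product BP into Viterbi. A minimal sketch on a toy two-state HMM with hypothetical parameters:

```python
import numpy as np

# Hypothetical two-state HMM.
A  = np.array([[0.9, 0.1], [0.2, 0.8]])   # transition P(z_t | z_{t-1})
B  = np.array([[0.8, 0.2], [0.3, 0.7]])   # emission   P(x_t | z_t)
pi = np.array([0.5, 0.5])                  # initial distribution
obs = [0, 0, 1, 1]                         # observed symbols

# Viterbi = max-product BP on the chain, with backpointers.
delta = pi * B[:, obs[0]]          # best path score ending in each state
back = []
for x in obs[1:]:
    scores = delta[:, None] * A    # scores[i, j]: best path into i, then i -> j
    back.append(scores.argmax(axis=0))
    delta = scores.max(axis=0) * B[:, x]

# Trace the backpointers to recover the most likely state sequence.
path = [int(delta.argmax())]
for bp in reversed(back):
    path.insert(0, int(bp[path[0]]))
```

With these parameters the first two observations favor state 0 and the last two favor state 1, so the decoded path switches states halfway through.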




Is loopy BP good or bad?