Some Concepts from Hinton's Deep Learning

by allenlu2007

 

Hinton talk and paper

The biggest problems with traditional neural-network learning (forward prop and back prop) based on a DAG (directed acyclic graph):

1. Slow  –>  Can be mitigated by parallel computing, but still slow

2. Easily gets stuck in a local min/max
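The local-minimum problem can be seen even in one dimension. A minimal sketch (not from the original post; the loss function and learning rate are my own illustrative choices): plain gradient descent on the non-convex function f(x) = x^4 - 3x^2 + x, which has a shallow local minimum and a deeper global one.

```python
# Hypothetical example: gradient descent stuck in a local minimum.
# f has a local minimum near x = 1.13 and the global minimum near x = -1.30.

def f(x):
    return x**4 - 3*x**2 + x

def grad(x):
    return 4*x**3 - 6*x + 1

x = 1.5            # start in the basin of the shallow minimum
for _ in range(1000):
    x -= 0.01 * grad(x)

# x converges near the local minimum around x = 1.13,
# even though the global minimum is near x = -1.30.
```

Starting in a different basin (e.g. x = -1.5) would reach the global minimum instead, which is exactly why pure back-prop is sensitive to initialization.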

 

Recent progress

1. Convert the deterministic optimization into a probabilistic model, still a DAG

2. Visible (leaf nodes) –> hidden states (hidden nodes) with fixed weights: inference

3. With known visible and hidden states –> adjust the weights: learning

4. The biggest problem with doing inference on a DAG is the “explaining away” effect. It is very hard to find what the DAG believes, and inference is very slow.

5. The solution is to change the DAG into a symmetric (bidirectional) graph. Explaining away disappears, and the belief can be found quickly.

6. The cost is a more complicated prior, but that can be handled. Another cost is that the maximization no longer corresponds to MLE, but the result is good enough.

7. Can this also solve the local max/min problem? (Is that due to explaining away?)
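Explaining away (point 4 above) can be shown with a tiny numeric example. A sketch with hypothetical numbers: two independent binary causes A and B share one observed effect E (noisy-OR). The causes are independent a priori, but conditioning on E = 1 couples them, which is what makes exact inference in a DAG hard.

```python
# Hypothetical toy Bayes net: causes A, B (priors 0.1), effect E via noisy-OR.
pA, pB = 0.1, 0.1

def pE(a, b):
    # P(E=1 | A=a, B=b): each active cause independently triggers E w.p. 0.9
    return 1 - (1 - 0.9*a) * (1 - 0.9*b)

def posterior_A(b_evidence=None):
    """P(A=1 | E=1 [, B=b_evidence]) by brute-force enumeration."""
    num = den = 0.0
    for a in (0, 1):
        for b in (0, 1):
            if b_evidence is not None and b != b_evidence:
                continue
            p = (pA if a else 1 - pA) * (pB if b else 1 - pB) * pE(a, b)
            den += p
            if a:
                num += p
    return num / den

# Observing the other cause B=1 "explains away" A and lowers our belief in it:
# posterior_A() is about 0.53, posterior_A(b_evidence=1) drops to about 0.11.
```

In a symmetric bipartite model this coupling between hidden causes given the visibles simply does not exist, which is point 5's speed-up.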

 

8. The bipartite graph that belief nets eventually introduce looks similar to decoding a turbo code (or LDPC). Are both classification problems?
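The bipartite, symmetric structure above is exactly a restricted Boltzmann machine. A minimal sketch (random weights and layer sizes are my own assumptions): given the visibles, every hidden unit is conditionally independent, so inference collapses to a single vectorized sigmoid with no explaining-away coupling to resolve.

```python
# Hypothetical RBM-style inference: shapes and weights are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 6, 4
W = rng.normal(0, 0.1, size=(n_visible, n_hidden))  # symmetric weights
b_h = np.zeros(n_hidden)                            # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def infer_hidden(v):
    """P(h_j = 1 | v) for every hidden unit in one shot."""
    return sigmoid(v @ W + b_h)

v = rng.integers(0, 2, size=n_visible).astype(float)
p_h = infer_hidden(v)   # each entry is an independent Bernoulli probability
```

Learning then adjusts W from correlation statistics between v and h (e.g. contrastive divergence), rather than from an exact MLE gradient, which matches point 6's "non-MLE but good enough" trade-off.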

 

Deep learning to decode turbo code or LDPC?

Or is this an autoencoder?

Could an HMM or an LDPC code be used as an example?

 
