Deep Neural Nets (Deterministic) vs. Deep Probabilistic Models (Stochastic)

by allenlu2007

The following information is from Marc'Aurelio Ranzato's Deep Learning presentation at the IPAM summer school 2012 (IPAM 2012).

As discussed previously: a Neural Net is deterministic optimization. It is more intuitive and computationally straightforward (but may get stuck in local minima/maxima).

A Deep Probabilistic Model, or Deep Belief Net, is a probabilistic model based on a DAG. It is more general but not very intuitive. Moreover, because of explaining away, convergence can be very slow; some treatment is needed to speed it up (replacing the DAG with a symmetric bi-directional graph, following Hinton). See Hinton's talk.

Hinton explains that learning a DAG through a hierarchy is essentially equivalent to a bi-directional Boltzmann Machine.
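A minimal sketch of that symmetric bi-directional structure — a Restricted Boltzmann Machine, where one shared weight matrix `W` drives both the up-pass (inference) and the down-pass (generation). The sizes and random initialization here are illustrative assumptions, not from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy RBM: 6 visible units, 4 hidden units, ONE shared weight matrix W.
n_vis, n_hid = 6, 4
W = rng.normal(0.0, 0.1, size=(n_vis, n_hid))
b_vis = np.zeros(n_vis)
b_hid = np.zeros(n_hid)

# Start from a random binary visible vector.
v = rng.integers(0, 2, size=n_vis).astype(float)

# Up-pass and down-pass use the SAME W (transposed) -- this symmetry is
# what removes the explaining-away problem of a directed (DAG) model.
p_h = sigmoid(v @ W + b_hid)                  # P(h = 1 | v)
h = (rng.random(n_hid) < p_h).astype(float)   # sample hidden states
p_v = sigmoid(h @ W.T + b_vis)                # P(v = 1 | h)
```

One full up/down cycle like this is one step of Gibbs sampling, the building block of contrastive-divergence training.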

In his talk, Ranzato compares Deep Neural Nets vs. Deep Belief Nets (probabilistic models) from his own perspective as follows:

Deep Neural Net: suited to training as one integrated net, because it is computationally efficient and supports end-to-end learning! Google uses it because of (i) computational efficiency; (ii) scalability.

Deep Belief Net: must be broken down into modules. Although not as fast computationally, the modules can be trained separately and combined at the end. If the problem itself involves uncertainty (weather? finance?), it may be a better fit — it can automatically produce error bars (confidence intervals)?


 

Ranzato's talk shows that SGD (a 1st-order method) with Adagrad can beat L-BFGS (a 2nd-order, Newton-like method)!
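A sketch of the Adagrad idea — plain gradient descent with a per-coordinate step size that shrinks as squared gradients accumulate, which automatically tames badly scaled coordinates. The quadratic test function and hyperparameters below are illustrative assumptions, not Ranzato's setup:

```python
import numpy as np

def adagrad_sgd(grad_fn, x0, lr=1.0, steps=500, eps=1e-8):
    """Gradient descent with Adagrad per-coordinate step-size scaling."""
    x = np.asarray(x0, dtype=float)
    g2_sum = np.zeros_like(x)  # running sum of squared gradients, per coordinate
    for _ in range(steps):
        g = grad_fn(x)
        g2_sum += g * g
        # Coordinates with large historical gradients get small steps,
        # and vice versa -- a cheap, diagonal stand-in for 2nd-order scaling.
        x -= lr * g / (np.sqrt(g2_sum) + eps)
    return x

# Badly scaled quadratic f(x) = 0.5 * (100*x0^2 + x1^2); minimum at the origin.
grad = lambda x: np.array([100.0 * x[0], x[1]])
x_min = adagrad_sgd(grad, x0=[1.0, 1.0])
```

The first update already normalizes away the 100:1 curvature mismatch between the two coordinates — the kind of conditioning problem where plain SGD needs a tiny learning rate but a 2nd-order method like L-BFGS shines.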

 

Google uses Deep Neural Nets because of computational efficiency.
