Ordinary Linear Regression vs. Bayesian Linear Learning

by allenlu2007

 

Discriminative vs. generative learning

The distribution p(y|x) is the natural distribution for classifying a given example x into a class y, which is why algorithms that model it directly are called discriminative algorithms. Generative algorithms model the joint distribution p(x,y), which can be transformed into p(y|x) by applying Bayes' rule and then used for classification. However, the joint distribution p(x,y) can also be used for other purposes: for example, you could use p(x,y) to generate likely (x,y) pairs.
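As a sketch of the generative approach, the toy example below (synthetic data, all names hypothetical) fits Gaussian class-conditionals p(x|y) plus a class prior p(y), turns them into p(y|x) via Bayes' rule, and also samples new (x,y) pairs — something a purely discriminative model of p(y|x) cannot do:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D data: two classes with different means (synthetic, for illustration).
x0 = rng.normal(-1.0, 1.0, size=100)   # class 0 samples
x1 = rng.normal(+2.0, 1.0, size=100)   # class 1 samples

# Generative modelling: estimate p(x|y) as Gaussians and p(y) from counts.
mu = [x0.mean(), x1.mean()]
sigma = [x0.std(), x1.std()]
prior = [0.5, 0.5]

def gaussian_pdf(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

def posterior(x):
    # Bayes' rule: p(y|x) is proportional to p(x|y) * p(y)
    joint = np.array([gaussian_pdf(x, mu[k], sigma[k]) * prior[k] for k in (0, 1)])
    return joint / joint.sum()

print(posterior(-1.0))   # most mass on class 0
print(posterior(2.0))    # most mass on class 1

# Because we model p(x,y), we can also *generate* likely (x,y) pairs:
y_new = rng.integers(0, 2, size=5)
x_new = rng.normal(np.take(mu, y_new), np.take(sigma, y_new))
```

A discriminative classifier would learn only the `posterior` function and never bother with `mu`, `sigma`, or the sampling step.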

From the description above you might think that generative models are more generally useful and therefore better, but it is not that simple. This paper is a very popular reference on the subject of discriminative vs. generative classifiers, though it is pretty heavy going. The overall gist is that discriminative models generally outperform generative models in classification tasks.

 

The following is from Bishop's video lecture at MLSS (2009?).

Regression is discriminative learning (?)

Pro: simple; it can ignore information that is irrelevant to the prediction

Con: some information is lost, since the input distribution p(x) is never modeled
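A minimal sketch of the discriminative case, assuming a synthetic dataset y = 2x + 1 + noise: ordinary least squares fits only the conditional mean E[y|x] and ignores how x itself is distributed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: y = 2x + 1 + noise (assumed example).
x = rng.uniform(-3, 3, size=50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.3, size=50)

# Discriminative fit: model only the conditional mean E[y|x] = w1*x + w0
# via ordinary least squares; the distribution of x is never modeled.
X = np.column_stack([x, np.ones_like(x)])
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w)   # close to [2.0, 1.0]
```

This gives a single point estimate of the weights — simple and effective, but it carries no measure of its own uncertainty.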

 

Bayesian Learning (generative ?)

Find the likelihood function

Find the conjugate prior

Find the posterior distribution

 

Pro: the full posterior distribution carries more information than a point estimate

Con: more complicated computation

 

With the posterior in hand, you can do more, for example:

apply a different loss function at decision time

reject ambiguous inputs (the reject option)

correct for unbalanced classes by adjusting the prior
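Two of these uses can be sketched directly on posterior probabilities (function names and thresholds here are hypothetical, chosen for illustration): abstaining when the posterior is not confident enough, and reweighting p(y|x) when deployment class frequencies differ from the training prior.

```python
import numpy as np

def decide(posterior, reject_threshold=0.8):
    """Reject option: abstain when the posterior is not confident enough."""
    p = np.asarray(posterior)
    return int(p.argmax()) if p.max() >= reject_threshold else "reject"

def reweight_prior(posterior, train_prior, deploy_prior):
    """Prior correction for unbalanced classes: divide out the training
    prior, multiply in the deployment prior, then renormalise."""
    p = np.asarray(posterior) / np.asarray(train_prior) * np.asarray(deploy_prior)
    return p / p.sum()

print(decide([0.95, 0.05]))   # confident -> class 0
print(decide([0.6, 0.4]))     # ambiguous -> "reject"
print(reweight_prior([0.5, 0.5], [0.5, 0.5], [0.9, 0.1]))
```

Neither adjustment is possible if the model outputs only a hard class label, which is exactly the extra flexibility the posterior buys.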

 

 
