Maximum Likelihood Estimator Examples

by allenlu2007

The previous post discussed the pros and cons of the ML estimator. This post walks through some examples.


To judge whether an ML estimator is efficient (i.e., attains the Cramér–Rao lower bound, CRLB), we can use the following theorem: an efficient estimator exists if and only if the score function can be factored as

\frac{\partial \ln p(\mathbf{x};\theta)}{\partial \theta} = I(\theta)\,\left(g(\mathbf{x}) - \theta\right)

where I(\theta) is the Fisher information. In that case g(\mathbf{x}) is the efficient estimator, with variance 1/I(\theta).

 

Example 1: Coin flipping (binomial distribution).

Suppose we flip a coin N = 10 times and get n = 7 heads. What is the ML estimate of the probability of heads?

The likelihood and log-likelihood are

p(n; p) = \binom{N}{n} p^n (1-p)^{N-n}

\ln p(n; p) = \ln\binom{N}{n} + n \ln p + (N-n)\ln(1-p)

Setting the score to zero gives

\frac{\partial \ln p(n;p)}{\partial p} = \frac{n}{p} - \frac{N-n}{1-p} = 0 \;\Rightarrow\; \hat{p} = \frac{n}{N} = 0.7

E(\hat{p}) = \frac{E(n)}{N} = \frac{Np}{N} = p

The ML estimator is unbiased!

Is it efficient? 

\frac{\partial \ln p(n;p)}{\partial p} = \frac{n}{p}-\frac{N-n}{1-p} = \frac{n-Np}{p(1-p)} = \frac{N}{p(1-p)}\left(\frac{n}{N} - p\right)

I(p) = \frac{N}{p(1-p)}, \quad g(n) = \frac{n}{N} \;\Rightarrow\; the estimator is efficient.

We can also verify the variance directly:

\mathrm{Var}(\hat{p}) = \frac{E(n^2) - E(n)^2}{N^2} = \frac{p(1-p)}{N} = \frac{1}{I(p)}

where E(n^2) = N^2 p^2 + Np(1-p).

The ML estimator is efficient!

Alternatively, apply the factorization theorem above to check!!
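As a quick numerical sanity check (a minimal sketch; the grid search and its resolution are just for illustration), the binomial log-likelihood for N = 10, n = 7 indeed peaks at p = n/N:

```python
import numpy as np

N, n = 10, 7  # total flips and observed heads

# Binomial log-likelihood over a grid of p (the binomial
# coefficient is constant in p, so it can be dropped)
p = np.linspace(0.01, 0.99, 9999)
log_lik = n * np.log(p) + (N - n) * np.log(1 - p)

p_hat = p[np.argmax(log_lik)]
print(p_hat)  # ≈ 0.7, matching the closed-form answer n/N
```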

 

Example 1b: Repeat the binomial experiment K times. Pooling all KN flips, the ML estimator n/(KN) is still unbiased and efficient.
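A minimal Monte Carlo sketch of this claim (the values p = 0.3, N = 20, K = 5 are assumptions for illustration): the pooled estimator stays unbiased and its variance attains the CRLB p(1-p)/(KN).

```python
import numpy as np

rng = np.random.default_rng(0)
p, N, K, M = 0.3, 20, 5, 200000  # true p, flips/trial, trials, Monte Carlo runs

# Total heads over K binomial(N, p) trials is binomial(K*N, p)
heads = rng.binomial(K * N, p, size=M)
p_hat = heads / (K * N)

print(p_hat.mean())         # ~ 0.3, unbiased
print(p_hat.var() * K * N)  # ~ p(1-p) = 0.21, i.e. Var = CRLB = p(1-p)/(KN)
```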

 

Example 2: Poisson random variable arrival rate

Suppose we observe n arrivals in an interval of length T from a Poisson process with rate a:

p(n; a) = \frac{(aT)^n e^{-aT}}{n!}

\frac{\partial \ln p(n;a)}{\partial a} = \frac{n}{a} - T = \frac{T}{a}\left(\frac{n}{T} - a\right)

So I(a) = \frac{T}{a} and g(n) = \frac{n}{T}: the ML estimator \hat{a} = n/T is unbiased and efficient.
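A small simulation sketch (the rate a = 2.5 and window T = 100 are assumed values) confirming that â = n/T is unbiased with variance at the CRLB 1/I(a) = a/T:

```python
import numpy as np

rng = np.random.default_rng(0)
a, T, M = 2.5, 100.0, 200000  # true rate, observation window, Monte Carlo runs

# The number of arrivals in [0, T] is Poisson with mean a*T
n = rng.poisson(a * T, size=M)
a_hat = n / T

print(a_hat.mean())     # ~ 2.5, unbiased
print(a_hat.var() * T)  # ~ a = 2.5, i.e. Var = CRLB = a/T
```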

 

Example 2b: Multivariate Poisson random variable arrival rate

 


There is no closed-form solution to the likelihood equations in this case. We can use an iterative algorithm to get the answer; it turns out to be the EM (expectation-maximization) algorithm, known as the Richardson-Lucy algorithm in the optics literature.
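A sketch of that iteration, under the standard Richardson-Lucy model (an assumption here: each observed count is Poisson with mean (Ha)_i for a known mixing matrix H; the matrix and rates below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mixing matrix H and true rates a_true (illustrative only)
H = np.array([[0.8, 0.2],
              [0.3, 0.7],
              [0.5, 0.5]])
a_true = np.array([4.0, 9.0])

# Average many Poisson observations so the iteration can recover a_true
y = rng.poisson(H @ a_true, size=(100000, 3)).mean(axis=0)

# Richardson-Lucy / EM multiplicative update:
# a_j <- a_j * [sum_i H_ij * y_i / (Ha)_i] / [sum_i H_ij]
a = np.ones(2)  # initial guess
for _ in range(2000):
    a *= (H.T @ (y / (H @ a))) / H.sum(axis=0)

print(a)  # ~ [4., 9.], the ML rates
```

Each update multiplies the current estimate by a data-to-prediction ratio back-projected through H, so positivity is preserved and the Poisson likelihood increases at every step.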


 

 

 

Example 3: Independent samples of a normal distribution

Consider N i.i.d. samples from a normal distribution:

p(\mathbf{x}; \mu, \sigma^2) = \prod_{j=1}^{N} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x_j - \mu)^2}{2\sigma^2}\right)

 

Example 3a:  Mean is unknown, variance is known

\frac{\partial \ln p}{\partial \mu} = \frac{1}{\sigma^2}\left(\sum_j x_j - N\mu\right) = \frac{N}{\sigma^2}\left(\frac{1}{N}\sum_j x_j - \mu\right)

I(\mu) = \frac{N}{\sigma^2}, \qquad g(\mathbf{x}) = \frac{1}{N}\sum_j x_j

It is unbiased and efficient!!
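A quick Monte Carlo sketch (assumed values μ = 1.5, σ = 2, N = 10) showing the sample mean is unbiased with variance at the CRLB σ²/N:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, N, M = 1.5, 2.0, 10, 200000  # true mean, known sigma, samples, runs

x = rng.normal(mu, sigma, size=(M, N))
mu_hat = x.mean(axis=1)  # the ML estimator: the sample mean

print(mu_hat.mean())     # ~ 1.5, unbiased
print(mu_hat.var() * N)  # ~ sigma^2 = 4, i.e. Var = CRLB = sigma^2/N
```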

 

Example 3b: Mean and variance are unknowns

\hat{\mu} = \frac{1}{N}\sum_j x_j, \qquad \hat{\sigma}^2 = \frac{1}{N}\sum_j (x_j - \hat{\mu})^2

The mean estimator is unbiased, but the variance estimator is biased: E(\hat{\sigma}^2) = \frac{N-1}{N}\sigma^2. Therefore the ML estimator is biased!! But it is asymptotically unbiased!
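The bias is easy to see numerically (a sketch with assumed values σ² = 4, N = 5; dividing by N instead of N − 1 shrinks the estimate by the factor (N − 1)/N):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2, N, M = 4.0, 5, 200000  # true variance, samples per run, Monte Carlo runs

x = rng.normal(0.0, np.sqrt(sigma2), size=(M, N))
var_ml = x.var(axis=1, ddof=0)  # ML estimator: divides by N

print(var_ml.mean())                # ~ (N-1)/N * sigma2 = 3.2, biased low
print(var_ml.mean() * N / (N - 1))  # ~ 4.0 after the bias correction
```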

 

Example 3C: multivariate normal distribution, mean is an arbitrary function of theta

 

For \mathbf{x} = \mathbf{s}(\theta) + \mathbf{w} with \mathbf{w} \sim \mathcal{N}(0, \mathbf{C}), maximizing the likelihood is equivalent to

\hat{\theta} = \arg\min_{\theta}\; (\mathbf{x} - \mathbf{s}(\theta))^T \mathbf{C}^{-1} (\mathbf{x} - \mathbf{s}(\theta))

In this case, the ML estimator becomes a weighted least-squares problem.

 

Example 3D: multivariate normal distribution, mean is a linear function of theta

When the mean is linear in theta, \mathbf{x} = \mathbf{H}\theta + \mathbf{w} with \mathbf{w} \sim \mathcal{N}(0, \mathbf{C}), the weighted least-squares problem has a closed-form solution:

\hat{\theta} = (\mathbf{H}^T \mathbf{C}^{-1} \mathbf{H})^{-1} \mathbf{H}^T \mathbf{C}^{-1} \mathbf{x}

A further simplified version assumes the samples (plus noise) are i.i.d., i.e. \mathbf{C} = \sigma^2 \mathbf{I}, which reduces to ordinary least squares:

\hat{\theta} = (\mathbf{H}^T \mathbf{H})^{-1} \mathbf{H}^T \mathbf{x}
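A sketch of the i.i.d.-noise case (the matrix H, the true theta, and the noise level are hypothetical), checking that the ML estimate coincides with ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear model x = H @ theta + w with i.i.d. Gaussian noise
H = rng.normal(size=(500, 3))
theta_true = np.array([1.0, -2.0, 0.5])
x = H @ theta_true + 0.1 * rng.normal(size=500)

# ML estimate when C = sigma^2 I: ordinary least squares
theta_hat = np.linalg.solve(H.T @ H, H.T @ x)

print(theta_hat)  # ~ [1.0, -2.0, 0.5]
```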
