### Maximum Likelihood Estimator Examples

#### by allenlu2007

The previous post discussed the pros and cons of the ML estimator. This post focuses on a number of worked examples.

To judge whether an ML estimator is efficient (i.e., attains the Cramér-Rao lower bound, CRLB), we can use the following theorem.
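The theorem in question appears to be the standard CRLB attainment (factorization) condition; restated here for reference:

```latex
% CRLB attainment condition: an unbiased estimator g(x) of \theta is
% efficient (attains the CRLB) if and only if the score factors as
\frac{\partial \ln p(x;\theta)}{\partial \theta}
  = I(\theta)\,\bigl(g(x) - \theta\bigr),
% where I(\theta) is the Fisher information.
```

Each example below checks efficiency by putting the score into exactly this form and reading off I and g.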

## Example 1: Coin flipping (binomial distribution)

### If we flip a coin 10 times and get 7 heads, what is the ML estimate of the probability of heads?

The ML estimator is p_hat = n/N = 7/10, and it is unbiased!
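A quick Monte Carlo sanity check (my sketch; the true p = 0.7 and N = 10 are assumed to match the 7-heads-in-10-flips example) that p_hat = n/N is unbiased:

```python
import random

# Simulate many N-flip experiments with true head probability p_true and
# average the ML estimates p_hat = n/N; the average should be close to p_true.
random.seed(0)
p_true, N, trials = 0.7, 10, 200_000

mean_p_hat = sum(
    sum(random.random() < p_true for _ in range(N)) / N
    for _ in range(trials)
) / trials

print(round(mean_p_hat, 2))  # ≈ p_true = 0.7
```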

Is it efficient?

The score factors as

score = dlnL/dp = (n - N·p) / (p(1-p)) = [N / (p(1-p))] · (n/N - p)

so the Fisher information is I(p) = N / (p(1-p)) and g(n) = n/N. The score has exactly the form I(p)·(g(n) - p) required by the theorem, so the ML estimator is efficient!
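To double-check numerically, a sketch comparing the sampling variance of p_hat against the CRLB p(1-p)/N (the values p = 0.7 and N = 10 are assumed for illustration):

```python
import random

# Estimate Var(p_hat) by simulation and compare against the CRLB
# 1/I(p) = p(1-p)/N; for an efficient estimator the two should agree.
random.seed(1)
p, N, trials = 0.7, 10, 200_000

samples = [
    sum(random.random() < p for _ in range(N)) / N
    for _ in range(trials)
]
mean = sum(samples) / trials
var_hat = sum((s - mean) ** 2 for s in samples) / trials
crlb = p * (1 - p) / N

print(round(var_hat, 3), round(crlb, 3))  # both ≈ 0.021
```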

## Example 1b: Repeating the binomial experiment K times: the ML estimator is still unbiased and efficient
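A sketch of the claim above (K, N, p chosen for illustration): pooling all K repetitions of the N-flip experiment gives p_hat = total heads / (K·N), the Fisher information grows to K·N/(p(1-p)), and the estimator stays unbiased:

```python
import random

# Pool K repetitions of an N-flip experiment into one estimate per trial,
# then average many such estimates; the average should still be close to p.
random.seed(2)
p, N, K, trials = 0.7, 10, 5, 100_000

est = [
    sum(random.random() < p for _ in range(K * N)) / (K * N)
    for _ in range(trials)
]
mean = sum(est) / trials
print(round(mean, 2))  # ≈ p = 0.7
```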

## Example 2: Poisson random variable arrival rate

score = (T/a) · (n/T - a), so I(a) = T/a and g(n) = n/T; the ML estimator a_hat = n/T is efficient.
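A simulation sketch (the values of a and T are assumed): n arrivals of a rate-a Poisson process observed over time T give a_hat = n/T, whose variance should match the CRLB 1/I(a) = a/T:

```python
import math
import random

# Draw Poisson counts with mean a*T, form a_hat = n/T, and check that the
# empirical mean and variance match a and the CRLB a/T respectively.
random.seed(3)
a, T, trials = 2.0, 10.0, 50_000

def poisson_draw(lam):
    # Knuth's multiplicative method for a Poisson sample
    limit, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= limit:
            return k
        k += 1

est = [poisson_draw(a * T) / T for _ in range(trials)]
mean = sum(est) / trials
var = sum((e - mean) ** 2 for e in est) / trials
print(round(mean, 1), round(var, 2))  # mean ≈ a = 2.0, var ≈ a/T = 0.2
```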

## Example 2b: Multivariate Poisson random variable arrival rate

It seems there is no closed form for the resulting likelihood equations, so we can use an iterative algorithm to get the answer. This turns out to be the EM (expectation-maximization) algorithm, also known as the Richardson-Lucy algorithm in the optics literature.
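A minimal 1-D Richardson-Lucy sketch (my illustration with an assumed blur kernel, not the author's code): each update multiplies the current estimate by the back-projected ratio of data to model, which is the EM iteration for this Poisson likelihood:

```python
import numpy as np

# 1-D Richardson-Lucy iteration for d = conv(x, h) with Poisson statistics.
# Kernel h and true signal x_true are assumed; the data here is noiseless,
# so the iteration should converge toward x_true.
h = np.array([0.25, 0.5, 0.25])            # known, symmetric blur kernel
x_true = np.array([0.0, 5.0, 20.0, 5.0, 0.0, 10.0, 0.0])
d = np.convolve(x_true, h, mode="same")    # simulated blurred data

x = np.ones_like(x_true)                   # flat, positive initial guess
for _ in range(500):
    model = np.convolve(x, h, mode="same")
    ratio = d / np.maximum(model, 1e-12)   # data / model
    x *= np.convolve(ratio, h[::-1], mode="same")  # back-project, update

print(np.round(x, 1))
```

Note the multiplicative update automatically keeps the estimate nonnegative, which is why this iteration is popular for images and photon counts.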

## Example 3: Independent samples of a normal distribution

## Example 3a: Mean is unknown, variance is known

score = (1/sig^2) · (sum_j x_j - N·u) = (N/sig^2) · (sum_j x_j / N - u)

so I = N/sig^2 and g = sum_j x_j / N (the sample mean).

It is unbiased and efficient!!
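A simulation sketch for this case (the values of u, sig, and N are assumed): the sample mean of N i.i.d. N(u, sig^2) draws should be unbiased with variance equal to the CRLB sig^2/N:

```python
import random

# Average many sample means of N Gaussian draws; the empirical mean should
# match u and the empirical variance should match the CRLB sig^2/N.
random.seed(4)
u, sig, N, trials = 3.0, 2.0, 25, 50_000

est = [
    sum(random.gauss(u, sig) for _ in range(N)) / N
    for _ in range(trials)
]
mean = sum(est) / trials
var = sum((e - mean) ** 2 for e in est) / trials
print(round(mean, 1), round(var, 2))  # mean ≈ u = 3.0, var ≈ sig^2/N = 0.16
```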

## Example 3b: Mean and variance are unknowns

The mean estimate is unbiased, but the ML variance estimate (1/N) · sum_j (x_j - x_bar)^2 is biased!! Therefore, the ML estimator as a whole is biased!! But it is asymptotically unbiased!
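A sketch of the bias (u, sig, and N assumed): the ML variance estimator divides by N, so its expectation is (N-1)/N · sig^2 rather than sig^2, and the bias vanishes as N grows:

```python
import random

# Average the ML variance estimate over many N-sample experiments; the
# average should be close to (N-1)/N * sig^2, not sig^2 itself.
random.seed(5)
u, sig, N, trials = 0.0, 1.0, 5, 200_000

def ml_var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

mean_ml_var = sum(
    ml_var([random.gauss(u, sig) for _ in range(N)]) for _ in range(trials)
) / trials
print(round(mean_ml_var, 2))  # ≈ (N-1)/N * sig^2 = 0.8
```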

## Example 3c: Multivariate normal distribution, mean is an arbitrary function of theta

In this case, the ML estimator becomes a weighted least-squares estimator.
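A restatement of why this happens (my notation: s(theta) for the mean function, C for the known noise covariance):

```latex
% For x \sim \mathcal{N}(s(\theta), C) with known covariance C, the
% log-likelihood is, up to constants,
% -\tfrac{1}{2}\,(x - s(\theta))^{T} C^{-1} (x - s(\theta)),
% so maximizing it is the weighted least-squares problem:
\hat{\theta}_{\mathrm{ML}} = \arg\min_{\theta}\,
  \bigl(x - s(\theta)\bigr)^{T} C^{-1} \bigl(x - s(\theta)\bigr)
```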

## Example 3d: Multivariate normal distribution, mean is a linear function of theta

An even simpler version assumes the samples (signal + noise) are i.i.d.
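For the i.i.d. case a short sketch (H and theta_true below are assumed for illustration): with x = H·theta + w and i.i.d. Gaussian noise w, the weighted least-squares cost reduces to ordinary least squares, giving theta_hat = (H^T H)^{-1} H^T x:

```python
import numpy as np

# Generate a linear-Gaussian model x = H @ theta + w with i.i.d. noise and
# recover theta with ordinary least squares, the ML solution in this case.
rng = np.random.default_rng(6)
H = rng.normal(size=(100, 2))
theta_true = np.array([1.5, -0.5])
x = H @ theta_true + 0.1 * rng.normal(size=100)

theta_hat, *_ = np.linalg.lstsq(H, x, rcond=None)
print(np.round(theta_hat, 1))  # ≈ [ 1.5 -0.5]
```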