Statistics 102H: MLEs (Statistical Inference by S. D. Silvey)
Administrivia
Data analysis due Monday.
If you have any questions about your data analysis, include
them with what you turn in and I'll email you an answer.
I wrote up details of what the rest of the project should entail.
Example 1: binomial (cont'd)
Last time we did the binomial by maximizing f. It was ugly. Let's redo it by maximizing log f.
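A minimal sketch of the point, in Python (the numbers x = 7, n = 10 are illustrative, not from the course): the binomial log-likelihood is x log p + (n - x) log(1 - p) up to a constant, and setting its derivative to zero gives the familiar MLE p = x/n.

```python
import math

# Binomial log-likelihood l(p) = x*log(p) + (n-x)*log(1-p),
# dropping the constant log of the binomial coefficient.
def log_lik(p, x, n):
    return x * math.log(p) + (n - x) * math.log(1 - p)

# Setting dl/dp = x/p - (n-x)/(1-p) = 0 gives the MLE p_hat = x/n.
x, n = 7, 10
p_hat = x / n

# Numerical sanity check: scan a grid and confirm the maximizer is x/n.
grid = [i / 100 for i in range(1, 100)]
best = max(grid, key=lambda p: log_lik(p, x, n))
print(p_hat, best)  # both 0.7
```

Maximizing log f instead of f turns the ugly product into a sum, and the calculus becomes two easy terms.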
Example 2: normal theory
The full likelihood is the product of the individual likelihoods. (We only
estimate mu, not sigma.)
God it is ugly!
Take logs. Ah, a bit prettier
Ignore constants. That's more like it!
Take derivative
Complete the square, too: this is where least squares comes
from.
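A short sketch in Python (the data values are made up for illustration): with sigma known, the normal log-likelihood in mu is a constant minus sum((x_i - mu)^2) / (2 sigma^2), so maximizing it is exactly least squares, and completing the square shows the minimizer is the sample mean.

```python
# Illustrative data (not from the course).
data = [4.1, 5.0, 3.8, 4.6, 4.9]

# Completing the square in mu shows the residual sum of squares
# is minimized at the sample mean, so the MLE of mu is x-bar.
mu_hat = sum(data) / len(data)

def rss(mu, xs):
    """Residual sum of squares: what the normal log-likelihood penalizes."""
    return sum((x - mu) ** 2 for x in xs)

# mu_hat beats any nearby value.
print(mu_hat, rss(mu_hat, data), rss(mu_hat + 0.1, data))
```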
Calculating the MLE
The book uses D for the multivariate derivative; we will stick with single
derivatives in class.
Two-term Taylor series: log likelihood
l(theta) ~ l(theta0) + l'(theta0)(theta - theta0) + (1/2) l''(theta0)(theta - theta0)^2
DEEP THEOREM (Le Cam): This is a VERY good approximation, almost exact
up to several standard errors. (This is what I studied in my
thesis. I didn't prove it; I just tried to understand it.)
So the maximum is at -b/(2a). Or take derivatives. Or use Newton's
method.
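A sketch of Newton's method on the binomial log-likelihood (x = 7, n = 10, illustrative numbers): each step maximizes the two-term Taylor quadratic, which is exactly the -b/(2a) step written as p_new = p - l'(p)/l''(p).

```python
x, n = 7, 10

def score(p):
    """First derivative of the binomial log-likelihood, l'(p)."""
    return x / p - (n - x) / (1 - p)

def second_deriv(p):
    """Second derivative l''(p); negative near the maximum."""
    return -x / p**2 - (n - x) / (1 - p)**2

# Newton's method: repeatedly maximize the local quadratic approximation.
p = 0.5  # starting guess
for _ in range(10):
    p = p - score(p) / second_deriv(p)
print(p)  # converges to x/n = 0.7
```

Here Newton lands on the MLE very quickly; for problems without a closed-form answer (like the Cauchy below), the same iteration is the standard way to find the maximum.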
First useful example
Suppose there are outliers in the data
You can handle them using ideas from the first part of the course.
Now let's do it carefully.
Need a distribution that generates outliers: Cauchy
Compute the score
Notice: large values are down-weighted, just like when you
remove them by hand!
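The down-weighting can be seen directly in a small Python sketch. For the Cauchy location model, the per-observation score is 2r / (1 + r^2) with residual r = x - theta; a normal model's score is r itself, so one outlier pulls the normal MLE hard, while the Cauchy score shrinks toward 0 for large |r|.

```python
# Per-observation score for the Cauchy location model:
#   d/dtheta log f(x; theta) = 2*r / (1 + r^2),  where r = x - theta.
def cauchy_score(r):
    return 2 * r / (1 + r ** 2)

# The normal score is just r, so an outlier's influence grows without bound.
# The Cauchy score peaks at r = 1 and then shrinks toward 0.
for r in [0.5, 1.0, 5.0, 50.0]:
    print(r, cauchy_score(r))
```

So a 50-standard-unit outlier contributes almost nothing to the Cauchy score equation, which is the formal version of "remove the outliers."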