Reading: Maximum Likelihood Estimation of the CLM

Here is a brief (3-page) demonstration of how Maximum Likelihood produces the same estimates as the standard OLS regression estimator that we have used to this point. We wouldn't expect you to be able to prove or derive this yourself, but we do think that at this point in the semester, you're probably able to read along and follow the argument.

Link to reading.

As you're reading, note:

  1. How challenging are the log and the derivative that this approach requires? Could you take them yourself?
  2. Are you surprised to notice that we arrive at the same estimator from the MLE perspective as from the OLS perspective?
  3. The standard errors of the MLE estimates and the OLS estimates are slightly different. Why do you think this is? Where do you see an appeal to the idea of convergence in probability? What is the consequence of this convergence in probability?
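If it helps to see the equivalence numerically, here is a minimal sketch (not part of the reading) that simulates data, computes the OLS estimate from the normal equations, and separately maximizes the Gaussian log-likelihood with a generic optimizer. The simulated coefficients and sample size are arbitrary choices for illustration. The point estimates coincide, while the two variance estimates differ only by the factor n/(n − k), which shrinks to 1 as n grows, illustrating the appeal to convergence in probability.

```python
import numpy as np
from scipy.optimize import minimize

# Simulate a simple classical linear model: y = X beta + Gaussian noise.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one regressor
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# OLS: solve the normal equations (X'X) b = X'y.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Negative Gaussian log-likelihood in (beta, log sigma); parameterizing by
# log sigma keeps the variance positive during optimization.
def negloglik(theta):
    beta, log_sigma = theta[:-1], theta[-1]
    sigma2 = np.exp(2 * log_sigma)
    resid = y - X @ beta
    return 0.5 * n * np.log(2 * np.pi * sigma2) + resid @ resid / (2 * sigma2)

res = minimize(negloglik, x0=np.zeros(3))
beta_mle = res.x[:2]

# The point estimates agree (up to optimizer tolerance)...
print(np.max(np.abs(beta_mle - beta_ols)))

# ...but the variance estimates differ: MLE divides by n, OLS by n - k.
resid = y - X @ beta_ols
k = X.shape[1]
sigma2_mle = resid @ resid / n
sigma2_ols = resid @ resid / (n - k)
print(sigma2_mle, sigma2_ols)
```

Because the MLE and OLS variance estimates differ only by the deterministic factor n/(n − k), the standard errors built from them differ slightly in finite samples but converge to the same value as n grows.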