45-733 PROBABILITY AND STATISTICS I Notes #7A


February 2000



Maximum Likelihood Estimation

  1. In parametric estimation we assume that we know the type of distribution (e.g., Normal, Poisson, Bernoulli) from which our random sample is drawn, and on the basis of that random sample we must infer the values of the parameters of the distribution. For example, we take a random sample from a Poisson distribution and, on the basis of that sample, decide what the value of λ is. In addition, we use the random sample to make statements about how confident we are in our guesses about the values of the parameters.
  2. An estimator is a formula, or a rule, that we use to get values for the parameters. For example, we have an urn with a large number of balls in it. There are only two colors of balls -- green and red -- in the urn. We draw 10 balls with replacement and note their color. Clearly, the best guess about the proportion of green balls in the urn is the number of green balls drawn divided by 10. Note that if we code each green ball as 1 and each red ball as 0, this guess is just the sample mean.

    Technically, an estimator is a real-valued function of the sample. A short simulation of the urn example follows below.
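
    As a minimal illustration (a sketch only: the urn's true proportion of 0.6, the seed, and the use of numpy are assumptions for the example, not part of these notes), we can simulate the ten draws and compute the estimator in Python:

        import numpy as np

        # Simulate drawing 10 balls with replacement from an urn that is
        # 60% green (green = 1, red = 0; the 0.6 is illustrative).
        rng = np.random.default_rng(seed=0)
        draws = rng.binomial(n=1, p=0.6, size=10)

        # The estimator: number of green balls drawn divided by 10,
        # i.e. the sample mean of the 0/1 draws.
        p_hat = draws.mean()
        print(f"draws = {draws}, estimated proportion green = {p_hat:.2f}")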

  3. Maximum Likelihood Method of Obtaining Estimators
    We need a systematic way of obtaining estimators. The method of maximum likelihood is a powerful way of doing so, and it has strong intuitive appeal. It is not foolproof, but for almost all distributions of practical interest it provides plausible and useful estimators of the underlying parameters.
  4. The Maximum Likelihood Method has three steps. With respect to the Bernoulli distribution, these are (a numerical check follows the derivation):
    1. First: Form the Joint distribution of the sample (the Likelihood function of the sample)

      f(x_1, x_2, \ldots, x_n \mid p) = f_1(x_1) f_2(x_2) \cdots f_n(x_n) = \prod_{i=1}^{n} f_i(x_i)
        = p^{x_1}(1-p)^{1-x_1} \, p^{x_2}(1-p)^{1-x_2} \cdots p^{x_n}(1-p)^{1-x_n}
        = p^{\sum_{i=1}^{n} x_i} (1-p)^{n - \sum_{i=1}^{n} x_i}


    2. Second: Take the natural log of the Likelihood function

      L(x_1, x_2, \ldots, x_n \mid p) = \ln\{ f(x_1, x_2, \ldots, x_n \mid p) \}
        = \left( \sum_{i=1}^{n} x_i \right) \ln p + \left( n - \sum_{i=1}^{n} x_i \right) \ln(1-p)


    3. Third: Find the maximum by taking the first derivative of the log of the Likelihood function and setting it equal to zero

      \partial L / \partial p = \frac{\sum_{i=1}^{n} x_i}{p} - \frac{n - \sum_{i=1}^{n} x_i}{1-p} = 0

      Multiplying both sides by p(1-p):

      \sum_{i=1}^{n} x_i - p \sum_{i=1}^{n} x_i - np + p \sum_{i=1}^{n} x_i = \sum_{i=1}^{n} x_i - np = 0

      Hence: \hat{p} = \bar{X}_n
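      As a quick numerical check, here is a minimal Python sketch. The sample size of 200, the "true" p = 0.3, the seed, and the use of numpy and scipy are illustrative assumptions, not part of these notes; maximizing the log-likelihood numerically should land on the sample mean.

          import numpy as np
          from scipy.optimize import minimize_scalar

          rng = np.random.default_rng(seed=1)
          x = rng.binomial(n=1, p=0.3, size=200)   # Bernoulli sample; true p is illustrative

          # Negative log-likelihood from Step 2:
          # -[ (sum x_i) ln(p) + (n - sum x_i) ln(1 - p) ]
          def neg_log_lik(p):
              s, n = x.sum(), x.size
              return -(s * np.log(p) + (n - s) * np.log(1.0 - p))

          # Maximize over 0 < p < 1 by minimizing the negative log-likelihood.
          res = minimize_scalar(neg_log_lik, bounds=(1e-6, 1 - 1e-6), method="bounded")
          print(f"numerical MLE = {res.x:.4f}, sample mean = {x.mean():.4f}")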

  5. Example: Find the Maximum Likelihood Estimator for λ in the Poisson Distribution.

    f(x_1, x_2, \ldots, x_n \mid \lambda) = \frac{e^{-\lambda} \lambda^{x_1}}{x_1!} \cdot \frac{e^{-\lambda} \lambda^{x_2}}{x_2!} \cdots \frac{e^{-\lambda} \lambda^{x_n}}{x_n!} = \frac{e^{-n\lambda} \, \lambda^{\sum_{i=1}^{n} x_i}}{\prod_{i=1}^{n} x_i!}

    L(x_1, x_2, \ldots, x_n \mid \lambda) = -n\lambda + \left( \sum_{i=1}^{n} x_i \right) \ln \lambda - \ln\left( \prod_{i=1}^{n} x_i! \right)

    \partial L / \partial \lambda = -n + \frac{\sum_{i=1}^{n} x_i}{\lambda} = 0

    Hence: \hat{\lambda} = \bar{X}_n
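    The same numerical check works here. In this minimal Python sketch, the sample size of 200, the "true" λ = 4, and the seed are illustrative assumptions; the constant ln(∏ x_i!) is dropped since it does not affect the maximizer.

        import numpy as np
        from scipy.optimize import minimize_scalar

        rng = np.random.default_rng(seed=2)
        x = rng.poisson(lam=4.0, size=200)   # Poisson sample; true lambda is illustrative

        # Negative log-likelihood, dropping the constant -ln(prod x_i!):
        # -[ -n*lam + (sum x_i) ln(lam) ]
        def neg_log_lik(lam):
            return x.size * lam - x.sum() * np.log(lam)

        res = minimize_scalar(neg_log_lik, bounds=(1e-6, 50.0), method="bounded")
        print(f"numerical MLE = {res.x:.4f}, sample mean = {x.mean():.4f}")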

  6. Example: Find the Maximum Likelihood Estimators for μ and σ² in the Normal Distribution.

    f(x_1, x_2, \ldots, x_n \mid \mu, \sigma^2) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}\,\sigma} e^{-(x_i - \mu)^2 / 2\sigma^2} = \frac{1}{(2\pi)^{n/2} (\sigma^2)^{n/2}} \exp\left[ -\frac{1}{2\sigma^2} \sum_{i=1}^{n} (x_i - \mu)^2 \right]

    L(x_1, x_2, \ldots, x_n \mid \mu, \sigma^2) = -\frac{n}{2} \ln(2\pi) - \frac{n}{2} \ln(\sigma^2) - \frac{1}{2\sigma^2} \sum_{i=1}^{n} (x_i - \mu)^2

    \partial L / \partial \mu = \frac{1}{\sigma^2} \sum_{i=1}^{n} (x_i - \mu) = 0
    \partial L / \partial \sigma^2 = -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4} \sum_{i=1}^{n} (x_i - \mu)^2 = 0

    Hence, from \partial L / \partial \mu:  \hat{\mu} = \bar{X}_n
    and, from \partial L / \partial \sigma^2:  \hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} (x_i - \bar{X}_n)^2
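
    Here the first-order conditions give closed-form estimators, so a minimal Python sketch can check them directly. The sample size of 500, the "true" μ = 10 and σ = 2, and the seed are illustrative assumptions. Note that the MLE of σ² divides by n, not n - 1, so it matches numpy's default (biased) variance.

        import numpy as np

        rng = np.random.default_rng(seed=3)
        x = rng.normal(loc=10.0, scale=2.0, size=500)   # illustrative Normal sample

        # Closed-form MLEs from the first-order conditions above:
        mu_hat = x.mean()                          # sample mean
        sigma2_hat = ((x - mu_hat) ** 2).mean()    # divides by n, not n - 1

        print(f"mu_hat = {mu_hat:.4f}, sigma2_hat = {sigma2_hat:.4f}")
        print(f"np.var(x) (ddof=0) = {np.var(x):.4f}")   # matches sigma2_hat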