
FDSI-4 Numerical MLE

Intro

In some situations it is not possible to derive the MLE using an analytical approach, so numerical optimization methods are required.
When taking a numerical approach, it is still a good idea to work with the log likelihood, as this function is usually better behaved. (The product in the likelihood function will often lead to a likelihood with either very small or very large values.)
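To see the numerical issue concretely, compare the raw likelihood and the log likelihood for a (hypothetical) sample of 2000 standard normal observations: the product of the densities underflows to zero in floating point, while the sum of the log densities remains finite.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
x = rng.normal(size=2000)  # hypothetical sample for illustration

# Product of 2000 densities, each less than 0.4: underflows to 0.0
direct_likelihood = np.prod(norm.pdf(x))

# Sum of log densities: a perfectly ordinary finite number
log_likelihood = np.sum(norm.logpdf(x))
```

Any optimizer handed `direct_likelihood` would see a flat function that is identically zero, while the log likelihood remains informative.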

Methods and Examples

Python includes general-purpose optimization capability; here we will focus on minimize(), part of scipy.optimize, which minimizes a function specified by the user.
from scipy.optimize import minimize
For example, suppose $X \sim \text{Binomial}(n, p)$ and $p$ is unknown. We have seen that the sample proportion $\hat{p} = X/n$ is unbiased and has standard error $\sqrt{p(1-p)/n}$. To find the value of $p$ maximizing the standard error, we minimize its negative:
import numpy as np

def neg_se_of_phat(p, n=100):
    return (-1) * np.sqrt(p * (1 - p) / n)

n = 100
res = minimize(neg_se_of_phat, 0.2, method='Nelder-Mead',
               options={'disp': True}, args=(n,))
  • The first argument is the function to be minimized.
  • The second argument is the starting values for the parameters over which the function will be minimized. These should be on the interior (not the boundary) of the space of possible values for the parameters. If there is more than one parameter, then this argument should be an array.
  • The method argument tells minimize which minimization technique to utilize.
  • The options argument can set various properties, including the maximum number of iterations, the tolerance, and, as in the example, the amount of information displayed.
res.x # array([0.5])
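When there is more than one parameter, the starting values are passed as an array, as noted above. A minimal sketch (using a made-up quadratic function whose minimum is at $(1, 2)$, purely for illustration), also showing the maximum-iteration and tolerance options:

```python
import numpy as np
from scipy.optimize import minimize

def quad(pars):
    # simple two-parameter function minimized at (1, 2)
    x, y = pars
    return (x - 1)**2 + (y - 2)**2

res = minimize(quad, [0.0, 0.0], method='Nelder-Mead',
               options={'maxiter': 500, 'xatol': 1e-8, 'fatol': 1e-8})
```

Here `res.x` should be very close to `[1.0, 2.0]`.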

Numerical MLE for Gamma Dist

Suppose $X_1, \ldots, X_n$ are i.i.d. $\text{Gamma}(\alpha, \beta)$, where $\beta$ is the rate parameter.
def gamma_mle(x):
    from scipy.stats import gamma

    def neg_log_likelihood(pars, x):
        return (-1) * sum(gamma.logpdf(x, pars[0], scale=1/pars[1]))

    # method-of-moments estimates, used as starting values
    beta_hat_mom = np.mean(x) / (np.mean(x**2) - np.mean(x)**2)
    alpha_hat_mom = beta_hat_mom * np.mean(x)
    mle_out = minimize(neg_log_likelihood, [alpha_hat_mom, beta_hat_mom],
                       args=(x,), method='Nelder-Mead')
    return mle_out
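As a quick sanity check (not part of the notes), we can run the same objective on data simulated with made-up parameter values $\alpha = 2$ and $\beta = 3$; with a reasonably large sample the estimates should land near the truth.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gamma

rng = np.random.default_rng(1)
# simulate Gamma(alpha=2, beta=3) data; scipy uses scale = 1/rate
x = rng.gamma(shape=2.0, scale=1/3.0, size=5000)

def neg_log_likelihood(pars, x):
    return -np.sum(gamma.logpdf(x, pars[0], scale=1/pars[1]))

# method-of-moments starting values, as in gamma_mle()
beta0 = np.mean(x) / np.var(x)
alpha0 = beta0 * np.mean(x)

res = minimize(neg_log_likelihood, [alpha0, beta0],
               args=(x,), method='Nelder-Mead')
```

With 5000 observations, `res.x` should be close to `[2, 3]`.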

Regression with t-distribution Errors

Consider the following regression model:
$$Y_i = \beta_0 + \beta_1 x_i + \sigma \epsilon_i, \qquad i = 1, \ldots, n,$$
where the $\epsilon_i$ are assumed to have the t-distribution with $\nu$ degrees of freedom. We will assume that $\nu$ is set by the user (not estimated).
There are three unknown parameters in this model: $\beta_0$, $\beta_1$, and $\sigma$. We want to estimate them via maximum likelihood.
We assume that we will observe $n$ pairs of values $(x_i, Y_i)$, and that the observations are independent. We begin by deriving the density of a single observation: since $(Y_i - \beta_0 - \beta_1 x_i)/\sigma$ has the $t_\nu$ distribution, the density of $Y_i$ is $f_{Y_i}(y) = \frac{1}{\sigma} f_\nu\!\left(\frac{y - \beta_0 - \beta_1 x_i}{\sigma}\right)$, where $f_\nu$ denotes the $t_\nu$ density.
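A sketch of the resulting numerical MLE (not the notes' own implementation; the true values $\beta_0 = 1$, $\beta_1 = 2$, $\sigma = 0.5$ and the choice $\nu = 5$ are hypothetical, used only to generate test data): the log density of each observation is the $t_\nu$ log density of the standardized residual minus $\log \sigma$, and we minimize the negative log likelihood exactly as before.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import t

def neg_log_likelihood(pars, x, y, nu):
    b0, b1, sigma = pars
    if sigma <= 0:
        return np.inf  # keep the optimizer inside the parameter space
    z = (y - b0 - b1 * x) / sigma
    # log density of Y_i: t_nu log density of the standardized
    # residual, minus log(sigma) for the change of scale
    return -np.sum(t.logpdf(z, df=nu) - np.log(sigma))

# simulate from the model with hypothetical parameter values
rng = np.random.default_rng(2)
x = rng.uniform(0, 10, size=1000)
y = 1.0 + 2.0 * x + 0.5 * t.rvs(df=5, size=1000, random_state=rng)

res = minimize(neg_log_likelihood, [0.0, 1.0, 1.0],
               args=(x, y, 5), method='Nelder-Mead',
               options={'maxiter': 5000})
```

The starting values here are arbitrary interior points; in practice the ordinary least-squares fit would give better ones, in the same spirit as the method-of-moments starting values used for the gamma example.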
