
Maximum Likelihood Estimation (MLE)
Maximum Likelihood Estimation (MLE) is a statistical method used to estimate the parameters of a probabilistic model such that the observed data is most probable under the model. This approach works by maximizing a likelihood function, which quantifies how likely it is to observe the given data for various parameter values. MLE is widely applicable across different fields, from economics and biology to machine learning and engineering, as it provides a framework for making inferences about populations based on sample data. By seeking the parameter values that make the observed data most likely, researchers can infer characteristics of the underlying process that generated the data, thus enabling more accurate predictions and better decision-making in uncertain environments.
Principle of MLE


Suppose we have a dataset $x_1, x_2, \ldots, x_n$ drawn independently from a probability distribution with density (or mass) function $f(x \mid \theta)$, where $\theta$ is the parameter to be estimated.

MLE finds the value of $\theta$ that maximizes the likelihood function:

$$L(\theta) = \prod_{i=1}^{n} f(x_i \mid \theta)$$

We usually work with the log-likelihood for simplicity, since the logarithm turns the product into a sum and, being monotone, preserves the maximizer:

$$\ell(\theta) = \log L(\theta) = \sum_{i=1}^{n} \log f(x_i \mid \theta)$$
Steps to Find MLE
- Write the likelihood function $L(\theta) = \prod_{i=1}^{n} f(x_i \mid \theta)$
- Take the log to get $\ell(\theta) = \log L(\theta)$
- Compute the derivative $\frac{d\ell}{d\theta}$ and solve $\frac{d\ell}{d\theta} = 0$ to find $\hat{\theta}$
- Check the second-order condition:
  - $\frac{d^2\ell}{d\theta^2}\big|_{\hat{\theta}} < 0$: maximum
  - $\frac{d^2\ell}{d\theta^2}\big|_{\hat{\theta}} > 0$: minimum
  - $\frac{d^2\ell}{d\theta^2}\big|_{\hat{\theta}} = 0$: check higher-order derivatives
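As a concrete illustration of these steps, here is a minimal numerical sketch in Python (assuming NumPy and SciPy; the normal model and the synthetic data are illustrative, not from the original post). When $\frac{d\ell}{d\theta} = 0$ has no closed-form solution, this is how the MLE is typically computed in practice: minimize the negative log-likelihood.

```python
# Numerical MLE sketch: minimize the negative log-likelihood of a
# normal model over (mu, sigma). Data are synthetic.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.5, size=500)

def neg_log_likelihood(params, x):
    """Negative log-likelihood of N(mu, sigma^2); maximizing L(theta)
    is equivalent to minimizing -log L(theta)."""
    mu, sigma = params
    if sigma <= 0:  # sigma must be positive
        return np.inf
    return -np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                   - (x - mu)**2 / (2 * sigma**2))

result = minimize(neg_log_likelihood, x0=[0.0, 1.0], args=(data,),
                  method="Nelder-Mead")
mu_hat, sigma_hat = result.x
print(f"mu_hat = {mu_hat:.3f}, sigma_hat = {sigma_hat:.3f}")
```

Nelder-Mead is used here because it needs no gradients; the optimizer should land near the true values $\mu = 2.0$ and $\sigma = 1.5$.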
Example 1: Gaussian Distribution 
Let $x_1, \ldots, x_n \sim \mathcal{N}(\mu, \sigma^2)$ i.i.d. The PDF is:

$$f(x \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(x - \mu)^2}{2\sigma^2}\right)$$

Log-likelihood:

$$\ell(\mu, \sigma^2) = -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i - \mu)^2$$

Setting $\frac{\partial \ell}{\partial \mu} = 0$ and $\frac{\partial \ell}{\partial \sigma^2} = 0$ and solving:

$$\hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i = \bar{x}, \qquad \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2$$
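The closed forms are easy to verify numerically; the sketch below assumes NumPy and uses a synthetic sample. Note that the MLE of $\sigma^2$ divides by $n$, so it is the biased variance estimator, not the usual $n - 1$ version.

```python
# Closed-form Gaussian MLEs on a synthetic sample (NumPy assumed).
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=2.0, size=1000)

mu_hat = x.mean()                      # MLE of mu: the sample mean
sigma2_hat = np.mean((x - mu_hat)**2)  # MLE of sigma^2: divide by n

# The 1/n estimator equals np.var with ddof=0, not the unbiased ddof=1.
print(mu_hat, sigma2_hat, np.var(x, ddof=0))
```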
Example 2: Bernoulli Distribution
Suppose $x_1, \ldots, x_n \sim \text{Bernoulli}(p)$ i.i.d. PMF:

$$f(x \mid p) = p^x (1 - p)^{1 - x}, \quad x \in \{0, 1\}$$

Likelihood:

$$L(p) = \prod_{i=1}^{n} p^{x_i}(1 - p)^{1 - x_i} = p^{\sum_i x_i}(1 - p)^{n - \sum_i x_i}$$

Let $s = \sum_{i=1}^{n} x_i$, then:

$$\ell(p) = s \log p + (n - s)\log(1 - p)$$

Derivative:

$$\frac{d\ell}{dp} = \frac{s}{p} - \frac{n - s}{1 - p}$$

Solving $\frac{d\ell}{dp} = 0$:

$$\hat{p} = \frac{s}{n} = \bar{x}$$
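In words, the MLE is simply the sample proportion of ones. A minimal check in Python (NumPy assumed; the true $p = 0.3$ is illustrative):

```python
# Bernoulli MLE: the sample proportion of successes.
import numpy as np

rng = np.random.default_rng(2)
x = rng.binomial(n=1, p=0.3, size=2000)  # n=1 makes each draw Bernoulli

s = x.sum()          # number of successes
p_hat = s / len(x)   # hat{p} = s / n, as derived above
print(p_hat)         # close to 0.3 for a sample this large
```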
Example 3: Poisson Distribution
Assume $x_1, \ldots, x_n \sim \text{Poisson}(\lambda)$ i.i.d. PMF:

$$f(x \mid \lambda) = \frac{\lambda^x e^{-\lambda}}{x!}, \quad x = 0, 1, 2, \ldots$$

Likelihood:

$$L(\lambda) = \prod_{i=1}^{n} \frac{\lambda^{x_i} e^{-\lambda}}{x_i!}$$

Log-likelihood (ignoring constants):

$$\ell(\lambda) = \left(\sum_{i=1}^{n} x_i\right)\log\lambda - n\lambda$$

Derivative:

$$\frac{d\ell}{d\lambda} = \frac{\sum_{i=1}^{n} x_i}{\lambda} - n$$

Solving $\frac{d\ell}{d\lambda} = 0$:

$$\hat{\lambda} = \frac{1}{n}\sum_{i=1}^{n} x_i = \bar{x}$$
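Again the MLE is the sample mean. The sketch below (NumPy and SciPy assumed; data synthetic) cross-checks the closed form against a direct numerical maximization of the log-likelihood:

```python
# Poisson MLE: sample mean, cross-checked numerically.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
x = rng.poisson(lam=4.2, size=1500)

lam_hat = x.mean()  # closed-form MLE

# Minimize the negative log-likelihood (constants in x! dropped).
neg_ll = lambda lam: -(x.sum() * np.log(lam) - len(x) * lam)
res = minimize_scalar(neg_ll, bounds=(1e-6, 50.0), method="bounded")
print(lam_hat, res.x)  # the two estimates should agree closely
```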
Summary Table of MLEs
Distribution | PMF/PDF | MLE |
---|---|---|
Normal $\mathcal{N}(\mu, \sigma^2)$ | $\frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(x - \mu)^2}{2\sigma^2}\right)$ | $\hat{\mu} = \bar{x}$, $\hat{\sigma}^2 = \frac{1}{n}\sum_{i}(x_i - \bar{x})^2$ |
Bernoulli$(p)$ | $p^x (1 - p)^{1 - x}$ | $\hat{p} = \bar{x}$ |
Poisson$(\lambda)$ | $\frac{\lambda^x e^{-\lambda}}{x!}$ | $\hat{\lambda} = \bar{x}$ |