Limit Theorems and Estimation
Overview
This section covers the fundamental limit theorems of probability theory and statistical estimation methods.
Key Definitions
For a sample of observations $X_1, X_2, \ldots, X_n$, the sample mean is:

$$\bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i$$

If the observations are i.i.d. with mean $\mu$ and variance $\sigma^2$, then:
- $E[\bar{X}_n] = \mu$ (unbiased)
- $\operatorname{Var}(\bar{X}_n) = \sigma^2 / n$
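These two properties can be checked empirically. The following is a minimal simulation sketch (stdlib only; the parameter values are illustrative): averaging many simulated sample means should recover $\mu$, and their spread should be close to $\sigma^2 / n$.

```python
# Empirical check that the sample mean is unbiased and has
# variance sigma^2 / n (illustrative simulation, stdlib only).
import random

random.seed(0)
mu, sigma, n, trials = 5.0, 2.0, 30, 20000

means = []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(sum(sample) / n)

avg_of_means = sum(means) / trials  # should be near mu = 5.0
var_of_means = sum((m - avg_of_means) ** 2 for m in means) / trials
print(avg_of_means)   # close to mu
print(var_of_means)   # close to sigma^2 / n = 4/30
```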
The Law of Large Numbers states that as the sample size increases, the sample mean converges to the population mean.
For i.i.d. random variables $X_1, X_2, \ldots$ with mean $\mu$ and finite variance, the sample mean $\bar{X}_n$ converges to $\mu$ as $n \to \infty$.
- Weak LLN: Convergence in probability: for every $\epsilon > 0$, $P(|\bar{X}_n - \mu| > \epsilon) \to 0$ as $n \to \infty$
- Strong LLN: Almost sure convergence: $P\left(\lim_{n \to \infty} \bar{X}_n = \mu\right) = 1$
- Applications: Monte Carlo simulations, statistical inference
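The Monte Carlo application above can be sketched directly: the running mean of fair-coin flips drifts toward the true mean $0.5$ as $n$ grows (stdlib only; the coin-flip setup is an illustrative choice).

```python
# LLN sketch: the running mean of fair-coin flips approaches
# the true mean 0.5 as the number of flips n grows.
import random

random.seed(1)
total = 0
running_means = {}
for n in range(1, 100001):
    total += random.randint(0, 1)   # one Bernoulli(0.5) flip
    if n in (10, 1000, 100000):
        running_means[n] = total / n

for n, m in running_means.items():
    print(n, m)   # deviation from 0.5 shrinks as n grows
```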
The Central Limit Theorem states that the distribution of the (suitably standardized) sample mean approaches a normal distribution as the sample size increases, regardless of the population distribution, provided the population variance is finite.
For i.i.d. random variables $X_1, X_2, \ldots$ with mean $\mu$ and variance $\sigma^2 < \infty$:

$$\frac{\bar{X}_n - \mu}{\sigma / \sqrt{n}} \xrightarrow{d} N(0, 1) \quad \text{as } n \to \infty$$

Equivalently: $\bar{X}_n \approx N\!\left(\mu, \frac{\sigma^2}{n}\right)$ for large $n$.
- Standard Error: $\mathrm{SE}(\bar{X}_n) = \sigma / \sqrt{n}$, estimated by $s / \sqrt{n}$ where $s$ is the sample standard deviation
- Applications: Constructing confidence intervals, hypothesis testing
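The confidence-interval application can be illustrated with a simulation sketch (stdlib only; the discrete-uniform population is an arbitrary, deliberately non-normal choice): about 95% of standardized sample means should fall within $\pm 1.96$.

```python
# CLT sketch: sample means of a non-normal population
# (uniform on {0,...,9}) are approximately normal, so roughly
# 95% of standardized means land within +/- 1.96.
import math
import random

random.seed(2)
mu = 4.5                                               # mean of uniform {0,...,9}
sigma = math.sqrt(sum((k - mu) ** 2 for k in range(10)) / 10)
n, trials = 50, 10000

inside = 0
for _ in range(trials):
    xbar = sum(random.randrange(10) for _ in range(n)) / n
    z = (xbar - mu) / (sigma / math.sqrt(n))           # standardized sample mean
    if abs(z) <= 1.96:
        inside += 1

coverage = inside / trials
print(coverage)   # close to 0.95
```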
For a sample $x_1, x_2, \ldots, x_n$ from a distribution with parameter $\theta$, the likelihood function is:

$$L(\theta) = \prod_{i=1}^{n} f(x_i; \theta)$$

where $f$ is the PMF or PDF. The likelihood measures how likely the observed data is for different parameter values.
The Maximum Likelihood Estimator (MLE) is the parameter value that maximizes the likelihood function:

$$\hat{\theta}_{\text{MLE}} = \arg\max_{\theta} L(\theta)$$

In practice, we often maximize the log-likelihood:

$$\ell(\theta) = \log L(\theta) = \sum_{i=1}^{n} \log f(x_i; \theta)$$

To find the MLE, solve:

$$\frac{d\,\ell(\theta)}{d\theta} = 0$$
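When no closed form is available, the log-likelihood can be maximized numerically. A minimal sketch (stdlib only; the count data and the grid-search approach are illustrative choices): for Poisson data the MLE is known to be the sample mean, so the numeric answer should land on it.

```python
# Numeric MLE sketch: maximize the Poisson log-likelihood by
# grid search and compare against the closed-form answer
# (the sample mean).
import math

data = [2, 3, 1, 4, 2, 0, 3, 2, 1, 2]   # hypothetical Poisson counts

def log_likelihood(lam, xs):
    # ell(lambda) = sum_i [ x_i * log(lambda) - lambda - log(x_i!) ]
    return sum(x * math.log(lam) - lam - math.log(math.factorial(x)) for x in xs)

# Grid search over candidate lambda values in (0, 10]
grid = [0.01 * k for k in range(1, 1001)]
lam_hat = max(grid, key=lambda lam: log_likelihood(lam, data))

print(lam_hat)                # near the sample mean
print(sum(data) / len(data))  # sample mean = 2.0
```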
An estimator $\hat{\theta}$ is unbiased for parameter $\theta$ if:

$$E[\hat{\theta}] = \theta$$

The sample mean $\bar{X}_n$ is an unbiased estimator of the population mean $\mu$.
Examples of MLEs
- Bernoulli($p$): $\hat{p} = \bar{X}$ (sample proportion)
- Poisson($\lambda$): $\hat{\lambda} = \bar{X}$ (sample mean)
- Normal($\mu, \sigma^2$): $\hat{\mu} = \bar{X}$, $\hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} (X_i - \bar{X})^2$

Note that $\hat{\sigma}^2$ divides by $n$ rather than $n - 1$, so the MLE of the variance is slightly biased.
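The Normal MLEs above can be computed directly. A short sketch (stdlib only; the data values are made up for illustration) comparing the $1/n$ MLE of the variance against the $1/(n-1)$ unbiased sample variance:

```python
# Normal MLE sketch: compute mu-hat and sigma^2-hat from data,
# and compare the 1/n (MLE) and 1/(n-1) (unbiased) variance divisors.
data = [4.2, 5.1, 3.8, 4.9, 5.3, 4.4, 4.7, 5.0]   # hypothetical observations
n = len(data)

mu_hat = sum(data) / n                                   # MLE of mu
sigma2_hat = sum((x - mu_hat) ** 2 for x in data) / n    # MLE of sigma^2 (1/n)
s2 = sum((x - mu_hat) ** 2 for x in data) / (n - 1)      # unbiased sample variance

print(mu_hat)
print(sigma2_hat)
print(s2)   # always slightly larger than sigma2_hat
```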