In statistics, an adaptive estimator is an estimator in a parametric or semiparametric model with nuisance parameters such that the presence of these nuisance parameters does not affect the efficiency of estimation of the parameter of interest.

## Definition

Formally, let the parameter θ in a parametric model consist of two parts: the parameter of interest $\displaystyle{ \nu\in N\subseteq\mathbb{R}^k }$, and the nuisance parameter $\displaystyle{ \eta\in H\subseteq\mathbb{R}^m }$. Thus $\displaystyle{ \theta=(\nu,\eta)\in N\times H\subseteq\mathbb{R}^{k+m} }$. Then we say that $\displaystyle{ \scriptstyle\hat\nu_n }$ is an adaptive estimator of ν in the presence of η if this estimator is regular, and efficient for each of the submodels[1]

$\displaystyle{ \mathcal{P}_\nu(\eta_0) = \big\{ P_\theta: \nu\in N,\, \eta=\eta_0\big\}. }$

In other words, an adaptive estimator estimates the parameter of interest equally well regardless of whether the value of the nuisance parameter is known.

A necessary condition for a regular parametric model to admit an adaptive estimator is that

$\displaystyle{ I_{\nu\eta}(\theta) = \operatorname{E}[\, z_\nu z_\eta' \,] = 0 \quad \text{for all }\theta, }$

where zν and zη are components of the score function corresponding to parameters ν and η respectively, and thus Iνη is the top-right k×m block of the Fisher information matrix I(θ).
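This orthogonality condition can be checked numerically. The sketch below (plain Python; the function names are illustrative, not from any source) estimates the cross term $\operatorname{E}[z_\nu z_\eta]$ by Monte Carlo for the normal model with parameter of interest μ and nuisance parameter σ, where the scores are $z_\mu = (x-\mu)/\sigma^2$ and $z_\sigma = (x-\mu)^2/\sigma^3 - 1/\sigma$:

```python
import random

def score_mu(x, mu, sigma):
    # Score for the mean: d/dmu log f(x) = (x - mu) / sigma^2
    return (x - mu) / sigma**2

def score_sigma(x, mu, sigma):
    # Score for the scale: d/dsigma log f(x) = (x - mu)^2 / sigma^3 - 1/sigma
    return (x - mu)**2 / sigma**3 - 1.0 / sigma

def cross_information(mu, sigma, n=200_000, seed=0):
    # Monte Carlo estimate of E[z_mu * z_sigma] under N(mu, sigma^2)
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(mu, sigma)
        total += score_mu(x, mu, sigma) * score_sigma(x, mu, sigma)
    return total / n

print(cross_information(mu=1.0, sigma=2.0))  # close to 0
```

The estimate is close to zero because the cross term reduces to the third central moment of the normal distribution, which vanishes; this is exactly the condition $I_{\mu\sigma}=0$.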

## Example

Suppose $\displaystyle{ \scriptstyle\mathcal{P} }$ is the normal location-scale family:

$\displaystyle{ \mathcal{P} = \Big\{\ f_\theta(x) = \tfrac{1}{\sqrt{2\pi}\sigma} e^{ -\frac{1}{2\sigma^2}(x-\mu)^2 }\ \Big|\ \mu\in\mathbb{R}, \sigma\gt 0 \ \Big\}. }$

Then the usual estimator $\displaystyle{ \hat\mu\,=\,\bar{x} }$ is adaptive: since the cross term of the Fisher information vanishes, $I_{\mu\sigma}=0$, we can estimate the mean equally well whether we know the variance or not.
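Adaptivity here can be illustrated by simulation: the sampling variance of $\bar{x}$ attains the Cramér–Rao bound $\sigma^2/n$, which is the efficient variance for μ whether or not σ is known. A minimal Monte Carlo sketch (plain Python; function names are illustrative):

```python
import random

def mean_variance(mu, sigma, n, reps=10_000, seed=1):
    # Empirical sampling variance of the sample mean over many replications
    rng = random.Random(seed)
    means = []
    for _ in range(reps):
        sample = [rng.gauss(mu, sigma) for _ in range(n)]
        means.append(sum(sample) / n)
    grand_mean = sum(means) / reps
    return sum((m - grand_mean) ** 2 for m in means) / reps

v = mean_variance(mu=0.0, sigma=2.0, n=50)
print(v)  # close to the bound sigma^2 / n = 4 / 50 = 0.08
```

The estimator $\bar{x}$ never uses σ, yet it matches the variance an efficient estimator would achieve even with σ known, which is the content of adaptivity in this model.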

## Notes

1. Bickel 1998, Definition 2.4.1

## Basic references

• Bickel, Peter J.; Klaassen, Chris A.J.; Ritov, Ya'acov; Wellner, Jon A. (1998). Efficient and Adaptive Estimation for Semiparametric Models. New York: Springer. ISBN 978-0-387-98473-5.