Jackknife

The jackknife is a method in statistics that allows one to judge the uncertainty of estimators derived from small samples, without assumptions about the underlying probability distributions. The method consists of forming new samples by omitting, in turn, one observation of the original sample. The estimator under study is recalculated on each of these leave-one-out samples, and the collection of estimates thus obtained allows one to draw conclusions about the estimator's sensitivity to individual observations, as well as to estimate its bias and standard error. A related, often more powerful method is the bootstrap; details can be found in Efron79 or Efron82. A short illustration of the leave-one-out procedure follows.
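The sketch below illustrates the procedure described above using the standard jackknife formulas for bias and standard error; the function name `jackknife`, the choice of the sample mean as the estimator, and the example data are illustrative assumptions, not part of the original article.

```python
import numpy as np

def jackknife(data, estimator):
    """Leave-one-out jackknife: returns the full-sample estimate,
    the jackknife bias estimate, and the jackknife standard error."""
    data = np.asarray(data)
    n = len(data)
    full_estimate = estimator(data)

    # Recompute the estimator on each sample with one observation omitted.
    loo_estimates = np.array([estimator(np.delete(data, i)) for i in range(n)])

    mean_loo = loo_estimates.mean()
    # Standard jackknife formulas:
    #   bias      = (n - 1) * (mean of leave-one-out estimates - full estimate)
    #   std error = sqrt((n - 1)/n * sum of squared deviations of the
    #               leave-one-out estimates from their mean)
    bias = (n - 1) * (mean_loo - full_estimate)
    std_error = np.sqrt((n - 1) / n * np.sum((loo_estimates - mean_loo) ** 2))
    return full_estimate, bias, std_error

# Example: jackknife the sample mean of a small (hypothetical) data set.
sample = np.array([2.1, 2.9, 3.4, 1.8, 2.6, 3.1, 2.4])
est, bias, se = jackknife(sample, np.mean)
print(f"estimate = {est:.3f}, bias = {bias:.3f}, std. error = {se:.3f}")
```

For the sample mean the jackknife standard error coincides with the usual formula s/√n; the same code can be reused with any other estimator (median, correlation coefficient, etc.) by passing a different function.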