Working–Hotelling procedure

In statistics, particularly regression analysis, the Working–Hotelling procedure, named after Holbrook Working and Harold Hotelling, is a method of simultaneous estimation in linear regression models. One of the first developments in simultaneous inference, it was devised by Working and Hotelling for the simple linear regression model in 1929.[1] It provides a confidence region for multiple mean responses, that is, it gives the upper and lower bounds of more than one value of a dependent variable at several levels of the independent variables at a certain confidence level. The resulting confidence bands are known as the Working–Hotelling–Scheffé confidence bands.

Like the closely related Scheffé's method in the analysis of variance, which considers all possible contrasts, the Working–Hotelling procedure considers all possible values of the independent variables; that is, in a particular regression model, the probability that all the Working–Hotelling confidence intervals cover the true value of the mean response is the confidence coefficient. As such, when only a small subset of the possible values of the independent variable is considered, it is more conservative and yields wider intervals than competitors like the Bonferroni correction at the same level of confidence. It outperforms the Bonferroni correction as more values are considered.

Statement

Simple linear regression

Consider a simple linear regression model [math]\displaystyle{ Y = \beta_0 + \beta_1 X + \varepsilon }[/math], where [math]\displaystyle{ Y }[/math] is the response variable and [math]\displaystyle{ X }[/math] the explanatory variable, and let [math]\displaystyle{ b_0 }[/math] and [math]\displaystyle{ b_1 }[/math] be the least-squares estimates of [math]\displaystyle{ \beta_0 }[/math] and [math]\displaystyle{ \beta_1 }[/math] respectively. Then the least-squares estimate of the mean response [math]\displaystyle{ E(Y_i) }[/math] at the level [math]\displaystyle{ X = x_i }[/math] is [math]\displaystyle{ \hat{Y_i} = b_0 + b_1 x_i }[/math]. It can then be shown, assuming that the errors are independent and identically normally distributed, that a [math]\displaystyle{ 1 - \alpha }[/math] confidence interval for the mean response at a given level of [math]\displaystyle{ X }[/math] is as follows:

[math]\displaystyle{ E(Y_i) \in \left[ b_0 + b_1 x_i \pm t_{\alpha/2,\text{df}=n - 2} \sqrt{ \left(\frac{1}{n - 2} \sum_{j=1}^n e_j^{\,2} \right) \cdot \left(\frac{1}{n} + \frac{(x_i - \bar{x})^2}{\sum_{j=1}^n (x_j - \bar{x})^2}\right)}\right], }[/math]

where [math]\displaystyle{ \left(\frac{1}{n - 2} \sum_{j=1}^n e_j^{\,2} \right) }[/math] is the mean squared error and [math]\displaystyle{ t_{\alpha/2,\text{df}=n - 2} }[/math] denotes the upper [math]\displaystyle{ \frac{\alpha}{2}^\text{th} }[/math] percentile of Student's t-distribution with [math]\displaystyle{ n-2 }[/math] degrees of freedom.
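
As an illustration, here is a minimal Python sketch of this pointwise interval, assuming NumPy and SciPy; the function name pointwise_mean_ci and its arguments are illustrative.

```python
import numpy as np
from scipy import stats

def pointwise_mean_ci(x, y, x0, alpha=0.05):
    """Pointwise 1 - alpha confidence interval for E(Y) at X = x0."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    x_bar = x.mean()
    s_xx = np.sum((x - x_bar) ** 2)
    b1 = np.sum((x - x_bar) * (y - y.mean())) / s_xx   # least-squares slope
    b0 = y.mean() - b1 * x_bar                          # least-squares intercept
    mse = np.sum((y - (b0 + b1 * x)) ** 2) / (n - 2)    # mean squared error
    se = np.sqrt(mse * (1.0 / n + (x0 - x_bar) ** 2 / s_xx))
    t_crit = stats.t.ppf(1 - alpha / 2, df=n - 2)       # upper alpha/2 percentile
    y_hat = b0 + b1 * x0
    return y_hat - t_crit * se, y_hat + t_crit * se
```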

However, when several mean responses are estimated with intervals of this kind, the joint confidence level declines rapidly. To fix the family confidence coefficient at [math]\displaystyle{ 1 - \alpha }[/math], the Working–Hotelling approach instead uses a critical value based on the F-distribution:[2][3]

[math]\displaystyle{ E(Y_i) \in \left[ b_0 + b_1 x_i \pm W \sqrt{ \left(\frac{1}{n - 2} \sum_{j=1}^n e_j^{\,2} \right) \cdot \left(\frac{1}{n} + \frac{(x_i - \bar{x})^2}{\sum_{j=1}^n(x_j - \bar{x})^2}\right)}\right], }[/math]

where [math]\displaystyle{ W^2 = 2F_{\alpha,\text{df}=(2,n-2)} }[/math] and [math]\displaystyle{ F }[/math] denotes the upper [math]\displaystyle{ \alpha^\text{th} }[/math] percentile of the F-distribution with [math]\displaystyle{ (2, n-2) }[/math] degrees of freedom. The confidence level is then [math]\displaystyle{ 1 - \alpha }[/math] over all values of [math]\displaystyle{ X }[/math], i.e. for all [math]\displaystyle{ x_i \in \mathbb{R} }[/math].
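
The only change relative to the pointwise interval is the critical value: W = sqrt(2F) replaces the t percentile. A minimal Python sketch, again assuming NumPy and SciPy and using the illustrative function name working_hotelling_band, evaluates the band over a grid of X values:

```python
import numpy as np
from scipy import stats

def working_hotelling_band(x, y, x_grid, alpha=0.05):
    """Simultaneous 1 - alpha band for E(Y) at every value in x_grid."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    x_bar = x.mean()
    s_xx = np.sum((x - x_bar) ** 2)
    b1 = np.sum((x - x_bar) * (y - y.mean())) / s_xx
    b0 = y.mean() - b1 * x_bar
    mse = np.sum((y - (b0 + b1 * x)) ** 2) / (n - 2)
    w = np.sqrt(2 * stats.f.ppf(1 - alpha, dfn=2, dfd=n - 2))  # W^2 = 2 F
    x_grid = np.asarray(x_grid, float)
    y_hat = b0 + b1 * x_grid
    half_width = w * np.sqrt(mse * (1.0 / n + (x_grid - x_bar) ** 2 / s_xx))
    return y_hat - half_width, y_hat + half_width
```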

Multiple linear regression

The Working–Hotelling confidence bands can be easily generalised to multiple linear regression. Consider a general linear model as defined in the linear regression article, that is,

[math]\displaystyle{ \mathbf{Y} = \mathbf{X}\boldsymbol\beta + \boldsymbol\varepsilon, \, }[/math]

where

[math]\displaystyle{ \mathbf{Y} = \begin{pmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{pmatrix}, \quad \mathbf{X} = \begin{pmatrix} \mathbf{x}^{\rm T}_1 \\ \mathbf{x}^{\rm T}_2 \\ \vdots \\ \mathbf{x}^{\rm T}_n \end{pmatrix} = \begin{pmatrix} x_{11} & \cdots & x_{1p} \\ x_{21} & \cdots & x_{2p} \\ \vdots & \ddots & \vdots \\ x_{n1} & \cdots & x_{np} \end{pmatrix}, \boldsymbol\beta = \begin{pmatrix} \beta_1 \\ \beta_2 \\ \vdots \\ \beta_p \end{pmatrix}, \quad \boldsymbol\varepsilon = \begin{pmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{pmatrix}. }[/math]

Again, it can be shown that the least-squares estimate of the mean response [math]\displaystyle{ E(Y_i) = \mathbf{x}^{\rm T}_i \boldsymbol\beta }[/math] is [math]\displaystyle{ \hat{Y}_i = \mathbf{x}^{\rm T}_i \mathbf{b} }[/math], where [math]\displaystyle{ \mathbf{b} }[/math] consists of the least-squares estimates of the entries in [math]\displaystyle{ \boldsymbol\beta }[/math], i.e. [math]\displaystyle{ \mathbf{b} = (\mathbf{X}^{\rm T} \mathbf{X})^{-1} \mathbf{X}^{\rm T}\mathbf{Y} }[/math]. Likewise, it can be shown that a [math]\displaystyle{ 1 - \alpha }[/math] confidence interval for a single mean response is as follows:[4]

[math]\displaystyle{ E(Y_i) \in \left[ \mathbf{x}^{\rm T}_i \mathbf{b} \pm t_{\alpha/2,\text{df}=n - p} \sqrt{\operatorname{MSE}\,(\mathbf{x}^{\rm T}_i (\mathbf{X}^{\rm T}\mathbf{X})^{-1} \mathbf{x}_i)}\right], }[/math]

where [math]\displaystyle{ \operatorname{MSE} = (\mathbf{Y}^{\rm T} \mathbf{Y} - \mathbf{b}^{\rm T} \mathbf{X}^{\rm T} \mathbf{Y})/(n - p) }[/math] is the observed value of the mean squared error.

The Working–Hotelling approach to simultaneous estimation is the same as in simple linear regression, except that the critical value [math]\displaystyle{ W }[/math] is now based on an F-distribution with [math]\displaystyle{ (p, n-p) }[/math] degrees of freedom:[3]

[math]\displaystyle{ E(Y_i) \in \left[ \mathbf{x}^{\rm T}_i \mathbf{b} \pm W \sqrt{\operatorname{MSE}\,(\mathbf{x}^{\rm T}_i (\mathbf{X}^{\rm T}\mathbf{X})^{-1} \mathbf{x}_i)}\right], }[/math]

where [math]\displaystyle{ W^2 = pF_{\alpha,\text{df}=(p,n-p)} }[/math].
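
A minimal Python sketch of the multiple-regression band, assuming NumPy and SciPy and following the matrix formulas above; the function name wh_band_multiple is illustrative, and the rows of X_new are the predictor vectors at which the band is evaluated.

```python
import numpy as np
from scipy import stats

def wh_band_multiple(X, y, X_new, alpha=0.05):
    """Simultaneous 1 - alpha limits for the mean response at the rows of X_new."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    n, p = X.shape
    xtx_inv = np.linalg.inv(X.T @ X)
    b = xtx_inv @ X.T @ y                       # least-squares estimate of beta
    mse = np.sum((y - X @ b) ** 2) / (n - p)    # observed mean squared error
    w = np.sqrt(p * stats.f.ppf(1 - alpha, dfn=p, dfd=n - p))  # W^2 = p F
    X_new = np.asarray(X_new, float)
    y_hat = X_new @ b
    # x_i^T (X^T X)^{-1} x_i for each row x_i^T of X_new
    quad = np.einsum('ij,jk,ik->i', X_new, xtx_inv, X_new)
    half_width = w * np.sqrt(mse * quad)
    return y_hat - half_width, y_hat + half_width
```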

Graphical representation

In the simple linear regression case, Working–Hotelling–Scheffé confidence bands, drawn by connecting the upper and lower limits of the mean response at every level, take the shape of hyperbolas. When drawn, they are sometimes approximated by the Graybill–Bowden confidence bands, which are linear and hence easier to graph:[2]

[math]\displaystyle{ \beta_0 + \beta_1(x_i-\bar{x}) \in \left[ b_0 + b_1(x_i-\bar{x}) \pm m_{\alpha, 2, \text{df}=n-2} \sqrt{\operatorname{MSE}} \cdot \left(\frac{1}{\sqrt n} + \frac{|x_i-\bar x|}{\sqrt{\sum_{j=1}^n (x_j-\bar x)^2}} \right) \right] }[/math]

where [math]\displaystyle{ m_{\alpha, 2, \text{df}=n-2} }[/math] denotes the upper [math]\displaystyle{ \alpha^\text{th} }[/math] percentile of the Studentized maximum modulus distribution with two means and [math]\displaystyle{ n - 2 }[/math] degrees of freedom.
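
A minimal Python sketch of this approximation, under the centred parametrisation in which b_0 denotes the fitted mean response at the mean of X; the critical value m of the studentized maximum modulus distribution is assumed to be supplied externally (for example from published tables), and the function name graybill_bowden_band is illustrative.

```python
import numpy as np

def graybill_bowden_band(x, y, x_grid, m_crit):
    """Linear approximate band for beta0 + beta1*(x - x_bar); m_crit from tables."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    x_bar = x.mean()
    s_xx = np.sum((x - x_bar) ** 2)
    b1 = np.sum((x - x_bar) * (y - y.mean())) / s_xx
    b0 = y.mean()                                 # centred intercept: fitted value at x = x_bar
    mse = np.sum((y - (b0 + b1 * (x - x_bar))) ** 2) / (n - 2)
    x_grid = np.asarray(x_grid, float)
    centre = b0 + b1 * (x_grid - x_bar)
    half_width = m_crit * np.sqrt(mse) * (1.0 / np.sqrt(n)
                                          + np.abs(x_grid - x_bar) / np.sqrt(s_xx))
    return centre - half_width, centre + half_width
```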

The simple linear regression model with a Working–Hotelling confidence band.

Numerical example

The same data as in the ordinary least squares article are used in this example:

Height (m) 1.47 1.50 1.52 1.55 1.57 1.60 1.63 1.65 1.68 1.70 1.73 1.75 1.78 1.80 1.83
Weight (kg) 52.21 53.12 54.48 55.84 57.20 58.57 59.93 61.29 63.11 64.47 66.28 68.10 69.92 72.19 74.46

A simple linear regression model is fit to this data. The values of [math]\displaystyle{ b_0 }[/math] and [math]\displaystyle{ b_1 }[/math] were found to be −39.06 and 61.27 respectively. The goal is to estimate the mean mass of women given their heights at the 95% confidence level. The value of [math]\displaystyle{ W^2 }[/math] was found to be [math]\displaystyle{ 2F_{0.95, \text{df}=(2,15-2)} = 7.61 }[/math], so that [math]\displaystyle{ W = 2.758828 }[/math]. It was also found that [math]\displaystyle{ \bar{x} = 1.651 }[/math], [math]\displaystyle{ \sum_{j=1}^n e_j^{\,2}= 7.490558 }[/math], [math]\displaystyle{ \operatorname{MSE} = 0.5761968 }[/math] and [math]\displaystyle{ \sum_{j=1}^n (x_j - \bar{x})^2 = 0.1827 }[/math]. Then, to estimate the mean mass of all women of a particular height, the following Working–Hotelling–Scheffé band is derived:

[math]\displaystyle{ E(Y_i) \in \left[ -39.06 + 61.27 x_i \pm 2.758828 \sqrt{ 0.5761968 \cdot \left(\frac{1}{15} + \frac{(x_i - 1.651)^2}{0.1827} \right)}\right], }[/math]

which results in the confidence band shown in the figure above.
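
The quantities quoted above can be reproduced with a short Python script, assuming NumPy and SciPy; the variable names are illustrative.

```python
import numpy as np
from scipy import stats

height = np.array([1.47, 1.50, 1.52, 1.55, 1.57, 1.60, 1.63, 1.65,
                   1.68, 1.70, 1.73, 1.75, 1.78, 1.80, 1.83])
weight = np.array([52.21, 53.12, 54.48, 55.84, 57.20, 58.57, 59.93, 61.29,
                   63.11, 64.47, 66.28, 68.10, 69.92, 72.19, 74.46])

n = len(height)
x_bar = height.mean()
s_xx = np.sum((height - x_bar) ** 2)
b1 = np.sum((height - x_bar) * (weight - weight.mean())) / s_xx   # ~ 61.27
b0 = weight.mean() - b1 * x_bar                                   # ~ -39.06
mse = np.sum((weight - (b0 + b1 * height)) ** 2) / (n - 2)        # ~ 0.576
w = np.sqrt(2 * stats.f.ppf(0.95, dfn=2, dfd=n - 2))              # ~ 2.759

# 95% Working–Hotelling–Scheffé band on a grid of heights
grid = np.linspace(height.min(), height.max(), 50)
half_width = w * np.sqrt(mse * (1.0 / n + (grid - x_bar) ** 2 / s_xx))
lower, upper = b0 + b1 * grid - half_width, b0 + b1 * grid + half_width
```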

Comparison with other methods

Bonferroni bands for the same linear regression model, based on estimating the response variable given the observed values of X. The confidence bands are noticeably tighter.

The Working–Hotelling approach may give tighter or looser confidence limits than the Bonferroni correction. In general, for small families of statements, the Bonferroni bounds may be tighter, but as the number of estimated values increases, the Working–Hotelling procedure yields narrower limits. This is because the confidence level of the Working–Hotelling–Scheffé bounds is exactly [math]\displaystyle{ 1 - \alpha }[/math] when all values of the independent variable, i.e. [math]\displaystyle{ x_i \in \mathbb{R} }[/math], are considered. Alternatively, from an algebraic perspective, the critical value [math]\displaystyle{ \pm W }[/math] remains constant as the number of estimates increases, whereas the corresponding Bonferroni critical values, [math]\displaystyle{ \pm t_{\alpha/(2g), \text{df}=n-p} }[/math], grow larger as the number [math]\displaystyle{ g }[/math] of estimates increases. Therefore, the Working–Hotelling method is better suited to large-scale comparisons, whereas the Bonferroni correction is preferred if only a few mean responses are to be estimated. In practice, both intervals are usually computed and the narrower one chosen.[4]
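
The point about the critical values can be illustrated numerically; the following Python sketch, assuming SciPy and taking n = 15 and α = 0.05 from the example above, compares the constant Working–Hotelling value W with the Bonferroni t value as the family size g grows.

```python
import numpy as np
from scipy import stats

n, alpha = 15, 0.05
w = np.sqrt(2 * stats.f.ppf(1 - alpha, dfn=2, dfd=n - 2))    # constant in g
for g in (1, 2, 5, 10, 50):
    t_bonf = stats.t.ppf(1 - alpha / (2 * g), df=n - 2)      # grows with g
    print(f"g = {g:2d}: Bonferroni t = {t_bonf:.3f}, Working-Hotelling W = {w:.3f}")
```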

Another alternative to the Working–Hotelling–Scheffé band is the Gafarian band, which is used when a confidence band of constant width at all levels of the independent variable is needed.[5]

The Working–Hotelling procedure is based on the same principles as Scheffé's method, which gives family confidence intervals for all possible contrasts.[6] Their proofs are almost identical.[5] This is because both methods estimate linear combinations of mean response at all factor levels. However, the Working–Hotelling procedure does not deal with contrasts but with different levels of the independent variable, so there is no requirement that the coefficients of the parameters sum up to zero. Therefore, it has one more degree of freedom.[6]

See also

  • Multiple comparisons

Footnotes

  1. Miller (1966), p. 1
  2. Miller (2014)
  3. Neter, Wasserman and Kutner, pp. 163–165
  4. Neter, Wasserman and Kutner, pp. 244–245
  5. Miller (1966), pp. 123–127
  6. Westfall, Tobias and Wolfinger, pp. 277–280

Bibliography