Fraction of variance unexplained

In statistics, the fraction of variance unexplained (FVU) in the context of a regression task is the fraction of the variance of the regressand (dependent variable) Y that cannot be explained, i.e., that is not correctly predicted, by the explanatory variables X.

Formal definition

Suppose we are given a regression function [math]\displaystyle{ f }[/math] yielding for each [math]\displaystyle{ y_i }[/math] an estimate [math]\displaystyle{ \widehat{y}_i = f(x_i) }[/math], where [math]\displaystyle{ x_i }[/math] is the vector of the ith observation on all the explanatory variables.[1]:181 We define the fraction of variance unexplained (FVU) as:

[math]\displaystyle{ \begin{align} \text{FVU} & = {\text{VAR}_\text{err} \over \text{VAR}_\text{tot}} = {\text{SS}_\text{err}/N \over \text{SS}_\text{tot}/N} = {\text{SS}_\text{err} \over \text{SS}_\text{tot}} \left( = 1-{\text{SS}_\text{reg} \over \text{SS}_\text{tot}} , \text{ only true in some cases such as linear regression}\right) \\[6pt] & = 1 - R^2 \end{align} }[/math]

where R2 is the coefficient of determination, and VARerr and VARtot are the variance of the residuals and the sample variance of the dependent variable, respectively. SSerr (the sum of squared prediction errors, equivalently the residual sum of squares), SStot (the total sum of squares), and SSreg (the sum of squares of the regression, equivalently the explained sum of squares) are given by

[math]\displaystyle{ \begin{align} \text{SS}_\text{err} & = \sum_{i=1}^N\;(y_i - \widehat{y}_i)^2\\ \text{SS}_\text{tot} & = \sum_{i=1}^N\;(y_i-\bar{y})^2 \\ \text{SS}_\text{reg} & = \sum_{i=1}^N\;(\widehat{y}_i-\bar{y})^2 \text{ and} \\ \bar{y} & = \frac 1 N \sum_{i=1}^N\;y_i. \end{align} }[/math]
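As a concrete illustration, the following minimal sketch computes the FVU directly from these sums of squares (assuming NumPy; the function name is ours, for illustration only):

```python
import numpy as np

def fraction_of_variance_unexplained(y, y_hat):
    """FVU = SS_err / SS_tot for observations y and predictions y_hat."""
    y = np.asarray(y, dtype=float)
    y_hat = np.asarray(y_hat, dtype=float)
    ss_err = np.sum((y - y_hat) ** 2)        # residual sum of squares
    ss_tot = np.sum((y - np.mean(y)) ** 2)   # total sum of squares
    return ss_err / ss_tot
```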

Alternatively, the fraction of variance unexplained can be defined as follows:

[math]\displaystyle{ \text{FVU} = \frac{\operatorname{MSE}(f)}{\operatorname{var}[Y]} }[/math]

where MSE(f) is the mean squared error of the regression function f.
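On a finite sample the two definitions coincide when var[Y] is taken as the population-style sample variance (divisor N), since MSE(f) = SSerr/N and var[Y] = SStot/N. A brief check with illustrative numbers (assuming NumPy, whose np.var defaults to the divisor-N form):

```python
import numpy as np

y     = np.array([1.0, 2.0, 3.0, 4.0])
y_hat = np.array([1.1, 1.9, 3.2, 3.8])  # illustrative predictions

fvu_ss  = np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)  # SS_err / SS_tot
fvu_mse = np.mean((y - y_hat)**2) / np.var(y)                 # MSE(f) / var[Y]
assert np.isclose(fvu_ss, fvu_mse)
```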

Explanation

It is useful to consider the second definition to understand FVU. When trying to predict Y, the most naive regression function we can think of is the constant function predicting the mean of Y, i.e., [math]\displaystyle{ f(x_i)=\bar{y} }[/math]. It follows that the MSE of this function equals the variance of Y; that is, SSerr = SStot and SSreg = 0. In this case, no variation in Y is accounted for, and the FVU equals 1.

More generally, the FVU will be 1 if the explanatory variables X tell us nothing about Y, in the sense that the predicted values of Y do not covary with Y. As prediction improves and the MSE shrinks, the FVU decreases. In the case of perfect prediction, where [math]\displaystyle{ \hat{y}_i = y_i }[/math] for all i, the MSE is 0, SSerr = 0, SSreg = SStot, and the FVU is 0.
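These two extremes can be checked numerically (illustrative data, assuming NumPy):

```python
import numpy as np

y = np.array([2.0, 4.0, 6.0, 8.0])
ss_tot = np.sum((y - y.mean())**2)

# Naive predictor: always the mean of y  ->  SS_err = SS_tot, FVU = 1
y_hat_mean = np.full_like(y, y.mean())
print(np.sum((y - y_hat_mean)**2) / ss_tot)     # 1.0

# Perfect predictor: y_hat = y  ->  SS_err = 0, FVU = 0
y_hat_perfect = y.copy()
print(np.sum((y - y_hat_perfect)**2) / ss_tot)  # 0.0
```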

References

  1. Achen, C. H. (1990). "What Does 'Explained Variance' Explain?: Reply". Political Analysis 2 (1): 173–184. doi:10.1093/pan/2.1.173.