Multivariate analysis of covariance

Multivariate analysis of covariance (MANCOVA) is an extension of analysis of covariance (ANCOVA) methods to cover cases where there is more than one dependent variable and where the control of concomitant continuous independent variables – covariates – is required. The most prominent benefit of the MANCOVA design over the simple MANOVA is the 'factoring out' of noise or error that has been introduced by the covariate.[1] A commonly used multivariate version of the ANOVA F-statistic is Wilks' Lambda (Λ), which represents the ratio of the error variance (or covariance) to the total of the error and effect variance (or covariance); smaller values indicate a stronger effect.[1]

Goals

Similarly to all tests in the ANOVA family, the primary aim of the MANCOVA is to test for significant differences between group means.[1] Including a covariate in the design reduces the magnitude of the error term, represented in the MANCOVA design as MSerror. Subsequently, the overall Wilks' Lambda will become smaller and more likely to be characterised as significant.[1] This grants the researcher more statistical power to detect differences within the data. The multivariate aspect of the MANCOVA allows the characterisation of differences in group means with respect to a linear combination of multiple dependent variables, while simultaneously controlling for covariates.

Example situation where MANCOVA is appropriate: Suppose a scientist is interested in testing two new drugs for their effects on depression and anxiety scores. Also suppose that the scientist has information pertaining to the overall responsivity to drugs for each patient; accounting for this covariate will grant the test higher sensitivity in determining the effects of each drug on both dependent variables.
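
As a rough illustration of this setup, the sketch below fits such a model with the MANOVA class from the statsmodels Python library; the data are simulated and the variable names (drug, responsivity, depression, anxiety) are purely illustrative. Placing the covariate on the right-hand side of the formula alongside the grouping factor yields the covariate-adjusted, MANCOVA-style tests.

```python
# Hypothetical sketch of the drug example: two dependent variables (depression,
# anxiety), one grouping factor (drug) and one covariate (responsivity).
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
n = 90
drug = np.repeat(["placebo", "drug_A", "drug_B"], n // 3)   # grouping factor
responsivity = rng.normal(0, 1, n)                          # covariate
# Simulated outcomes that depend on both the group and the covariate.
effect = {"placebo": 0.0, "drug_A": -0.5, "drug_B": -0.8}
shift = np.array([effect[d] for d in drug])
depression = shift + 0.6 * responsivity + rng.normal(0, 1, n)
anxiety = shift + 0.4 * responsivity + rng.normal(0, 1, n)

df = pd.DataFrame({"drug": drug, "responsivity": responsivity,
                   "depression": depression, "anxiety": anxiety})

# Adding the covariate to the formula adjusts the test for 'drug' for responsivity.
fit = MANOVA.from_formula("depression + anxiety ~ drug + responsivity", data=df)
print(fit.mv_test())  # reports Wilks' lambda, Pillai's trace, etc. for each term
```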

Assumptions

Certain assumptions must be met for the MANCOVA to be used appropriately:

  1. Normality: For each group, each dependent variable must represent a normal distribution of scores. Furthermore, any linear combination of dependent variables must be normally distributed. Transformation or removal of outliers can help ensure this assumption is met.[2] Violation of this assumption may lead to an increase in Type I error rates.[3]
  2. Independence of observations: Each observation must be independent of all other observations; this assumption can be met by employing random sampling techniques. Violation of this assumption may lead to an increase in Type I error rates.[3]
  3. Homogeneity of variances: Each dependent variable must demonstrate similar levels of variance across the levels of each independent variable. Violation of this assumption can be conceptualised as a correlation existing between the variances and the means of dependent variables. This violation is often called 'heteroscedasticity'[4] and can be tested for using Levene's test[5] (see the sketch after this list).
  4. Homogeneity of covariances: The intercorrelation matrix between dependent variables must be equal across all levels of the independent variable. Violation of this assumption may lead to an increase in Type I error rates as well as decreased statistical power.[3]
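
A minimal check of assumption 3, reusing the simulated data frame from the sketch above: SciPy's levene function compares the variance of each dependent variable across the drug groups. The 0.05 cut-off below is only the conventional choice.

```python
from scipy.stats import levene

for dv in ["depression", "anxiety"]:
    groups = [df.loc[df["drug"] == g, dv] for g in df["drug"].unique()]
    stat, p = levene(*groups)
    # A small p-value (e.g. below 0.05) suggests the group variances differ
    # (heteroscedasticity), i.e. the homogeneity assumption may be violated.
    print(f"{dv}: Levene W = {stat:.3f}, p = {p:.3f}")
```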

Logic of MANOVA

Inspired by ANOVA, MANOVA is based on a generalization of the sum of squares explained by the model, [math]\displaystyle{ S_\text{model} }[/math], and the inverse of the sum of squares unexplained by the model, [math]\displaystyle{ S_\text{res}^{-1} }[/math]. The most common[6][7] statistics are summaries based on the roots (or eigenvalues) [math]\displaystyle{ \lambda_p }[/math] of the matrix [math]\displaystyle{ A:= S_{\text{model}}S_{\text{res}}^{-1} }[/math], as illustrated numerically in the sketch after the list below.

  • Samuel Stanley Wilks' [math]\displaystyle{ \Lambda_\text{Wilks} = \prod_{1,\ldots,p}(1/(1 + \lambda_{p})) = \det(I + A)^{-1} = \det(S_\text{res})/\det(S_\text{res} + S_\text{model}) }[/math] distributed as lambda (Λ)
  • the K. C. Sreedharan Pillai–M. S. Bartlett trace, [math]\displaystyle{ \Lambda_\text{Pillai} = \sum_{1,\ldots,p}(\lambda_p/(1 + \lambda_p)) = \operatorname{tr}(A(I + A)^{-1}) }[/math]
  • the Lawley–Hotelling trace, [math]\displaystyle{ \Lambda_\text{LH} = \sum_{1,\ldots,p}(\lambda_{p}) = \operatorname{tr}(A) }[/math]
  • Roy's greatest root (also called Roy's largest root), [math]\displaystyle{ \Lambda_\text{Roy} = \max_p(\lambda_p) }[/math]
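
The following numerical sketch shows how each summary is obtained from the eigenvalues of [math]\displaystyle{ A }[/math]; the two SSCP matrices are arbitrary positive-definite stand-ins rather than the output of a real MANOVA fit.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(40, 3))
Y = rng.normal(size=(40, 3))
S_model = X.T @ X              # stand-in for the hypothesis (model) SSCP matrix
S_res = Y.T @ Y                # stand-in for the residual (error) SSCP matrix

A = S_model @ np.linalg.inv(S_res)
lam = np.linalg.eigvals(A).real    # the roots lambda_p; real and non-negative here

wilks  = np.prod(1.0 / (1.0 + lam))   # = det(S_res) / det(S_res + S_model)
pillai = np.sum(lam / (1.0 + lam))    # = tr(A (I + A)^{-1})
lawley = np.sum(lam)                  # = tr(A)
roy    = np.max(lam)                  # largest root
print(wilks, pillai, lawley, roy)
```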

Covariates

In statistics, a covariate represents a source of variation that has not been controlled in the experiment and is believed to affect the dependent variable.[8] The aim of such techniques as ANCOVA is to remove the effects of such uncontrolled variation, in order to increase statistical power and to ensure an accurate measurement of the true relationship between independent and dependent variables.[8]

An example is provided by the analysis of trend in sea-level by Woodworth (1987). Here the dependent variable (and variable of most interest) was the annual mean sea level at a given location for which a series of yearly values were available. The primary independent variable was "time". Use was made of a "covariate" consisting of yearly values of annual mean atmospheric pressure at sea level. The results showed that inclusion of the covariate allowed improved estimates of the trend against time to be obtained, compared to analyses which omitted the covariate.
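
In the same spirit, the sketch below compares a trend-only regression with one that also includes the pressure covariate, using statsmodels OLS; the data are simulated and the column names (year, pressure, sea_level) are assumptions for illustration, not Woodworth's actual series.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
year = np.arange(1900, 1980)
pressure = rng.normal(0, 1, year.size)               # annual mean pressure anomaly
sea_level = 1.5 * (year - 1900) - 8.0 * pressure + rng.normal(0, 10, year.size)
df_sea = pd.DataFrame({"year": year, "pressure": pressure, "sea_level": sea_level})

without_cov = smf.ols("sea_level ~ year", data=df_sea).fit()
with_cov = smf.ols("sea_level ~ year + pressure", data=df_sea).fit()
# Removing the variation explained by pressure leaves a more precise trend estimate.
print(without_cov.bse["year"], with_cov.bse["year"])  # standard errors of the trend
```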

See also

  • Discriminant function analysis
  • ANCOVA
  • MANOVA

References

  1. Statsoft Textbook, ANOVA/MANOVA.
  2. French, A. et al. (2010). Multivariate analysis of variance (MANOVA).
  3. Davis, K. (2003). Multiple analysis of variance (MANOVA) or multiple analysis of covariance (MANCOVA). Louisiana State University.
  4. Bors, D. A. University of Toronto at Scarborough.
  5. McLaughlin, M. (2009). University of Southern Carolina.
  6. Garson, G. David. "Multivariate GLM, MANOVA, and MANCOVA". http://faculty.chass.ncsu.edu/garson/PA765/manova.htm
  7. UCLA: Academic Technology Services, Statistical Consulting Group. "Stata Annotated Output – MANOVA". http://www.ats.ucla.edu/stat/stata/output/Stata_MANOVA.htm
  8. Kirk, Roger E. (1982). Experimental design (2nd ed.). Monterey, Calif.: Brooks/Cole. ISBN 0-8185-0286-X. https://archive.org/details/experimentaldesi0000kirk