Extensions of Fisher's method
In statistics, extensions of Fisher's method are a group of approaches that allow approximately valid statistical inferences to be made when the assumptions required for the direct application of Fisher's method do not hold. Fisher's method combines the information in the p-values from different statistical tests to form a single overall test; it requires that the individual test statistics (or, more immediately, their resulting p-values) be statistically independent.
Dependent statistics
A principal limitation of Fisher's method is that it is designed to combine independent p-values only, which makes it unreliable for combining dependent p-values. To overcome this limitation, a number of methods have been developed to extend its applicability.
Known covariance
Brown's method
Under Fisher's method, minus twice the sum of the natural logarithms of k independent p-values follows a χ2-distribution with 2k degrees of freedom:[1][2]
- [math]\displaystyle{ X = -2\sum_{i=1}^k \log_e(p_i) \sim \chi^2(2k) . }[/math]
When these p-values are not independent, Brown proposed approximating X by a scaled χ2-distribution, cχ2(k′), with k′ degrees of freedom.
The mean and variance of this scaled χ2 variable are:
- [math]\displaystyle{ \operatorname{E}[c\chi^2(k')] = ck' , }[/math]
- [math]\displaystyle{ \operatorname{Var}[c\chi^2(k')] = 2c^2k' . }[/math]
where [math]\displaystyle{ c=\operatorname{Var}(X)/(2\operatorname{E}[X]) }[/math] and [math]\displaystyle{ k'=2(\operatorname{E}[X])^2/\operatorname{Var}(X) }[/math]. This approximation matches the first two moments of X.
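The scaled χ2 approximation is straightforward to apply once the covariance matrix of the terms −2 ln(p_i) is known. A minimal Python sketch follows; the function name and the use of SciPy are illustrative choices rather than part of Brown's original presentation.

```python
import numpy as np
from scipy.stats import chi2

def browns_method(p_values, cov):
    """Combine dependent p-values using Brown's scaled chi-square approximation.

    `cov` is the k x k covariance matrix of the terms -2*ln(p_i), assumed known.
    For independent, uniformly distributed p-values it equals 4 times the identity.
    """
    p = np.asarray(p_values, dtype=float)
    k = p.size
    x = -2.0 * np.sum(np.log(p))         # Fisher's statistic X

    mean_x = 2.0 * k                     # E[X]: each -2*ln(p_i) has mean 2 under the null
    var_x = np.sum(cov)                  # Var[X]: sum of all entries of the covariance matrix

    c = var_x / (2.0 * mean_x)           # scale factor c
    k_prime = 2.0 * mean_x ** 2 / var_x  # effective degrees of freedom k'

    # X is approximated by c * chi^2(k'), so P(X > x) is approximated by P(chi^2(k') > x / c)
    p_combined = chi2.sf(x / c, df=k_prime)
    return x, p_combined
```

With independent p-values the covariance matrix is 4 times the identity, giving c = 1 and k′ = k, so the procedure reduces to Fisher's original test.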
Unknown covariance
Harmonic mean p-value
The harmonic mean p-value offers an alternative to Fisher's method for combining p-values when the dependency structure is unknown but the tests cannot be assumed to be independent.[3][4]
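The statistic itself is the weighted harmonic mean of the p-values, sketched below in Python with equal weights by default. The harmonic mean is only approximately calibrated as a p-value at small significance levels, and the exactly calibrated procedure of [4] is not reproduced here; the function name is an illustrative choice.

```python
import numpy as np

def harmonic_mean_p(p_values, weights=None):
    """Weighted harmonic mean of p-values (the HMP statistic).

    Note: the harmonic mean itself is only an approximately valid p-value
    for small values; see the reference for the exactly calibrated test.
    """
    p = np.asarray(p_values, dtype=float)
    if weights is None:
        w = np.full(p.size, 1.0 / p.size)    # equal weights summing to 1
    else:
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()                      # normalise the weights to sum to 1
    return w.sum() / np.sum(w / p)           # = 1 / sum(w_i / p_i), since the weights sum to 1
```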
Kost's method: t approximation
This method requires the test statistics' covariance structure to be known up to a scalar multiplicative constant.[2]
Cauchy combination test
This is conceptually similar to Fisher's method: it computes a sum of transformed p-values. Unlike Fisher's method, which uses a log transformation to obtain a test statistic with a chi-squared distribution under the null, the Cauchy combination test uses a tangent transformation to obtain a test statistic whose tail is asymptotic to that of a Cauchy distribution under the null. The test statistic is:
- [math]\displaystyle{ X = \sum_{i=1}^k \omega_i \tan[(0.5-p_i)\pi] , }[/math]
where [math]\displaystyle{ \omega_i }[/math] are non-negative weights with [math]\displaystyle{ \sum_{i=1}^k \omega_i = 1 }[/math]. Under the null hypothesis, the [math]\displaystyle{ p_i }[/math] are uniformly distributed on [0, 1], so the terms [math]\displaystyle{ \tan[(0.5-p_i)\pi] }[/math] follow a standard Cauchy distribution. Under some mild assumptions, but allowing for arbitrary dependency between the [math]\displaystyle{ p_i }[/math], the tail of the distribution of X is asymptotic to that of a standard Cauchy distribution. More precisely, letting W denote a standard Cauchy random variable:
- [math]\displaystyle{ \lim_{t \to \infty} \frac{P[X \gt t]}{P[W \gt t]} = 1. }[/math]
This leads to a combined hypothesis test, in which X is compared to the quantiles of the Cauchy distribution.[5]
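A minimal Python sketch of this procedure follows; the function name is an illustrative choice, and p-values numerically equal to 0 or 1 would need special handling that is omitted here.

```python
import numpy as np
from scipy.stats import cauchy

def cauchy_combination_test(p_values, weights=None):
    """Combine p-values with the Cauchy combination statistic.

    Returns the statistic X and the combined p-value obtained by comparing
    X to the tail of a standard Cauchy distribution.
    """
    p = np.asarray(p_values, dtype=float)
    if weights is None:
        w = np.full(p.size, 1.0 / p.size)    # equal non-negative weights summing to 1
    else:
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()                      # normalise the weights to sum to 1
    x = np.sum(w * np.tan((0.5 - p) * np.pi))
    p_combined = cauchy.sf(x)                # P[W > x] for a standard Cauchy W
    return x, p_combined
```

Because the tail probability has a closed form, the combined p-value is obtained analytically, without estimating the dependency structure of the individual tests.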
References
- ↑ Brown, M. (1975). "A method for combining non-independent, one-sided tests of significance". Biometrics 31 (4): 987–992. doi:10.2307/2529826.
- ↑ 2.0 2.1 Kost, J.; McDermott, M. (2002). "Combining dependent P-values". Statistics & Probability Letters 60 (2): 183–190. doi:10.1016/S0167-7152(02)00310-3.
- ↑ "Significance tests in parallel and in series". Journal of the American Statistical Association 53 (284): 799–813. 1958. doi:10.1080/01621459.1958.10501480.
- ↑ "The harmonic mean p-value for combining dependent tests". Proceedings of the National Academy of Sciences USA 116 (4): 1195–1200. 2019. doi:10.1073/pnas.1814092116. PMID 30610179.
- ↑ "Cauchy combination test: a powerful test with analytic p-value calculation under arbitrary dependency structures". Journal of the American Statistical Association 115 (529): 393-402. 2020. doi:10.1080/01621459.2018.1554485.