Stability postulate

In probability theory, to obtain a nondegenerate limiting distribution for the extreme (greatest) value of a sample, it is necessary to "reduce" the actual greatest value by applying a linear transformation with coefficients that depend on the sample size.

If [math]\displaystyle{ X_1, X_2, \dots , X_n }[/math] are independent random variables with common probability density function

[math]\displaystyle{ p_{X_j}(x)=f(x), }[/math]

then the cumulative distribution function of [math]\displaystyle{ X'_n=\max\{\,X_1,\ldots,X_n\,\} }[/math] is

[math]\displaystyle{ F_{X'_n}(x)={[F(x)]}^n , }[/math]

where [math]\displaystyle{ F }[/math] is the cumulative distribution function corresponding to the density [math]\displaystyle{ f }[/math].
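
Since the [math]\displaystyle{ X_j }[/math] are independent, this identity follows in one line:

[math]\displaystyle{ F_{X'_n}(x) = P(X_1 \le x, \ldots, X_n \le x) = \prod_{j=1}^{n} P(X_j \le x) = {[F(x)]}^n . }[/math]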

If there is a limiting distribution of interest, the stability postulate states that it must be the limit of the distributions of a sequence of "reduced" (linearly transformed) values, such as [math]\displaystyle{ (a_n X'_n + b_n) }[/math], where [math]\displaystyle{ a_n, b_n }[/math] may depend on n but not on x.

To distinguish the limiting cumulative distribution function of the "reduced" greatest value from F(x), we will denote it by G(x). It follows that G(x) must satisfy the functional equation

[math]\displaystyle{ {[G(x)]}^n = G(a_n x + b_n) }[/math]

for every positive integer [math]\displaystyle{ n }[/math].
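
As a concrete check, the standard Gumbel distribution [math]\displaystyle{ G(x) = e^{-e^{-x}} }[/math] satisfies this equation with [math]\displaystyle{ a_n = 1 }[/math] and [math]\displaystyle{ b_n = -\log n }[/math]:

[math]\displaystyle{ {\left[ e^{-e^{-x}} \right]}^n = e^{-n e^{-x}} = e^{-e^{-(x - \log n)}} = G(x - \log n) . }[/math]

These coefficients agree with those listed below for the extreme value distribution, with [math]\displaystyle{ \mu = 0 }[/math] and [math]\displaystyle{ \sigma = 1 }[/math], since [math]\displaystyle{ \sigma \log(\tfrac{1}{n}) = -\log n }[/math].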

This equation was obtained by Maurice René Fréchet and also by Ronald Fisher.

Boris Vladimirovich Gnedenko has shown that there are no distributions satisfying the stability postulate other than the following three:

  • Gumbel distribution for the minimum stability postulate
    • If [math]\displaystyle{ X_i \sim \textrm{Gumbel}(\mu,\beta) }[/math] and [math]\displaystyle{ Y=\min\{\,X_1,\ldots,X_n\,\} }[/math], then the reduced value [math]\displaystyle{ a_n Y+b_n \sim \textrm{Gumbel}(\mu,\beta) }[/math], where [math]\displaystyle{ a_n=1 }[/math] and [math]\displaystyle{ b_n= \beta \log(n) }[/math]
    • In other words, [math]\displaystyle{ Y \sim \textrm{Gumbel}(\mu - \beta \log(n),\beta) }[/math]
  • Extreme value distribution for the maximum stability postulate
    • If [math]\displaystyle{ X_i \sim \textrm{EV}(\mu,\sigma) }[/math] and [math]\displaystyle{ Y=\max\{\,X_1,\ldots,X_n\,\} }[/math], then the reduced value [math]\displaystyle{ a_n Y+b_n \sim \textrm{EV}(\mu,\sigma) }[/math], where [math]\displaystyle{ a_n=1 }[/math] and [math]\displaystyle{ b_n= \sigma \log(\tfrac{1}{n}) }[/math]
    • In other words, [math]\displaystyle{ Y \sim \textrm{EV}(\mu - \sigma \log(\tfrac{1}{n}),\sigma) }[/math]
  • Fréchet distribution for the maximum stability postulate
    • If [math]\displaystyle{ X_i \sim \textrm{Frechet}(\alpha,s,m) }[/math] and [math]\displaystyle{ Y=\max\{\,X_1,\ldots,X_n\,\} }[/math], then the reduced value [math]\displaystyle{ a_n Y+b_n \sim \textrm{Frechet}(\alpha,s,m) }[/math], where [math]\displaystyle{ a_n=n^{-\tfrac{1}{\alpha}} }[/math] and [math]\displaystyle{ b_n= m \left( 1- n^{-\tfrac{1}{\alpha}}\right) }[/math] (see the simulation sketch after this list)
    • In other words, [math]\displaystyle{ Y \sim \textrm{Frechet}(\alpha,n^{\tfrac{1}{\alpha}} s,m) }[/math]
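
The following is a minimal simulation sketch of the Fréchet case, assuming the parameterization in which Frechet(α, s, m) has cumulative distribution function exp(−((x − m)/s)^−α) for x > m. It represents this distribution with SciPy's invweibull (the standard Fréchet with shape α) shifted by m and scaled by s; the particular values of α, s, m and the block size n are arbitrary choices for illustration, not part of the original statement. The sketch draws block maxima, applies the reduction [math]\displaystyle{ a_n Y + b_n }[/math], and checks that the reduced maxima again look like draws from Frechet(α, s, m).

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

# Assumed parameterization: Frechet(alpha, s, m) has CDF exp(-((x - m)/s)**(-alpha)) for x > m.
# scipy.stats.invweibull(alpha) is the standard Frechet with shape alpha; loc and scale supply m and s.
alpha, s, m = 2.5, 1.0, 0.5   # arbitrary illustrative parameters
n = 20                        # number of variables per block maximum
n_blocks = 100_000            # number of simulated maxima
rng = np.random.default_rng(0)

frechet = stats.invweibull(alpha, loc=m, scale=s)
samples = frechet.rvs(size=(n_blocks, n), random_state=rng)
Y = samples.max(axis=1)       # block maxima

# Reduction coefficients a_n, b_n for the Frechet case of the stability postulate.
a_n = n ** (-1.0 / alpha)
b_n = m * (1.0 - n ** (-1.0 / alpha))
reduced = a_n * Y + b_n       # should again be distributed as Frechet(alpha, s, m)

# Kolmogorov-Smirnov tests: neither should reject at conventional significance levels.
print(stats.kstest(reduced, frechet.cdf))
print(stats.kstest(Y, stats.invweibull(alpha, loc=m, scale=s * n ** (1.0 / alpha)).cdf))
</syntaxhighlight>

The first test compares the reduced maxima with the base distribution; the second checks the equivalent statement that the unreduced maxima follow Frechet(α, n^(1/α) s, m).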