Inverse Gaussian distribution

Notation | [math]\displaystyle{ \operatorname{IG}\left(\mu, \lambda\right) }[/math]
---|---
Parameters | [math]\displaystyle{ \mu \gt 0 }[/math], [math]\displaystyle{ \lambda \gt 0 }[/math]
Support | [math]\displaystyle{ x \in (0,\infty) }[/math]
PDF | [math]\displaystyle{ \sqrt\frac{\lambda}{2 \pi x^3} \exp\left[-\frac{\lambda (x-\mu)^2}{2 \mu^2 x}\right] }[/math]
CDF | [math]\displaystyle{ \Phi\left(\sqrt{\frac{\lambda}{x}} \left(\frac{x}{\mu}-1 \right)\right) + \exp\left(\frac{2 \lambda}{\mu}\right) \Phi\left(-\sqrt{\frac{\lambda}{x}}\left(\frac{x}{\mu}+1 \right)\right) }[/math] where [math]\displaystyle{ \Phi }[/math] is the standard normal (standard Gaussian) c.d.f.
Mean | [math]\displaystyle{ \operatorname{E}[X] = \mu }[/math]
Mode | [math]\displaystyle{ \mu\left[\left(1+\frac{9 \mu^2}{4 \lambda^2}\right)^\frac{1}{2}-\frac{3 \mu}{2 \lambda}\right] }[/math]
Variance | [math]\displaystyle{ \operatorname{Var}[X] = \frac{\mu^3}{\lambda} }[/math]
Skewness | [math]\displaystyle{ 3\left(\frac{\mu}{\lambda}\right)^{1/2} }[/math]
Excess kurtosis | [math]\displaystyle{ \frac{15 \mu}{\lambda} }[/math]
MGF | [math]\displaystyle{ \exp\left[\frac{\lambda}{\mu}\left(1-\sqrt{1-\frac{2\mu^2 t}{\lambda}}\right)\right] }[/math]
CF | [math]\displaystyle{ \exp\left[\frac{\lambda}{\mu}\left(1-\sqrt{1-\frac{2\mu^2 \mathrm{i} t}{\lambda}}\right)\right] }[/math]
In probability theory, the inverse Gaussian distribution (also known as the Wald distribution) is a two-parameter family of continuous probability distributions with support on (0,∞).
Its probability density function is given by
- [math]\displaystyle{ f(x;\mu,\lambda) = \sqrt\frac{\lambda}{2 \pi x^3} \exp\biggl(-\frac{\lambda (x-\mu)^2}{2 \mu^2 x}\biggr) }[/math]
for x > 0, where [math]\displaystyle{ \mu \gt 0 }[/math] is the mean and [math]\displaystyle{ \lambda \gt 0 }[/math] is the shape parameter.[1]
The inverse Gaussian distribution has several properties analogous to a Gaussian distribution. The name can be misleading: it is an "inverse" only in that, while the Gaussian describes a Brownian motion's level at a fixed time, the inverse Gaussian describes the distribution of the time a Brownian motion with positive drift takes to reach a fixed positive level.
Its cumulant generating function (logarithm of the characteristic function) is the inverse of the cumulant generating function of a Gaussian random variable.
To indicate that a random variable X is inverse Gaussian-distributed with mean μ and shape parameter λ we write [math]\displaystyle{ X \sim \operatorname{IG}(\mu, \lambda)\,\! }[/math].
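As a quick numerical sanity check of the density above (an illustration, not part of the original text; the parameter values are arbitrary), the following Python sketch evaluates f(x; μ, λ) on a fine grid and verifies numerically that it integrates to 1 with mean μ and variance μ³/λ:

```python
import numpy as np

def inv_gauss_pdf(x, mu, lam):
    """Density f(x; mu, lambda) of the inverse Gaussian distribution, x > 0."""
    return np.sqrt(lam / (2 * np.pi * x**3)) * np.exp(-lam * (x - mu)**2 / (2 * mu**2 * x))

mu, lam = 2.0, 3.0
x = np.linspace(1e-6, 200.0, 2_000_000)   # the density is negligible beyond x = 200 here
dx = x[1] - x[0]
p = inv_gauss_pdf(x, mu, lam)

# Riemann sums: normalization, mean, and variance
total = np.sum(p) * dx                    # should be close to 1
mean = np.sum(x * p) * dx                 # should be close to mu
var = np.sum((x - mean)**2 * p) * dx      # should be close to mu**3 / lam
```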
Properties
Single parameter form
The probability density function (pdf) of the inverse Gaussian distribution has a single parameter form given by
- [math]\displaystyle{ f(x;\mu,\mu^2) = \frac{\mu}{\sqrt{2 \pi x^3}} \exp\biggl(-\frac{(x-\mu)^2}{2x}\biggr). }[/math]
In this form, the mean and variance of the distribution are equal, [math]\displaystyle{ \mathbb{E}[X] = \text{Var}(X). }[/math]
Also, the cumulative distribution function (cdf) of the single parameter inverse Gaussian distribution is related to the standard normal distribution by
- [math]\displaystyle{ \begin{align} \Pr(X \lt x) &= \Phi(-z_1) + e^{2 \mu} \Phi(-z_2), \end{align} }[/math]
where [math]\displaystyle{ z_1 = \frac{\mu}{x^{1/2}} - x^{1/2} }[/math], [math]\displaystyle{ z_2 = \frac{\mu}{x^{1/2}} + x^{1/2}, }[/math] and [math]\displaystyle{ \Phi }[/math] is the cdf of the standard normal distribution. The variables [math]\displaystyle{ z_1 }[/math] and [math]\displaystyle{ z_2 }[/math] are related to each other by the identity [math]\displaystyle{ z_2^2 = z_1^2 + 4\mu. }[/math]
In the single parameter form, the MGF simplifies to
- [math]\displaystyle{ M(t) = \exp[\mu(1-\sqrt{1-2 t})]. }[/math]
An inverse Gaussian distribution in double parameter form [math]\displaystyle{ f(x;\mu,\lambda) }[/math] can be transformed into a single parameter form [math]\displaystyle{ f(y;\mu_0,\mu_0^2) }[/math] by the scaling [math]\displaystyle{ y = \frac{\lambda x}{\mu^2}, }[/math] where [math]\displaystyle{ \mu_0 = \lambda/\mu }[/math]; this follows from the scaling property below, since [math]\displaystyle{ \operatorname{IG}(\lambda/\mu, \lambda^2/\mu^2) }[/math] has shape parameter equal to the square of its mean.
The standard form of inverse Gaussian distribution is
- [math]\displaystyle{ f(x;1,1) = \frac{1}{\sqrt{2 \pi x^3}} \exp\biggl(-\frac{(x-1)^2}{2x}\biggr). }[/math]
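The Φ-based cdf relation above can be checked numerically. This Python sketch (illustrative values only) compares the closed-form expression against direct midpoint integration of the single-parameter density:

```python
import math

def phi(u):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))

def cdf_formula(x, mu):
    """Pr(X < x) for X ~ IG(mu, mu^2) via the standard-normal relation."""
    z1 = mu / math.sqrt(x) - math.sqrt(x)
    z2 = mu / math.sqrt(x) + math.sqrt(x)
    return phi(-z1) + math.exp(2.0 * mu) * phi(-z2)

def cdf_numeric(x, mu, n=100_000):
    """Midpoint-rule integral of the single-parameter pdf over (0, x)."""
    h = x / n
    acc = 0.0
    for i in range(n):
        t = (i + 0.5) * h
        acc += mu / math.sqrt(2.0 * math.pi * t**3) * math.exp(-(t - mu)**2 / (2.0 * t))
    return acc * h

mu = 1.5
errors = [abs(cdf_formula(x, mu) - cdf_numeric(x, mu)) for x in (0.5, 1.0, 2.0, 4.0)]
```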
Summation
If Xi has an [math]\displaystyle{ \operatorname{IG}(\mu_0 w_i, \lambda_0 w_i^2 )\,\! }[/math] distribution for i = 1, 2, ..., n and all Xi are independent, then
- [math]\displaystyle{ S=\sum_{i=1}^n X_i \sim \operatorname{IG}\left( \mu_0 \sum w_i, \lambda_0 \left(\sum w_i \right)^2 \right). }[/math]
Note that
- [math]\displaystyle{ \frac{\operatorname{Var}(X_i)}{\operatorname{E}(X_i)}= \frac{\mu_0^2 w_i^2 }{\lambda_0 w_i^2} =\frac{\mu_0^2}{\lambda_0} }[/math]
is constant for all i. This is a necessary condition for the summation; otherwise S would not be inverse Gaussian distributed.
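A Monte Carlo check of the summation rule, sketched in Python with NumPy (the weights and parameters are arbitrary illustrative choices; NumPy's `wald(mean, scale)` samples IG(mean, λ)):

```python
import numpy as np

rng = np.random.default_rng(42)
mu0, lam0 = 1.0, 2.0
w = [1.0, 2.0, 3.0]
n = 400_000

# X_i ~ IG(mu0 * w_i, lam0 * w_i**2), sampled independently
S = sum(rng.wald(mu0 * wi, lam0 * wi**2, size=n) for wi in w)

# Predicted: S ~ IG(mu0 * sum(w), lam0 * sum(w)**2) = IG(6, 72),
# so E[S] = 6 and Var[S] = 6**3 / 72 = 3
mean_S = S.mean()
var_S = S.var()
```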
Scaling
For any t > 0 it holds that
- [math]\displaystyle{ X \sim \operatorname{IG}(\mu,\lambda) \,\,\,\,\,\, \Rightarrow \,\,\,\,\,\, tX \sim \operatorname{IG}(t\mu,t\lambda). }[/math]
Exponential family
The inverse Gaussian distribution is a two-parameter exponential family with natural parameters [math]\displaystyle{ -\lambda/(2\mu^2) }[/math] and [math]\displaystyle{ -\lambda/2 }[/math], and natural statistics X and 1/X.
For [math]\displaystyle{ \lambda\gt 0 }[/math] fixed, it is also a single-parameter natural exponential family distribution[2] where the base distribution has density
- [math]\displaystyle{ h(x) = \sqrt{ \frac{\lambda}{2\pi x^3} } \exp\left( - \frac{\lambda}{2x} \right) \mathbb{1}_{[0,\infty)}(x)\,. }[/math]
Indeed, with [math]\displaystyle{ \theta\le 0 }[/math],
- [math]\displaystyle{ p(x;\theta) = \frac{\exp(\theta x) h(x)} {\int \exp(\theta y) h(y) dy} }[/math]
is a density over the reals. Evaluating the integral, we get
- [math]\displaystyle{ p(x;\theta) = \sqrt{ \frac{\lambda}{2\pi x^3} } \exp\left( - \frac{\lambda}{2x} +\theta x + \sqrt{-2\lambda\theta} \right) \mathbb{1}_{[0,\infty)}(x)\,, }[/math]

since the normalizing integral evaluates to [math]\displaystyle{ \exp\left(-\sqrt{-2\lambda\theta}\right) }[/math].
Substituting [math]\displaystyle{ \theta = -\lambda/(2\mu^2) }[/math] makes the above expression equal to [math]\displaystyle{ f(x;\mu,\lambda) }[/math].
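This tilting identity can be verified pointwise. The following Python sketch (illustrative parameter values) compares exp(θx)·h(x) divided by the normalizer exp(−√(−2λθ)) against f(x; μ, λ) at θ = −λ/(2μ²):

```python
import math

def h(x, lam):
    """Base density: inverse-Gaussian kernel with no linear term in x."""
    return math.sqrt(lam / (2.0 * math.pi * x**3)) * math.exp(-lam / (2.0 * x))

def tilted(x, theta, lam):
    """exp(theta*x) * h(x) / Z with normalizer Z = exp(-sqrt(-2*lam*theta))."""
    return h(x, lam) * math.exp(theta * x + math.sqrt(-2.0 * lam * theta))

def f(x, mu, lam):
    """Inverse Gaussian density f(x; mu, lambda)."""
    return math.sqrt(lam / (2.0 * math.pi * x**3)) * math.exp(-lam * (x - mu)**2 / (2.0 * mu**2 * x))

mu, lam = 2.0, 5.0
theta = -lam / (2.0 * mu**2)
diffs = [abs(tilted(x, theta, lam) - f(x, mu, lam)) for x in (0.5, 1.0, 2.0, 4.0)]
```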
Relationship with Brownian motion
Let the stochastic process Xt be given by
- [math]\displaystyle{ X_0 = 0 }[/math]
- [math]\displaystyle{ X_t = \nu t + \sigma W_t }[/math]
where Wt is a standard Brownian motion. That is, Xt is a Brownian motion with drift [math]\displaystyle{ \nu \gt 0 }[/math].
Then the first passage time for a fixed level [math]\displaystyle{ \alpha \gt 0 }[/math] by Xt is distributed according to an inverse-Gaussian:
- [math]\displaystyle{ T_\alpha = \inf\{ t \gt 0 \mid X_t=\alpha \} \sim \operatorname{IG} \left(\frac\alpha\nu, \left(\frac \alpha \sigma \right)^2 \right), }[/math]
i.e.,
- [math]\displaystyle{ P(T_{\alpha} \in (T, T + dT)) = \frac{\alpha}{\sigma\sqrt{2 \pi T^3}} \exp\biggl(-\frac{(\alpha-\nu T)^2}{2 \sigma^2 T}\biggr)dT }[/math]
(cf. Schrödinger[3] equation 19, Smoluchowski[4], equation 8, and Folks[5], equation 1).
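The first-passage claim can be illustrated by direct simulation. This Python sketch uses a simple Euler discretization (step size and path counts are arbitrary choices, and discrete monitoring slightly overestimates passage times) and compares the sample mean of T_α with the theoretical mean α/ν:

```python
import numpy as np

rng = np.random.default_rng(0)
nu, sigma, alpha = 1.0, 1.0, 1.0          # drift, volatility, barrier
dt, n_steps, n_paths = 0.005, 4000, 2000  # paths simulated up to t = 20

# Brownian motion with drift: X_t = nu*t + sigma*W_t, on a time grid
steps = nu * dt + sigma * rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
X = np.cumsum(steps, axis=1)

# First grid time at which each path reaches the barrier alpha
hit = X >= alpha
crossed = hit.any(axis=1)                 # virtually all paths cross by t = 20
T = (np.argmax(hit[crossed], axis=1) + 1) * dt

mean_T = T.mean()   # theory: T ~ IG(alpha/nu, (alpha/sigma)**2), so E[T] = 1 here
```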
When drift is zero
A common special case of the above arises when the Brownian motion has no drift. As the drift [math]\displaystyle{ \nu }[/math] tends to zero, the mean [math]\displaystyle{ \mu = \alpha/\nu }[/math] tends to infinity, and the first passage time for fixed level α has probability density function
- [math]\displaystyle{ f \left( x; 0, \left(\frac \alpha \sigma \right)^2 \right) = \frac \alpha {\sigma \sqrt{2 \pi x^3}} \exp\left(-\frac{\alpha^2 }{2 \sigma^2 x}\right) }[/math]
(see also Bachelier[6]:74[7]:39). This is a Lévy distribution with parameters [math]\displaystyle{ c=\left(\frac \alpha \sigma \right)^2 }[/math] and [math]\displaystyle{ \mu=0 }[/math].
Maximum likelihood
The model where
- [math]\displaystyle{ X_i \sim \operatorname{IG}(\mu,\lambda w_i), \,\,\,\,\,\, i=1,2,\ldots,n }[/math]
with all wi known, (μ, λ) unknown and all Xi independent has the following likelihood function
- [math]\displaystyle{ L(\mu, \lambda)= \left( \frac{\lambda}{2\pi} \right)^\frac n 2 \left( \prod^n_{i=1} \frac{w_i}{X_i^3} \right)^{\frac{1}{2}} \exp\left(\frac{\lambda}{\mu} \sum_{i=1}^n w_i -\frac{\lambda}{2\mu^2}\sum_{i=1}^n w_i X_i - \frac\lambda 2 \sum_{i=1}^n w_i \frac1{X_i} \right). }[/math]
Solving the likelihood equation yields the following maximum likelihood estimates
- [math]\displaystyle{ \widehat{\mu}= \frac{\sum_{i=1}^n w_i X_i}{\sum_{i=1}^n w_i}, \,\,\,\,\,\,\,\, \frac{1}{\widehat{\lambda}}= \frac{1}{n} \sum_{i=1}^n w_i \left( \frac{1}{X_i}-\frac{1}{\widehat{\mu}} \right). }[/math]
[math]\displaystyle{ \widehat{\mu} }[/math] and [math]\displaystyle{ \widehat{\lambda} }[/math] are independent and
- [math]\displaystyle{ \widehat{\mu} \sim \operatorname{IG} \left(\mu, \lambda \sum_{i=1}^n w_i \right), \qquad \frac{n}{\widehat{\lambda}} \sim \frac{1}{\lambda} \chi^2_{n-1}. }[/math]
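A small Python simulation (unit weights and arbitrary true parameters, chosen only for illustration) shows the estimators recovering μ and λ:

```python
import numpy as np

rng = np.random.default_rng(1)
mu_true, lam_true = 2.0, 8.0
n = 50_000
w = np.ones(n)                          # unit weights: X_i ~ IG(mu, lambda)

# NumPy's wald(mean, scale) samples IG(mean, lambda)
X = rng.wald(mu_true, lam_true, size=n)

# Maximum likelihood estimates from the weighted formulas
mu_hat = np.sum(w * X) / np.sum(w)
lam_hat = n / np.sum(w * (1.0 / X - 1.0 / mu_hat))
```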
Sampling from an inverse-Gaussian distribution
The following algorithm may be used.[8]
Generate a random variate from a normal distribution with mean 0 and standard deviation equal to 1
- [math]\displaystyle{ \displaystyle \nu \sim N(0,1). }[/math]
Square the value
- [math]\displaystyle{ \displaystyle y = \nu^2 }[/math]
and use the relation
- [math]\displaystyle{ x = \mu + \frac{\mu^2 y}{2\lambda} - \frac{\mu}{2\lambda}\sqrt{4\mu \lambda y + \mu^2 y^2}. }[/math]
Generate another random variate, this time sampled from a uniform distribution between 0 and 1
- [math]\displaystyle{ \displaystyle z \sim U(0,1). }[/math]
If [math]\displaystyle{ z \le \frac{\mu}{\mu+x} }[/math] then return [math]\displaystyle{ \displaystyle x }[/math] else return [math]\displaystyle{ \frac{\mu^2}{x}. }[/math]
Sample code in Java:

```java
public double inverseGaussian(double mu, double lambda) {
    Random rand = new Random();
    // Sample from a normal distribution with a mean of 0 and standard deviation of 1
    double v = rand.nextGaussian();
    double y = v * v;
    double x = mu + (mu * mu * y) / (2 * lambda)
             - (mu / (2 * lambda)) * Math.sqrt(4 * mu * lambda * y + mu * mu * y * y);
    // Sample from a uniform distribution between 0 and 1
    double test = rand.nextDouble();
    if (test <= mu / (mu + x)) {
        return x;
    }
    return (mu * mu) / x;
}
```
And to plot the Wald distribution in Python using matplotlib and NumPy:

```python
import matplotlib.pyplot as plt
import numpy as np

plt.hist(np.random.wald(3, 2, 100000), bins=200, density=True)
plt.show()
```
Related distributions
- If [math]\displaystyle{ X \sim \operatorname{IG}(\mu,\lambda) }[/math], then [math]\displaystyle{ k X \sim \operatorname{IG}(k \mu,k \lambda) }[/math] for any number [math]\displaystyle{ k \gt 0. }[/math][1]
- If [math]\displaystyle{ X_i \sim \operatorname{IG}(\mu,\lambda)\, }[/math] then [math]\displaystyle{ \sum_{i=1}^n X_i \sim \operatorname{IG}(n \mu,n^2 \lambda)\, }[/math]
- If [math]\displaystyle{ X_i \sim \operatorname{IG}(\mu,\lambda)\, }[/math] for [math]\displaystyle{ i=1,\ldots,n\, }[/math] then [math]\displaystyle{ \bar{X} \sim \operatorname{IG}(\mu,n \lambda)\, }[/math]
- If [math]\displaystyle{ X_i \sim \operatorname{IG}(\mu_i,2 \mu^2_i)\, }[/math] then [math]\displaystyle{ \sum_{i=1}^n X_i \sim \operatorname{IG}\left(\sum_{i=1}^n \mu_i, 2 \left( \sum_{i=1}^n \mu_i \right)^2\right)\, }[/math]
- If [math]\displaystyle{ X \sim \operatorname{IG}(\mu,\lambda) }[/math], then [math]\displaystyle{ \lambda (X-\mu)^2/(\mu^2 X) \sim \chi^2(1) }[/math].[9]
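The chi-square relation in the last item can be checked by simulation. This Python sketch (arbitrary parameter values) compares the empirical cdf of λ(X − μ)²/(μ²X) with the χ²(1) cdf, which equals erf(√(q/2)):

```python
import math
import numpy as np

rng = np.random.default_rng(7)
mu, lam = 1.5, 4.0
X = rng.wald(mu, lam, size=500_000)     # NumPy's wald(mean, scale) is IG(mean, lambda)

# The transform lambda*(X - mu)**2 / (mu**2 * X) should be chi-square with 1 df
Q = lam * (X - mu)**2 / (mu**2 * X)

# Compare empirical and theoretical cdfs at a few points
gaps = [abs(np.mean(Q <= q) - math.erf(math.sqrt(q / 2.0))) for q in (0.5, 1.0, 2.0, 4.0)]
```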
The convolution of an inverse Gaussian distribution (a Wald distribution) and an exponential (an ex-Wald distribution) is used as a model for response times in psychology,[10] with visual search as one example.[11]
History
This distribution appears to have been first derived in 1900 by Louis Bachelier[6][7] as the time a stock reaches a certain price for the first time. In 1915 it was used independently by Erwin Schrödinger[3] and Marian v. Smoluchowski[4] as the time to first passage of a Brownian motion. In the field of reproduction modeling it is known as the Hadwiger function, after Hugo Hadwiger who described it in 1940.[12] Abraham Wald re-derived this distribution in 1944[13] as the limiting form of a sample in a sequential probability ratio test. The name inverse Gaussian was proposed by Maurice Tweedie in 1945.[14] Tweedie investigated this distribution in 1956[15] and 1957[16][17] and established some of its statistical properties. The distribution was extensively reviewed by Folks and Chhikara in 1978.[5]
Numeric computation and software
Despite the simple formula for the probability density function, numerical probability calculations for the inverse Gaussian distribution nevertheless require special care to achieve full machine accuracy in floating point arithmetic for all parameter values.[18] Functions for the inverse Gaussian distribution are provided for the R programming language by several packages including rmutil,[19][20] SuppDists,[21] STAR,[22] invGauss,[23] LaplacesDemon,[24] and statmod.[25]
See also
- Generalized inverse Gaussian distribution
- Tweedie distributions—The inverse Gaussian distribution is a member of the family of Tweedie exponential dispersion models
- Stopping time
References
- ↑ Jump up to: 1.0 1.1 Chhikara, Raj S.; Folks, J. Leroy (1989), The Inverse Gaussian Distribution: Theory, Methodology and Applications, New York, NY, USA: Marcel Dekker, Inc, ISBN 0-8247-7997-5
- ↑ Seshadri, V. (1999), The Inverse Gaussian Distribution, Springer-Verlag, ISBN 978-0-387-98618-0
- ↑ Jump up to: 3.0 3.1 Schrödinger, Erwin (1915), "Zur Theorie der Fall- und Steigversuche an Teilchen mit Brownscher Bewegung" (in de), Physikalische Zeitschrift 16 (16): 289–295, https://babel.hathitrust.org/cgi/pt?id=njp.32101054770928;view=1up;seq=337
- ↑ Jump up to: 4.0 4.1 Smoluchowski, Marian (1915), "Notiz über die Berechnung der Brownschen Molekularbewegung bei der Ehrenhaft-Millikanschen Versuchsanordnung" (in de), Physikalische Zeitschrift 16 (17/18): 318–321, https://babel.hathitrust.org/cgi/pt?id=njp.32101054770928;view=1up;seq=366
- ↑ Jump up to: 5.0 5.1 Folks, J. Leroy; Chhikara, Raj S. (1978), "The Inverse Gaussian Distribution and Its Statistical Application—A Review", Journal of the Royal Statistical Society, Series B (Methodological) 40 (3): 263–275, doi:10.1111/j.2517-6161.1978.tb01039.x
- ↑ Jump up to: 6.0 6.1 Bachelier, Louis (1900), "Théorie de la spéculation" (in fr), Ann. Sci. Éc. Norm. Supér. Serie 3;17: 21–89, doi:10.24033/asens.476, http://archive.numdam.org/article/ASENS_1900_3_17__21_0.pdf
- ↑ Jump up to: 7.0 7.1 Bachelier, Louis (1900), "The Theory of Speculation", Ann. Sci. Éc. Norm. Supér. Serie 3;17: 21–89 (Engl. translation by David R. May, 2011), doi:10.24033/asens.476, https://drive.google.com/file/d/0B5LLDy7-d3SKNGI0M2E0NGItYzFlMS00NGU2LWE2ZDAtODc3MDY3MzdiNmY0/view
- ↑ Michael, John R.; Schucany, William R.; Haas, Roy W. (1976), "Generating Random Variates Using Transformations with Multiple Roots", The American Statistician 30 (2): 88–90, doi:10.1080/00031305.1976.10479147
- ↑ Shuster, J. (1968). "On the inverse Gaussian distribution function". Journal of the American Statistical Association 63 (4): 1514–1516. doi:10.1080/01621459.1968.10480942.
- ↑ Schwarz, Wolfgang (2001), "The ex-Wald distribution as a descriptive model of response times", Behavior Research Methods, Instruments, and Computers 33 (4): 457–469, doi:10.3758/bf03195403, PMID 11816448
- ↑ Palmer, E. M.; Horowitz, T. S.; Torralba, A.; Wolfe, J. M. (2011). "What are the shapes of response time distributions in visual search?". Journal of Experimental Psychology: Human Perception and Performance 37 (1): 58–71. doi:10.1037/a0020747. PMID 21090905.
- ↑ Hadwiger, H. (1940). "Eine analytische Reproduktionsfunktion für biologische Gesamtheiten". Skandinavisk Aktuarietidskrijt 7 (3–4): 101–113. doi:10.1080/03461238.1940.10404802.
- ↑ Wald, Abraham (1944), "On Cumulative Sums of Random Variables", Annals of Mathematical Statistics 15 (3): 283–296, doi:10.1214/aoms/1177731235
- ↑ Tweedie, M. C. K. (1945). "Inverse Statistical Variates". Nature 155 (3937): 453. doi:10.1038/155453a0. Bibcode: 1945Natur.155..453T.
- ↑ Tweedie, M. C. K. (1956). "Some Statistical Properties of Inverse Gaussian Distributions". Virginia Journal of Science. New Series 7 (3): 160–165.
- ↑ Tweedie, M. C. K. (1957). "Statistical Properties of Inverse Gaussian Distributions I". Annals of Mathematical Statistics 28 (2): 362–377. doi:10.1214/aoms/1177706964.
- ↑ Tweedie, M. C. K. (1957). "Statistical Properties of Inverse Gaussian Distributions II". Annals of Mathematical Statistics 28 (3): 696–705. doi:10.1214/aoms/1177706881. http://projecteuclid.org/euclid.aoms/1177706881.
- ↑ Giner, Göknur; Smyth, Gordon (August 2016). "statmod: Probability Calculations for the Inverse Gaussian Distribution". The R Journal 8 (1): 339–351. doi:10.32614/RJ-2016-024. https://journal.r-project.org/archive/2016-1.
- ↑ Lindsey, James (2013-09-09). "rmutil: Utilities for Nonlinear Regression and Repeated Measurements Models". http://www.commanster.eu/rcode.html.
- ↑ Swihart, Bruce; Lindsey, James (2019-03-04). "rmutil: Utilities for Nonlinear Regression and Repeated Measurements Models". https://cran.r-project.org/package=rmutil.
- ↑ Wheeler, Robert (2016-09-23). "SuppDists: Supplementary Distributions". https://cran.r-project.org/package=SuppDists.
- ↑ Pouzat, Christophe (2015-02-19). "STAR: Spike Train Analysis with R". https://cran.r-project.org/package=STAR.
- ↑ Gjessing, Hakon K. (2014-03-29). "Threshold regression that fits the (randomized drift) inverse Gaussian distribution to survival data". https://cran.r-project.org/package=invGauss.
- ↑ Hall, Byron; Hall, Martina; Statisticat, LLC; Brown, Eric; Hermanson, Richard; Charpentier, Emmanuel; Heck, Daniel; Laurent, Stephane et al. (2014-03-29). "LaplacesDemon: Complete Environment for Bayesian Inference". https://cran.r-project.org/package=LaplacesDemon.
- ↑ Giner, Göknur; Smyth, Gordon (2017-06-18). "statmod: Statistical Modeling". https://cran.r-project.org/web/packages/statmod/index.html.
Further reading
- Høyland, Arnljot; Rausand, Marvin (1994). System Reliability Theory. New York: Wiley. ISBN 978-0-471-59397-3.
- Seshadri, V. (1993). The Inverse Gaussian Distribution. Oxford University Press. ISBN 978-0-19-852243-0.
External links
- Inverse Gaussian Distribution on the Wolfram website.
Original source: https://en.wikipedia.org/wiki/Inverse Gaussian distribution.