Mixed Poisson distribution
Notation | [math]\displaystyle{ \operatorname{Pois}(\lambda) \, \underset{\lambda}{\wedge} \, \pi(\lambda) }[/math]
---|---
Parameters | [math]\displaystyle{ \lambda\in (0, \infty) }[/math]
Support | [math]\displaystyle{ k \in \mathbb{N}_0 }[/math]
pmf | [math]\displaystyle{ \int\limits_0^\infty \frac{\lambda^k}{k!}e^{-\lambda} \,\,\pi(\lambda)\,\mathrm d\lambda }[/math]
Mean | [math]\displaystyle{ \int\limits_0^\infty \lambda \,\,\pi(\lambda)\,\mathrm d\lambda }[/math]
Variance | [math]\displaystyle{ \int\limits_0^\infty \bigl(\lambda+(\lambda-\mu_\pi)^2\bigr) \,\,\pi(\lambda) \,\mathrm d\lambda }[/math]
Skewness | [math]\displaystyle{ \Bigl(\mu_\pi+\sigma_\pi^2\Bigr)^{-3/2} \,\Biggl[\int\limits_0^\infty\bigl[(\lambda-\mu_\pi)^3 + 3(\lambda-\mu_\pi)^2\bigr]\, \pi(\lambda) \,\mathrm d\lambda+\mu_\pi\Biggr] }[/math]
MGF | [math]\displaystyle{ M_\pi(e^t-1) }[/math], with [math]\displaystyle{ M_\pi }[/math] the MGF of π
CF | [math]\displaystyle{ M_\pi(e^{it}-1) }[/math]
PGF | [math]\displaystyle{ M_\pi(z-1) }[/math]
In probability theory, a mixed Poisson distribution is a univariate discrete probability distribution. It arises by assuming that the conditional distribution of a random variable, given the value of the rate parameter, is a Poisson distribution, while the rate parameter itself is treated as a random variable. It is therefore a special case of a compound probability distribution. Mixed Poisson distributions appear in actuarial mathematics as a general approach to modelling the distribution of the number of claims, and they are also examined as epidemiological models.[1] The mixed Poisson distribution should not be confused with the compound Poisson distribution or the compound Poisson process.[2]
Definition
A random variable X follows a mixed Poisson distribution with mixing density π(λ) if it has the probability distribution[3]
- [math]\displaystyle{ \operatorname{P}(X=k) = \int_0^\infty \frac{\lambda^k}{k!}e^{-\lambda} \,\,\pi(\lambda)\,\mathrm d\lambda. }[/math]
If we denote the probabilities of the Poisson distribution by qλ(k), then
- [math]\displaystyle{ \operatorname{P}(X=k) = \int_0^\infty q_\lambda(k) \,\,\pi(\lambda)\,\mathrm d\lambda. }[/math]
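For a general mixing density the integral rarely has a closed form, but it can be evaluated numerically. The following is a minimal sketch (assuming NumPy and SciPy are available; the log-normal mixing density and its parameters are arbitrary illustrative choices) that computes P(X = k) by quadrature:

```python
import numpy as np
from scipy import integrate, stats

def mixed_poisson_pmf(k, mixing_pdf):
    """P(X = k) = integral over lambda of Poisson(k; lambda) * pi(lambda)."""
    integrand = lambda lam: stats.poisson.pmf(k, lam) * mixing_pdf(lam)
    value, _ = integrate.quad(integrand, 0, np.inf)
    return value

# Example mixing density: log-normal (parameters chosen arbitrarily).
pi = stats.lognorm(s=0.5, scale=np.exp(1.0)).pdf

pmf = np.array([mixed_poisson_pmf(k, pi) for k in range(60)])
print(pmf[:5])    # first few probabilities
print(pmf.sum())  # should be close to 1
```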
Properties
- The variance is always greater than the expected value; this property is called overdispersion. It stands in contrast to the Poisson distribution, where mean and variance are equal (see the simulation sketch after this list).
- In practice, the mixing densities π(λ) used are almost exclusively those of the gamma, log-normal and inverse Gaussian distributions. Choosing the gamma density yields the negative binomial distribution, which is why this case is also called the Poisson-gamma distribution.
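A two-stage simulation illustrates the overdispersion property: first draw a rate from the mixing density, then draw a Poisson count with that rate. A minimal sketch, assuming NumPy and an arbitrarily parameterized gamma mixing density:

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: draw random rates from the mixing density
# (here a gamma density with shape 3 and scale 2, chosen arbitrarily).
rates = rng.gamma(shape=3.0, scale=2.0, size=100_000)

# Step 2: draw one Poisson count per rate.
counts = rng.poisson(rates)

print(counts.mean())  # ~ mean of the mixing density: 3 * 2 = 6
print(counts.var())   # ~ mean + variance of the mixing density: 6 + 12 = 18 > 6
```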
In the following let [math]\displaystyle{ \mu_\pi=\int\limits_0^\infty \lambda \,\,\pi(\lambda) \, d\lambda\, }[/math] be the expected value of the density [math]\displaystyle{ \pi(\lambda)\, }[/math] and [math]\displaystyle{ \sigma_\pi^2 = \int\limits_0^\infty (\lambda-\mu_\pi)^2 \,\,\pi(\lambda) \, d\lambda\, }[/math] be the variance of the density.
Expected value
The expected value of the mixed Poisson distribution is
- [math]\displaystyle{ \operatorname{E}(X) = \mu_\pi. }[/math]
Variance
The variance of the mixed Poisson distribution is
- [math]\displaystyle{ \operatorname{Var}(X) = \mu_\pi+\sigma_\pi^2. }[/math]
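Both formulas follow by conditioning on the random rate [math]\displaystyle{ \Lambda }[/math] (the random variable with density [math]\displaystyle{ \pi }[/math]): by the laws of total expectation and total variance,

- [math]\displaystyle{ \operatorname{E}(X) = \operatorname{E}\bigl[\operatorname{E}(X\mid\Lambda)\bigr] = \operatorname{E}(\Lambda) = \mu_\pi, \qquad \operatorname{Var}(X) = \operatorname{E}\bigl[\operatorname{Var}(X\mid\Lambda)\bigr] + \operatorname{Var}\bigl[\operatorname{E}(X\mid\Lambda)\bigr] = \mu_\pi + \sigma_\pi^2. }[/math]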
Skewness
The skewness can be represented as
- [math]\displaystyle{ \operatorname{v}(X) = \Bigl(\mu_\pi+\sigma_\pi^2\Bigr)^{-3/2} \,\Biggl[\int_0^\infty\bigl[(\lambda-\mu_\pi)^3 + 3(\lambda-\mu_\pi)^2\bigr]\,\pi(\lambda)\,\mathrm d\lambda+\mu_\pi\Biggr]. }[/math]
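The bracketed expression is the third central moment of X: conditioning on the random rate [math]\displaystyle{ \Lambda }[/math] (law of total cumulance) gives

- [math]\displaystyle{ \operatorname{E}\bigl[(X-\mu_\pi)^3\bigr] = \operatorname{E}\bigl[(\Lambda-\mu_\pi)^3\bigr] + 3\sigma_\pi^2 + \mu_\pi, }[/math]

where the term [math]\displaystyle{ 3\sigma_\pi^2 }[/math] corresponds to [math]\displaystyle{ \int_0^\infty 3(\lambda-\mu_\pi)^2\,\pi(\lambda)\,\mathrm d\lambda }[/math].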
Characteristic function
The characteristic function has the form
- [math]\displaystyle{ \varphi_X(s) = M_\pi(e^{is}-1).\, }[/math]
Here [math]\displaystyle{ M_\pi }[/math] denotes the moment-generating function of the density [math]\displaystyle{ \pi }[/math].
Probability generating function
For the probability generating function, one obtains[3]
- [math]\displaystyle{ m_X(s) = M_\pi(s-1).\, }[/math]
Moment-generating function
The moment-generating function of the mixed Poisson distribution is
- [math]\displaystyle{ M_X(s) = M_\pi(e^s-1).\, }[/math]
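All three transform identities follow from the same conditioning argument. For the moment-generating function, using the Poisson identity [math]\displaystyle{ \operatorname{E}\bigl(e^{sX}\mid\Lambda=\lambda\bigr)=e^{\lambda(e^s-1)} }[/math],

- [math]\displaystyle{ M_X(s) = \operatorname{E}\bigl(e^{sX}\bigr) = \operatorname{E}\bigl[\operatorname{E}\bigl(e^{sX}\mid\Lambda\bigr)\bigr] = \operatorname{E}\bigl(e^{\Lambda(e^s-1)}\bigr) = M_\pi(e^s-1); }[/math]

replacing [math]\displaystyle{ e^s }[/math] by [math]\displaystyle{ e^{is} }[/math] or by [math]\displaystyle{ s }[/math] gives the characteristic function and the probability generating function, respectively.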
Examples
Theorem — Compounding a Poisson distribution with a rate parameter distributed according to a gamma distribution yields a negative binomial distribution.[3]

Proof. Let [math]\displaystyle{ \pi(\lambda)=\frac{(\frac{p}{1-p})^r}{\Gamma(r)} \lambda^{r-1} e^{-\frac{p}{1-p}\lambda} }[/math] be the density of a [math]\displaystyle{ \operatorname{\Gamma}\left(r,\frac{p}{1-p}\right) }[/math] distributed random variable. Then, substituting [math]\displaystyle{ u = \lambda/(1-p) }[/math] in the third step,

- [math]\displaystyle{ \begin{align} \operatorname{P}(X=k)&= \frac{1}{k!} \int_0^\infty \lambda^k e^{-\lambda} \frac{(\frac{p}{1-p})^r}{\Gamma(r)} \lambda^{r-1} e^{-\frac{p}{1-p}\lambda} \,\mathrm d \lambda \\ & = \frac{p^r(1-p)^{-r}}{\Gamma(r) k!} \int_0^\infty \lambda^{k+r-1} e^{-\frac{\lambda}{1-p}} \,\mathrm d \lambda \\ & = \frac{p^r(1-p)^{-r}}{\Gamma(r) k!} (1-p)^{k+r} \underbrace{\int_0^\infty u^{k+r-1} e^{-u} \,\mathrm d u}_{= \Gamma(r+k)} \\ & = \frac{\Gamma(r+k)}{\Gamma(r) k!} (1-p)^k p^r. \end{align} }[/math]

Therefore [math]\displaystyle{ X\sim\operatorname{NegB}(r,p) }[/math].
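This identity can be checked numerically. A minimal sketch (assuming SciPy is available; the values of r and p are arbitrary illustrative choices) compares the quadrature-evaluated mixed Poisson pmf with the negative binomial pmf:

```python
import numpy as np
from scipy import integrate, stats

# Parameters chosen arbitrarily for illustration.
r, p = 2.5, 0.4

def mixed_pmf(k):
    """Mixed Poisson pmf with Gamma(r, rate = p/(1-p)) mixing density, by quadrature."""
    gamma_pdf = stats.gamma(a=r, scale=(1 - p) / p).pdf  # scale = 1 / rate
    integrand = lambda lam: stats.poisson.pmf(k, lam) * gamma_pdf(lam)
    return integrate.quad(integrand, 0, np.inf)[0]

# scipy.stats.nbinom.pmf(k, r, p) = Gamma(r+k) / (Gamma(r) k!) * p^r (1-p)^k,
# i.e. the NegB(r, p) pmf derived above; the two columns should agree.
for k in range(6):
    print(k, mixed_pmf(k), stats.nbinom.pmf(k, r, p))
```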
Theorem — Compounding a Poisson distribution with a rate parameter distributed according to an exponential distribution yields a geometric distribution.

Proof. Let [math]\displaystyle{ \pi(\lambda)=\frac1\beta e^{-\frac \lambda\beta} }[/math] be the density of an [math]\displaystyle{ \mathrm{Exp}\left(\frac1\beta\right) }[/math] distributed random variable. Using integration by parts [math]\displaystyle{ k }[/math] times yields:

- [math]\displaystyle{ \begin{align} \operatorname{P}(X=k)&=\frac{1}{k!}\int\limits_0^\infty \lambda^k e^{-\lambda} \frac1\beta e^{-\frac \lambda\beta} \, \mathrm d\lambda\\ &=\frac{1}{k!\beta}\int\limits_0^\infty \lambda^k e^{-\lambda\left(\frac{1+\beta}{\beta}\right)}\,\mathrm d \lambda\\ &=\frac{1}{k!\beta}\cdot k!\left(\frac{\beta}{1+\beta}\right)^k\int\limits_0^\infty e^{-\lambda\left(\frac{1+\beta}{\beta}\right)}\,\mathrm d \lambda\\ &=\left(\frac{\beta}{1+\beta}\right)^k\left(\frac{1}{1+\beta}\right). \end{align} }[/math]

Therefore [math]\displaystyle{ X\sim\operatorname{Geo}\left(\frac{1}{1+\beta}\right) }[/math].
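The same kind of numerical check works here. A minimal sketch (assuming SciPy; the value of β is an arbitrary illustrative choice):

```python
import numpy as np
from scipy import integrate, stats

beta = 1.7  # mean of the exponential mixing density (chosen arbitrarily)

def mixed_pmf(k):
    """Mixed Poisson pmf with Exp(1/beta) mixing density, by quadrature."""
    exp_pdf = stats.expon(scale=beta).pdf
    integrand = lambda lam: stats.poisson.pmf(k, lam) * exp_pdf(lam)
    return integrate.quad(integrand, 0, np.inf)[0]

for k in range(6):
    geometric = (beta / (1 + beta)) ** k / (1 + beta)  # Geo(1/(1+beta)) on k = 0, 1, 2, ...
    print(k, mixed_pmf(k), geometric)                  # the two columns should agree
```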
Table of mixed Poisson distributions
Mixing distribution | Mixed Poisson distribution[4]
---|---
gamma | negative binomial |
exponential | geometric |
inverse Gaussian | Sichel |
Poisson | Neyman |
generalized inverse Gaussian | Poisson-generalized inverse Gaussian |
generalized gamma | Poisson-generalized gamma |
generalized Pareto | Poisson-generalized Pareto |
inverse-gamma | Poisson-inverse gamma |
log-normal | Poisson-log-normal |
Lomax | Poisson–Lomax |
Pareto | Poisson–Pareto |
Pearson’s family of distributions | Poisson–Pearson family |
truncated normal | Poisson-truncated normal |
uniform | Poisson-uniform |
shifted gamma | Delaporte |
beta with specific parameter values | Yule |
Literature
- Jan Grandell: Mixed Poisson Processes. Chapman & Hall, London 1997, ISBN 0-412-78700-8.
- Tom Britton: Stochastic Epidemic Models with Inference. Springer, 2019, doi:10.1007/978-3-030-30900-8
References
1. Willmot, Gordon E.; Lin, X. Sheldon (2001). "Mixed Poisson distributions". Lundberg Approximations for Compound Distributions with Insurance Applications. Lecture Notes in Statistics 156. New York, NY: Springer New York. pp. 37–49. doi:10.1007/978-1-4613-0111-0_3. ISBN 978-0-387-95135-5. http://link.springer.com/10.1007/978-1-4613-0111-0_3. Retrieved 2022-07-08.
2. Willmot, Gord (1986). "Mixed Compound Poisson Distributions". ASTIN Bulletin 16 (S1): S59–S79. doi:10.1017/S051503610001165X. ISSN 0515-0361.
3. Willmot, Gord (2014-08-29). "Mixed Compound Poisson Distributions". Astin Bulletin 16: 5–7. doi:10.1017/S051503610001165X.
4. Karlis, Dimitris; Xekalaki, Evdokia (2005). "Mixed Poisson Distributions". International Statistical Review 73 (1): 35–58. doi:10.1111/j.1751-5823.2005.tb00250.x. ISSN 0306-7734. https://www.jstor.org/stable/25472639.
Original source: https://en.wikipedia.org/wiki/Mixed_Poisson_distribution