# Skellam distribution

*(Figure: examples of the probability mass function for the Skellam distribution; the horizontal axis is the index k. The function is defined only at integer values of k, and the connecting lines do not indicate continuity.)*

| | Skellam distribution |
| --- | --- |
| Parameters | ${\displaystyle \mu _{1}\geq 0,~~\mu _{2}\geq 0}$ |
| Support | ${\displaystyle \{\ldots ,-2,-1,0,1,2,\ldots \}}$ |
| PMF | ${\displaystyle e^{-(\mu _{1}\!+\!\mu _{2})}\left({\frac {\mu _{1}}{\mu _{2}}}\right)^{k/2}\!\!I_{k}(2{\sqrt {\mu _{1}\mu _{2}}})}$ |
| Mean | ${\displaystyle \mu _{1}-\mu _{2}\,}$ |
| Median | N/A |
| Variance | ${\displaystyle \mu _{1}+\mu _{2}\,}$ |
| Skewness | ${\displaystyle {\frac {\mu _{1}-\mu _{2}}{(\mu _{1}+\mu _{2})^{3/2}}}}$ |
| Excess kurtosis | ${\displaystyle 1/(\mu _{1}+\mu _{2})\,}$ |
| MGF | ${\displaystyle e^{-(\mu _{1}+\mu _{2})+\mu _{1}e^{t}+\mu _{2}e^{-t}}}$ |
| CF | ${\displaystyle e^{-(\mu _{1}+\mu _{2})+\mu _{1}e^{it}+\mu _{2}e^{-it}}}$ |

The Skellam distribution is the discrete probability distribution of the difference ${\displaystyle K_{1}-K_{2}}$ of two correlated or uncorrelated random variables ${\displaystyle K_{1}}$ and ${\displaystyle K_{2}}$ having Poisson distributions with different expected values ${\displaystyle \mu _{1}}$ and ${\displaystyle \mu _{2}}$. It is useful in describing the statistics of the difference of two images with simple photon noise, as well as describing the point spread distribution in certain sports where all scored points are equal, such as baseball, hockey and soccer.

Only the case of uncorrelated variables will be considered in this article. See Karlis & Ntzoufras, 2003 for the use of the Skellam distribution to describe the difference of correlated Poisson-distributed variables.

Note that the probability mass function of a Poisson distribution with mean μ is given by

${\displaystyle f(k;\mu )={\mu ^{k} \over k!}e^{-\mu }\,}$

The Skellam probability mass function is the cross-correlation of two Poisson distributions: (Skellam, 1946)

${\displaystyle f(k;\mu _{1},\mu _{2})=\sum _{n=-\infty }^{\infty }\!f(k\!+\!n;\mu _{1})f(n;\mu _{2})}$
${\displaystyle =e^{-(\mu _{1}+\mu _{2})}\sum _{n=-\infty }^{\infty }{{\mu _{1}^{k+n}\mu _{2}^{n}} \over {n!(k+n)!}}}$
${\displaystyle =e^{-(\mu _{1}+\mu _{2})}\left({\mu _{1} \over \mu _{2}}\right)^{k/2}I_{k}(2{\sqrt {\mu _{1}\mu _{2}}})}$

where ${\displaystyle I_{k}(z)}$ is the modified Bessel function of the first kind. These formulas assume that any summation term containing the factorial of a negative integer is zero. The special case ${\displaystyle \mu _{1}=\mu _{2}(=\mu )}$ is given by (Irwin, 1937):

${\displaystyle f\left(k;\mu ,\mu \right)=e^{-2\mu }I_{k}(2\mu )}$

Note also that, using the limiting form of the Bessel function for small arguments, ${\displaystyle I_{k}(z)\sim (z/2)^{|k|}/|k|!}$, we can recover the Poisson distribution as a special case of the Skellam distribution for ${\displaystyle \mu _{2}=0}$: the probability mass reduces to ${\displaystyle e^{-\mu _{1}}\mu _{1}^{k}/k!}$ for ${\displaystyle k\geq 0}$ and vanishes for negative ${\displaystyle k}$.
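As a quick numerical check, the closed-form probability mass function above can be evaluated with SciPy's modified Bessel function and compared against `scipy.stats.skellam`. This is a sketch; the rates `mu1`, `mu2` are arbitrary example values, not from the text:

```python
import numpy as np
from scipy.special import iv          # modified Bessel function of the first kind, I_v(z)
from scipy.stats import skellam

def skellam_pmf(k, mu1, mu2):
    """Closed-form Skellam PMF: exp(-(mu1+mu2)) * (mu1/mu2)^(k/2) * I_k(2*sqrt(mu1*mu2))."""
    return np.exp(-(mu1 + mu2)) * (mu1 / mu2) ** (k / 2.0) * iv(k, 2.0 * np.sqrt(mu1 * mu2))

mu1, mu2 = 3.0, 1.5                   # arbitrary example rates
k = np.arange(-30, 31)

# Agrees with SciPy's implementation, and sums to 1 over the (truncated) support.
assert np.allclose(skellam_pmf(k, mu1, mu2), skellam.pmf(k, mu1, mu2))
assert np.isclose(skellam_pmf(k, mu1, mu2).sum(), 1.0)
```

Note that `iv` accepts negative integer orders directly, which matters here because the Skellam support includes negative k (for integer order, ${\displaystyle I_{-k}(z)=I_{k}(z)}$).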

## Properties

The Skellam probability mass function is normalized:

${\displaystyle \sum _{k=-\infty }^{\infty }f(k;\mu _{1},\mu _{2})=1.}$

We know that the generating function for a Poisson distribution is:

${\displaystyle G\left(t;\mu \right)=e^{\mu (t-1)}.}$

It follows that the generating function ${\displaystyle G(t;\mu _{1},\mu _{2})}$ for a Skellam probability mass function is:

${\displaystyle G(t;\mu _{1},\mu _{2})=\sum _{k=-\infty }^{\infty }f(k;\mu _{1},\mu _{2})t^{k}}$
${\displaystyle =G\left(t;\mu _{1}\right)G\left(1/t;\mu _{2}\right)\,}$
${\displaystyle =e^{-(\mu _{1}+\mu _{2})+\mu _{1}t+\mu _{2}/t}.}$

Notice that the form of the generating function implies that the sum or difference of any number of independent Skellam-distributed variables is again Skellam-distributed.

It is sometimes claimed that any linear combination of two Skellam-distributed variables is again Skellam-distributed, but this is clearly not true, since any multiplier other than ±1 would change the support of the distribution.
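The closure property can be verified numerically: if X ~ Skellam(a₁, a₂) and Y ~ Skellam(b₁, b₂) are independent, the generating function predicts X + Y ~ Skellam(a₁ + b₁, a₂ + b₂) (and likewise X − Y ~ Skellam(a₁ + b₂, a₂ + b₁), since −Y ~ Skellam(b₂, b₁)). A minimal sketch with arbitrary rates, comparing the discrete convolution of the two PMFs against the predicted Skellam PMF:

```python
import numpy as np
from scipy.stats import skellam

a1, a2, b1, b2 = 2.0, 1.0, 0.5, 1.5   # arbitrary example rates
k = np.arange(-40, 41)                # truncated support; tails beyond this are negligible

# PMF of X + Y via discrete convolution of the individual PMFs.
px = skellam.pmf(k, a1, a2)
py = skellam.pmf(k, b1, b2)
p_sum = np.convolve(px, py)           # support of the convolution runs from -80 to 80
kk = np.arange(-80, 81)

# Matches Skellam(a1 + b1, a2 + b2), as the generating function predicts.
assert np.allclose(p_sum, skellam.pmf(kk, a1 + b1, a2 + b2), atol=1e-10)
```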

The moment-generating function is given by:

${\displaystyle M\left(t;\mu _{1},\mu _{2}\right)=G(e^{t};\mu _{1},\mu _{2})}$
${\displaystyle =\sum _{k=0}^{\infty }{t^{k} \over k!}\,m_{k}}$

which yields the raw moments ${\displaystyle m_{k}}$. Define:

${\displaystyle \Delta \ {\stackrel {\mathrm {def} }{=}}\ \mu _{1}-\mu _{2}\,}$
${\displaystyle \mu \ {\stackrel {\mathrm {def} }{=}}\ (\mu _{1}+\mu _{2})/2.\,}$

Then the raw moments ${\displaystyle m_{k}}$ are

${\displaystyle m_{1}=\left.\Delta \right.\,}$
${\displaystyle m_{2}=\left.2\mu +\Delta ^{2}\right.\,}$
${\displaystyle m_{3}=\left.\Delta (1+6\mu +\Delta ^{2})\right.\,}$

The central moments ${\displaystyle M_{k}}$ are

${\displaystyle M_{2}=\left.2\mu \right.,\,}$
${\displaystyle M_{3}=\left.\Delta \right.,\,}$
${\displaystyle M_{4}=\left.2\mu +12\mu ^{2}\right..\,}$

The mean, variance, skewness, and kurtosis excess are respectively:

${\displaystyle \left.\right.E(n)=\Delta \,}$
${\displaystyle \sigma ^{2}=\left.2\mu \right.\,}$
${\displaystyle \gamma _{1}=\left.\Delta /(2\mu )^{3/2}\right.\,}$
${\displaystyle \gamma _{2}=\left.1/(2\mu )\right..\,}$
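These four quantities can be checked against `scipy.stats.skellam` (a sketch; the rates are arbitrary example values):

```python
import numpy as np
from scipy.stats import skellam

mu1, mu2 = 4.0, 1.0                       # arbitrary example rates
delta, mu = mu1 - mu2, (mu1 + mu2) / 2.0  # Delta and mu as defined above

mean, var, skew, exkurt = skellam.stats(mu1, mu2, moments='mvsk')
assert np.isclose(mean, delta)                      # E(n) = Delta
assert np.isclose(var, 2.0 * mu)                    # sigma^2 = 2 mu
assert np.isclose(skew, delta / (2.0 * mu) ** 1.5)  # gamma_1
assert np.isclose(exkurt, 1.0 / (2.0 * mu))         # gamma_2 (excess kurtosis)
```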

The cumulant-generating function is given by:

${\displaystyle K(t;\mu _{1},\mu _{2})\ {\stackrel {\mathrm {def} }{=}}\ \ln(M(t;\mu _{1},\mu _{2}))=\sum _{k=0}^{\infty }{t^{k} \over k!}\,\kappa _{k}}$

which yields the cumulants: all even cumulants are equal, as are all odd cumulants:

${\displaystyle \kappa _{2k}=\left.2\mu \right.}$
${\displaystyle \kappa _{2k+1}=\left.\Delta \right..}$

For the special case when μ1 = μ2, an asymptotic expansion of the modified Bessel function of the first kind yields for large μ:

${\displaystyle f(k;\mu ,\mu )\sim {1 \over {\sqrt {4\pi \mu }}}\left[1+\sum _{n=1}^{\infty }(-1)^{n}{\{4k^{2}-1^{2}\}\{4k^{2}-3^{2}\}\cdots \{4k^{2}-(2n-1)^{2}\} \over n!\,2^{3n}\,(2\mu )^{n}}\right]}$

(Abramowitz & Stegun 1972, p. 377). Also, for this special case, when k is also large, and of order of the square root of 2μ, the distribution tends to a normal distribution:

${\displaystyle f(k;\mu ,\mu )\sim {e^{-k^{2}/4\mu } \over {\sqrt {4\pi \mu }}}.}$

These special results can easily be extended to the more general case of different means.
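The quality of the normal approximation above can be probed numerically. A sketch, with μ chosen arbitrarily and an empirical (not theoretical) error bound in the final assertion:

```python
import numpy as np
from scipy.stats import skellam

mu = 200.0
sigma = np.sqrt(2.0 * mu)                 # standard deviation of Skellam(mu, mu)
k = np.arange(-int(2 * sigma), int(2 * sigma) + 1)

exact = skellam.pmf(k, mu, mu)
approx = np.exp(-k**2 / (4.0 * mu)) / np.sqrt(4.0 * np.pi * mu)

# Within two standard deviations the relative error is already small for mu = 200.
rel_err = np.max(np.abs(exact - approx) / exact)
assert rel_err < 1e-2
```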

## References

• Abramowitz, M. and Stegun, I. A. (Eds.). 1972. Modified Bessel functions I and K. Sections 9.6–9.7 in Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables, 9th printing, pp. 374–378. New York: Dover.
• Irwin, J. O. 1937. The frequency distribution of the difference between two independent variates following the same Poisson distribution. Journal of the Royal Statistical Society: Series A 100 (3): 415–416.
• Karlis, D. and Ntzoufras, I. 2003. Analysis of sports data using bivariate Poisson models. Journal of the Royal Statistical Society: Series D (The Statistician) 52 (3): 381–393. doi:10.1111/1467-9884.00366
• Karlis, D. and Ntzoufras, I. 2006. Bayesian analysis of the differences of count data. Statistics in Medicine 25: 1885–1905.
• Skellam, J. G. 1946. The frequency distribution of the difference between two Poisson variates belonging to different populations. Journal of the Royal Statistical Society: Series A 109 (3): 296.