Introduction
The generalized inverse Gaussian (GIG) is a rich probability distribution: several classical, widely used distributions arise as special cases when its parameters take particular values.
Generalized inverse Gaussian distribution
The probability density function of the generalized inverse Gaussian distribution is:
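A standard form of this density (assuming the usual $(p, a, b)$ parameterization, as on the Wikipedia page cited in the resources below) is:

```latex
f(x \mid p, a, b) = \frac{(a/b)^{p/2}}{2 K_p(\sqrt{ab})} \, x^{p-1}
\exp\!\left( -\frac{a x + b/x}{2} \right), \qquad x > 0.
```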
Here $K_p$ is the modified Bessel function of the second kind, and the parameters satisfy a > 0 and b > 0.
In particular, note that the support is x > 0, i.e. the distribution applies to positive random variables.
The modified Bessel function of the second kind satisfies the following properties:
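The key properties are the symmetry and recurrence identities, together with the small-argument asymptotic that drives the gamma and inverse-gamma limits below (these are standard facts about $K_p$):

```latex
K_p(x) = K_{-p}(x), \qquad
K_{p+1}(x) = K_{p-1}(x) + \frac{2p}{x} K_p(x),
```
```latex
K_p(x) \sim \Gamma(p)\, 2^{p-1} x^{-p}
\quad \text{as } x \to 0^{+}, \ p > 0.
```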
Gamma distribution
When b = 0, p > 0, and a > 0, the generalized inverse Gaussian distribution reduces to the gamma distribution.
We write X ~ Ga(p, a/2), where p is the shape parameter and a/2 is the rate parameter.
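In this parameterization (with a/2 as the rate), the density takes the standard gamma form:

```latex
f(x \mid p, a/2) = \frac{(a/2)^{p}}{\Gamma(p)} \, x^{p-1}
e^{-a x / 2}, \qquad x > 0.
```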
Intuitively, the random variable X can be interpreted as the waiting time until the p-th event occurs in a Poisson process.
The gamma distribution is additive: if two random variables follow gamma distributions, are independent of each other, and share the same rate parameter (the same event frequency per unit time), then their sum also follows a gamma distribution, with the shape parameters added.
When p = 1, the gamma distribution reduces to the exponential distribution.
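The additivity property above can be checked numerically; this is a small sketch using NumPy, where the shape values 3 and 5 and the rate 2 are arbitrary illustrative choices, not values from the original post:

```python
import numpy as np

# Additivity check: if X ~ Ga(p1, r) and Y ~ Ga(p2, r) are independent
# with the same rate r, then X + Y ~ Ga(p1 + p2, r).
rng = np.random.default_rng(42)
p1, p2, rate = 3.0, 5.0, 2.0          # arbitrary illustrative values
x = rng.gamma(shape=p1, scale=1.0 / rate, size=200_000)
y = rng.gamma(shape=p2, scale=1.0 / rate, size=200_000)
s = x + y

# Ga(p1 + p2, rate) has mean (p1 + p2)/rate = 4.0
# and variance (p1 + p2)/rate**2 = 2.0
print(s.mean())
print(s.var())
```

With 200,000 samples the empirical mean and variance of the sum land close to the values predicted for Ga(8, 2).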
Inverse gamma distribution
When a = 0, p < 0, and b > 0, the generalized inverse Gaussian distribution reduces to the inverse gamma distribution.
Here we set τ = -p and write X ~ IG(τ, b/2).
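Setting a = 0 in the GIG density and using the small-argument behavior of $K_p$, with $\tau = -p > 0$, gives the standard inverse gamma form:

```latex
f(x \mid \tau, b/2) = \frac{(b/2)^{\tau}}{\Gamma(\tau)} \,
x^{-\tau - 1} \exp\!\left( -\frac{b}{2x} \right), \qquad x > 0.
```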
Inverse Gaussian distribution
When p = -1/2, the generalized inverse Gaussian distribution reduces to the inverse Gaussian distribution.
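At $p = -1/2$ the Bessel function has the closed form $K_{1/2}(z) = \sqrt{\pi/(2z)}\, e^{-z}$, and substituting $\mu = \sqrt{b/a}$, $\lambda = b$ turns the GIG density into the familiar inverse Gaussian density:

```latex
f(x \mid \mu, \lambda) = \sqrt{\frac{\lambda}{2\pi x^{3}}}
\exp\!\left( -\frac{\lambda (x - \mu)^{2}}{2 \mu^{2} x} \right),
\qquad x > 0.
```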
The significance of conjugacy between the prior and posterior distributions
From the Bayesian viewpoint, parameter estimation is the process of finding the maximum a posteriori (MAP) estimate. Conjugacy is a mathematical device for keeping this computation simple: instead of evaluating the normalizing integral of the posterior, one only needs to find the mode of the posterior distribution. This requires that the prior and the posterior have the same functional form, differing only in their parameters — that is, the prior and the posterior are conjugate.
Example One
Assume a random variable X ~ Bernoulli(θ), with 0 < θ < 1.
Since θ takes values in (0, 1), it is natural to use the beta distribution, which is defined on exactly that interval, as the prior: θ ~ Beta(α, β).
The posterior distribution is
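For a single observation $x \in \{0, 1\}$, writing the prior as Beta(α, β), Bayes' rule gives

```latex
p(\theta \mid x) \propto \theta^{x}(1-\theta)^{1-x}
\cdot \theta^{\alpha-1}(1-\theta)^{\beta-1}
= \theta^{(\alpha + x) - 1} (1-\theta)^{(\beta + 1 - x) - 1},
```

i.e. $\theta \mid x \sim \mathrm{Beta}(\alpha + x, \, \beta + 1 - x)$.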
This is also a beta distribution.
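A small numerical sketch of this conjugate update, using SciPy; the prior hyperparameters α = β = 2 and the data are illustrative choices, not values from the original post:

```python
import numpy as np
from scipy import stats

# Beta-Bernoulli conjugacy: with prior theta ~ Beta(alpha, beta) and
# n Bernoulli(theta) observations containing k successes, the posterior
# is Beta(alpha + k, beta + n - k).
alpha, beta = 2.0, 2.0                 # illustrative prior hyperparameters
data = np.array([1, 0, 1, 1, 0, 1])    # illustrative observations

k, n = data.sum(), len(data)
posterior = stats.beta(alpha + k, beta + n - k)

print(alpha + k, beta + n - k)   # posterior parameters: 6.0 4.0
print(posterior.mean())          # (alpha + k)/(alpha + beta + n) = 0.6
```

The posterior stays in the beta family, so the update is just arithmetic on the hyperparameters — no integration needed.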
Example Two
Assume a random variable X ~ N(0, λ), where we write the variance σ² as λ, with λ > 0.
(1) Assume λ follows a gamma distribution: λ ~ Ga(λ | p, a/2).
The posterior is then
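A sketch of the computation for a single observation $x$: multiplying the $N(0, \lambda)$ likelihood by the $\mathrm{Ga}(p, a/2)$ prior gives

```latex
p(\lambda \mid x) \propto
\lambda^{-1/2} \exp\!\left( -\frac{x^2}{2\lambda} \right)
\cdot \lambda^{p-1} \exp\!\left( -\frac{a\lambda}{2} \right)
= \lambda^{(p - 1/2) - 1}
\exp\!\left( -\frac{a\lambda + x^2/\lambda}{2} \right),
```

which is the kernel of a GIG distribution with parameters $(p - 1/2, \, a, \, x^2)$.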
This is a generalized inverse Gaussian distribution. Of course, the gamma prior can itself be viewed as a generalized inverse Gaussian distribution, but that makes the bookkeeping more cumbersome.
(2) Assume λ follows an inverse gamma distribution: λ ~ IG(τ, β/2).
The posterior is then
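Again for a single observation $x$, multiplying the likelihood by the $\mathrm{IG}(\tau, \beta/2)$ prior gives

```latex
p(\lambda \mid x) \propto
\lambda^{-1/2} \exp\!\left( -\frac{x^2}{2\lambda} \right)
\cdot \lambda^{-\tau - 1} \exp\!\left( -\frac{\beta}{2\lambda} \right)
= \lambda^{-(\tau + 1/2) - 1}
\exp\!\left( -\frac{\beta + x^2}{2\lambda} \right),
```

i.e. the posterior is $\mathrm{IG}\!\left(\tau + \tfrac{1}{2}, \, \tfrac{\beta + x^2}{2}\right)$.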
This is again an inverse gamma distribution, which makes the calculation convenient.
Resources
Wiki: generalized inverse Gaussian distribution
Wiki: Bessel functions
Wiki: Gamma distribution
When reprinting, please credit the author Jason Ding and the source.
GitHub Blog Home page (http://jasonding1354.github.io/)
CSDN Blog (http://blog.csdn.net/jasonding1354)
Jianshu homepage (http://www.jianshu.com/users/2bd9b48f6ea8/latest_articles)
"Mathematics in Machine Learning": The generalized inverse Gaussian distribution and its special cases