kldiv {bayesmeta}    R Documentation
Description:

Compute the Kullback-Leibler divergence or symmetrized KL divergence based on the means and covariances of two normal distributions.
Usage:

kldiv(mu1, mu2, sigma1, sigma2, symmetrized=FALSE)
Arguments:

  mu1, mu2        the two mean vectors.
  sigma1, sigma2  the two covariance matrices.
  symmetrized     logical; if TRUE, the symmetrized divergence is returned.
Details:

The Kullback-Leibler divergence (or relative entropy) of two probability distributions p and q is defined as the integral

  D[KL](p || q) = Integral log(p(theta)/q(theta)) p(theta) d theta.
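For a one-dimensional case, the defining integral may be checked by numerical integration. The sketch below uses base-R integrate() and dnorm() with arbitrarily chosen parameter values, and assumes that kldiv() accepts 1x1 covariance matrices:

## numerical evaluation of D[KL](p || q) for p = N(0, 2^2) and q = N(1, 3^2):
integrand <- function(x) {
  dnorm(x, mean=0, sd=2) * (dnorm(x, mean=0, sd=2, log=TRUE)
                            - dnorm(x, mean=1, sd=3, log=TRUE))
}
integrate(integrand, lower=-Inf, upper=Inf)
## should (approximately) match the closed-form result:
kldiv(mu1=0, mu2=1, sigma1=matrix(4), sigma2=matrix(9))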
In the case of two normal distributions with mean and covariance parameters (mu[1], Sigma[1]) and (mu[2], Sigma[2]), respectively, this works out to

  D[KL]( p(theta | mu[1], Sigma[1]) || p(theta | mu[2], Sigma[2]) )
    = 0.5 * (tr(Sigma[2]^-1 Sigma[1]) + (mu[1]-mu[2])' Sigma[2]^-1 (mu[1]-mu[2]) - d + log(det(Sigma[2]) / det(Sigma[1]))),
where d is the dimension.
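The closed-form expression can also be spelled out directly; the following sketch mirrors the formula above (the function name kl_normal is used here for illustration only and is not part of the package):

## direct evaluation of the closed-form expression:
kl_normal <- function(mu1, mu2, sigma1, sigma2) {
  d    <- length(mu1)
  inv2 <- solve(sigma2)                               # Sigma[2]^-1
  0.5 * (sum(diag(inv2 %*% sigma1))                   # trace term
         + drop(t(mu1-mu2) %*% inv2 %*% (mu1-mu2))    # quadratic form
         - d
         + log(det(sigma2) / det(sigma1)))            # log-determinant term
}
## should agree with kldiv() up to numerical accuracy:
kl_normal(mu1=c(0,0), mu2=c(1,1), sigma1=diag(c(2,2)), sigma2=diag(c(3,3)))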
The symmetrized divergence is simply given by

  D[s](p || q) = D[KL](p || q) + D[KL](q || p).
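If symmetrized=TRUE is specified, the returned value should therefore equal the sum of the two "directed" divergences; a quick consistency check, re-using the figures from the example below:

## the sum of the two directed divergences ...
kldiv(mu1=c(0,0), mu2=c(1,1), sigma1=diag(c(2,2)), sigma2=diag(c(3,3))) +
  kldiv(mu1=c(1,1), mu2=c(0,0), sigma1=diag(c(3,3)), sigma2=diag(c(2,2)))
## ... should match the symmetrized divergence:
kldiv(mu1=c(0,0), mu2=c(1,1), sigma1=diag(c(2,2)), sigma2=diag(c(3,3)),
      symmetrized=TRUE)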
Value:

The divergence (D[KL] >= 0 or D[s] >= 0, respectively).
Author(s):

Christian Roever <christian.roever@med.uni-goettingen.de>
References:

S. Kullback. Information theory and statistics. John Wiley and Sons, New York, 1959.

C. Roever, T. Friede. Discrete approximation of a mixture distribution via restricted divergence. Journal of Computational and Graphical Statistics, 26(1):217-222, 2017. doi:10.1080/10618600.2016.1276840.
See Also:

bmr
Examples:

kldiv(mu1=c(0,0), mu2=c(1,1), sigma1=diag(c(2,2)), sigma2=diag(c(3,3)))
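## For this call, the trace and quadratic-form terms cancel against -d
## (4/3 + 2/3 - 2 = 0), so the result should reduce to
## 0.5*log(det(Sigma[2])/det(Sigma[1])); a hand-check of the value:
0.5 * log(9/4)   # roughly 0.405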