kldiv {bayesmeta}    R Documentation

Kullback-Leibler divergence of two multivariate normal distributions.

Description

Compute the Kullback-Leibler divergence or symmetrized KL divergence between two multivariate normal distributions, based on their means and covariance matrices.

Usage

  kldiv(mu1, mu2, sigma1, sigma2, symmetrized=FALSE)

Arguments

mu1, mu2

the two mean vectors.

sigma1, sigma2

the two covariance matrices.

symmetrized

logical; if TRUE, the symmetrized divergence will be returned.

Details

The Kullback-Leibler divergence (or relative entropy) of two probability distributions p and q is defined as the integral

D[KL](p || q) = Integral log(p(theta) / q(theta)) p(theta) d(theta).

In the case of two normal distributions with mean and covariance parameters given by (mu[1], Sigma[1]) and (mu[2], Sigma[2]), respectively, the divergence has the closed form

D[KL](p(theta | mu[1], Sigma[1]) || p(theta | mu[2], Sigma[2])) = 0.5 * (tr(Sigma[2]^-1 Sigma[1]) + (mu[1]-mu[2])' Sigma[2]^-1 (mu[1]-mu[2]) - d + log(det(Sigma[2]) / det(Sigma[1])))

where d is the dimension.
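The closed-form expression may be spelled out directly in R; the following is a minimal sketch for illustration (the helper name kl_normal is hypothetical, not the package's internal implementation):

  kl_normal <- function(mu1, mu2, sigma1, sigma2) {
    d    <- length(mu1)                       # dimension
    diff <- mu1 - mu2
    inv2 <- solve(sigma2)                     # Sigma[2]^-1
    0.5 * (sum(diag(inv2 %*% sigma1))         # trace term
           + c(t(diff) %*% inv2 %*% diff)     # quadratic form in the mean difference
           - d
           + log(det(sigma2) / det(sigma1)))  # log-determinant ratio
  }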

The symmetrized divergence is then simply the sum of the two directed divergences:

D[s](p || q) = D[KL](p || q) + D[KL](q || p).
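In terms of the kl_normal() sketch above, this amounts to adding the two directed divergences with the arguments swapped (again purely illustrative):

  kl_symm <- function(mu1, mu2, sigma1, sigma2) {
    kl_normal(mu1, mu2, sigma1, sigma2) + kl_normal(mu2, mu1, sigma2, sigma1)
  }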

Value

The (non-negative) divergence value; D[KL] >= 0, or D[s] >= 0 if symmetrized=TRUE.

Author(s)

Christian Roever christian.roever@med.uni-goettingen.de

References

S. Kullback. Information theory and statistics. John Wiley and Sons, New York, 1959.

C. Roever, T. Friede. Discrete approximation of a mixture distribution via restricted divergence. Journal of Computational and Graphical Statistics, 26(1):217-222, 2017. doi: 10.1080/10618600.2016.1276840.

See Also

bmr.

Examples

kldiv(mu1=c(0,0), mu2=c(1,1), sigma1=diag(c(2,2)), sigma2=diag(c(3,3)))
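The symmetrized variant is requested analogously via the symmetrized argument (a usage sketch; note that for these particular inputs the trace and quadratic terms cancel against -d, so the plain divergence equals log(3/2) and the symmetrized one equals exactly 1):

kldiv(mu1=c(0,0), mu2=c(1,1), sigma1=diag(c(2,2)), sigma2=diag(c(3,3)),
      symmetrized=TRUE)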
