KL divergence between two multivariate Gaussians

Version 1.0.2 (1.67 KB) by Statovic
Function to efficiently compute the Kullback-Leibler divergence between two multivariate Gaussian distributions.
110 Downloads
Updated 26 Feb 2021

This function computes the Kullback-Leibler (KL) divergence between two multivariate Gaussian distributions with specified parameters (mean and covariance matrix). The covariance matrices must be positive definite. The code is efficient and numerically stable.
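
For reference, the closed form evaluated is
KL( N(mu1,S1) || N(mu2,S2) ) = 0.5*( trace(S2\S1) + (mu2-mu1)'*(S2\(mu2-mu1)) - d + log(det(S2)/det(S1)) ),
where d is the dimension. The following is a minimal sketch of a numerically stable evaluation via Cholesky factors; it is meant to mirror, not necessarily reproduce, the submitted mvgkl:

function kl = mvgkl_sketch(mu1, S1, mu2, S2)
% Closed-form KL( N(mu1,S1) || N(mu2,S2) ) via Cholesky factors.
d  = numel(mu1);
L1 = chol(S1, 'lower');               % S1 = L1*L1'
L2 = chol(S2, 'lower');               % S2 = L2*L2'
M  = L2 \ L1;                         % trace(inv(S2)*S1) = sum(M(:).^2)
z  = L2 \ (mu2(:) - mu1(:));          % Mahalanobis term via triangular solve
logdetRatio = 2*sum(log(diag(L2))) - 2*sum(log(diag(L1)));   % log(det(S2)/det(S1))
kl = 0.5 * (sum(M(:).^2) + z'*z - d + logdetRatio);
end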

Examples:
1) Compute the KL divergence between two univariate Gaussians: KL( N(-1,1) || N(+1,1) )
mu1 = -1; mu2 = +1;
s1 = 1; s2 = 1;
mvgkl(mu1, s1^2, mu2, s2^2)
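
As a sanity check, the standard univariate closed form log(s2/s1) + (s1^2 + (mu1-mu2)^2)/(2*s2^2) - 1/2 evaluates to 2 for these parameters, so the call above should return 2:
kl_ref = log(s2/s1) + (s1^2 + (mu1 - mu2)^2)/(2*s2^2) - 1/2   % = 2 here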

2) Compute the KL divergence between two bivariate Gaussians: KL( N(mu1,S1) || N(mu2,S2) )
mu1 = [-1 -1]'; mu2 = [+1 +1]';
S1 = [1 0.5; 0.5 1]; S2 = [1 -0.7; -0.7 1];
mvgkl(mu1, S1, mu2, S2)
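
A Monte Carlo estimate gives an independent cross-check (this sketch assumes the Statistics and Machine Learning Toolbox for mvnrnd and mvnpdf; it is not part of the submission):
rng(0);
X = mvnrnd(mu1', S1, 1e6);                                          % draws from N(mu1,S1)
kl_mc = mean(log(mvnpdf(X, mu1', S1)) - log(mvnpdf(X, mu2', S2)))   % ~ KL( N(mu1,S1) || N(mu2,S2) )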

Cite As

Statovic (2024). KL divergence between two multivariate Gaussians (https://www.mathworks.com/matlabcentral/fileexchange/87899-kl-divergence-between-two-multivariate-gaussians), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2020b
Compatible with any release
Platform Compatibility
Windows, macOS, Linux

Version   Release Notes
1.0.2     minor edits; title slightly changed
1.0.1     minor description edits
1.0.0