Kullback-Leibler Divergence for NMF in Matlab
fadams18
on 3 Jan 2019
Answered: Matt Tearle
on 16 Jan 2019
I am trying to write the Kullback-Leibler divergence (KLD) equation in MATLAB by looking at how the Euclidean distance was written.
The Euclidean distance for matrix factorization has the following structure:
f = ||X - W*H||_F^2
which reduces to this MATLAB code:
f = norm(X - W * H,'fro')^2
Now I have the Kullback-Leibler divergence with the structure below:
D(X || X_hat) = sum_ij ( X_ij * log(X_ij / X_hat_ij) - X_ij + X_hat_ij )
where X is the original matrix and X_hat is the product W*H. I wish to write this in MATLAB, but I am confused about how to deal with the summation: in the Euclidean distance we suddenly use the function norm.
Could someone help me write a decent code for this expression? Thanks.
Accepted Answer
Matt Tearle
on 16 Jan 2019
If X and X_hat are just matrices, then I think you should be able to compute all the terms element-wise and sum the result (unless I misunderstand the formula).
div = X .* log(X ./ X_hat) - X + X_hat;
KLD = sum(div,'all'); % in R2018b onward
KLD = sum(div(:)); % in any version
I'm interpreting "log" in the formula in the math sense (natural log) rather than engineering (base 10). If it's base 10, then use the log10 function instead.
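Putting the pieces together, a minimal sketch of the full cost computation might look like the following. The eps guard against log(0) and division by zero is an addition of mine, not part of the original formula, which implicitly assumes strictly positive entries:

```matlab
% KL divergence cost for NMF: D(X || W*H)
X_hat = W * H;                                        % reconstruction from the factors
div = X .* log((X + eps) ./ (X_hat + eps)) - X + X_hat;  % element-wise divergence terms
KLD = sum(div(:));                                    % total cost; works in any MATLAB version
```

With nonnegative X, W, and H of compatible sizes, KLD is zero when W*H reproduces X exactly and grows as the reconstruction degrades.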