clamp cross-entropy loss
Matt Fetterman
on 3 Sep 2020
Commented: Matt Fetterman on 6 Sep 2020
The MATLAB cross-entropy loss has this form:
loss = -sum(W*(T.*log(Y)))/N;
I would like to "clamp" it so that the output of the log function is bounded; for example, so it cannot be less than 100.
Can we do it?
Accepted Answer
David Goodmanson
on 3 Sep 2020
Edited: David Goodmanson on 6 Sep 2020
Hi Matt,
z = log(Y);
z(z < 100) = 100;              % hard clamp: replace any log value below 100
loss = -sum(W*(T.*z))/N;
In the link you provided, they talk about a limit of -100 rather than +100. The former makes more sense, since log(Y) goes to -Inf as Y approaches 0. There are lots of possibilities for a smooth, differentiable cutoff; here is one, assuming Y >= 0:
Ylimit = -100;
loss = -sum(W*(T.*log(Y + exp(Ylimit))))/N;   % log(Y+exp(Ylimit)) can never drop below Ylimit
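A quick way to see how the smooth cutoff behaves (a minimal sketch; the Y values here are just illustrative):
Ylimit = -100;
Y = [0 1e-50 0.5 1];            % illustrative probabilities, including exact zero
z = log(Y + exp(Ylimit));       % smoothly clamped log
% z(1) equals Ylimit exactly; once Y is much larger than exp(Ylimit),
% z is indistinguishable from log(Y)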