
How to avoid Inf values when writing deep learning code?

3 views (in the last 30 days)
ferda sonmez on 29 Mar 2019
Hi,
I wrote deep learning code that includes the following Softmax function. During training I start to get Inf values (and consequently NaN values) in some matrix multiplication operations or as the result of the softmax operation.
I also tried other softmax implementations that I found on the internet and in books, with no improvement.
Getting these NaN values even in the first training epoch, and on the very first samples (such as the 5th sample), causes the model to train incorrectly.
To simplify my question I didn't include information about the number of nodes in the input, output, and hidden layers, because I think this problem occurs independently of those numbers. If requested, I can provide more info.
Best Regards,
Ferda Özdemir Sönmez
function y = Softmax(x)
    ex = exp(x);
    y = ex / sum(ex);
end
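A likely cause (not stated in the original post): `exp(x)` overflows to Inf in double precision once any entry of `x` exceeds roughly 709, and Inf/Inf then yields NaN. The standard remedy is to subtract `max(x)` before exponentiating, which leaves the softmax result mathematically unchanged but keeps every exponent non-positive. A minimal sketch of that shifted softmax, written here in plain Python for illustration:

```python
import math

def stable_softmax(x):
    """Softmax with the max-shift trick to avoid exp() overflow."""
    m = max(x)                              # largest input value
    ex = [math.exp(v - m) for v in x]       # every exponent is <= 0, so exp() <= 1
    s = sum(ex)
    return [v / s for v in ex]
```

In MATLAB the same shift would amount to changing the first line of the posted function to `ex = exp(x - max(x));`, assuming `x` is a vector of pre-activation scores.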

Answers (0)

Version

R2018b
