How to avoid getting negative values when training a neural network?
Is there any way to constrain the outputs when training a feed-forward neural network in MATLAB?
I am trying to train a supervised feed-forward neural network on 100,000 observations. I have 5 continuous variables and 3 continuous responses (labels). All my values are positive (both labels and variables). However, when I train the network, it sometimes predicts negative results no matter what architecture I use. Negative results have no physical meaning and should not appear. Is there any way to constrain the network? I also used a ReLU activation function for the last layer, but then the network cannot generalize well.
Thanks
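For reference, a minimal sketch of the setup described (data and hidden-layer size are hypothetical), using a shallow feed-forward network with MATLAB's ReLU transfer function, `poslin`, on the output layer:

```matlab
% Hypothetical data: 5 positive inputs, 3 positive targets, 100,000 samples.
X = rand(5, 100000);          % inputs, one column per observation
T = rand(3, 100000) + 0.1;    % strictly positive targets

net = feedforwardnet(10);                % one hidden layer of 10 neurons (arbitrary)
net.layers{end}.transferFcn = 'poslin';  % ReLU on the output layer
% Note: the default 'mapminmax' output processing maps targets to [-1,1]
% and un-maps predictions afterwards, which can reintroduce negative values;
% removing it lets poslin's non-negativity apply on the targets' own scale.
net.outputs{end}.processFcns = {};

net = train(net, X, T);
Y = net(X);                   % predictions are now >= 0 by construction
```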
Accepted answer
More answers (1)
Greg Heath
18 Jan 2020
0 votes
Use a sigmoid for the output layer.
Hope this helps
THANK YOU FOR FORMALLY ACCEPTING MY ANSWER
GREG
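A sketch of Greg's suggestion (variable names and hidden-layer size assumed): set the output transfer function to `logsig`, whose range is (0,1), and adjust the default `mapminmax` output processing so the targets are scaled into [0,1] to match. Predictions are then un-mapped back onto the targets' original (positive) range and cannot go negative.

```matlab
net = feedforwardnet(10);                % hidden layer size is arbitrary
net.layers{end}.transferFcn = 'logsig';  % sigmoid output: range (0,1)
% Rescale targets to [0,1] instead of the default [-1,1], to match logsig.
% processParams{2} is the parameter struct for 'mapminmax' here, the second
% of the default output processing functions.
net.outputs{end}.processParams{2}.ymin = 0;
net = train(net, X, T);                  % X: 5xN inputs, T: 3xN targets
Y = net(X);                              % outputs stay within the target range
```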
1 comment
Mostafa Nakhaei
18 Jan 2020