When should I stop training a neural network?
I'm working on a neural network trained with backpropagation. The network has 6 inputs, 1 hidden layer (6 neurons in that layer), and 1 output. I train the network with the "Levenberg-Marquardt" and "Bayesian Regularization" algorithms. The idea is to "predict" a result, but the predictions do not match the historical data in my table.
To decide when to stop training, I currently look at the regression plot, the mean squared error (MSE), and the regression R values, which reach their "ideal" values. Even so, the results are not accurate, and are not even close for inputs that do not appear in the historical data table.
What plot should I look at to know whether the network is overfitting or is correctly trained?
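For context, here is a minimal MATLAB sketch (an assumption on my part, using placeholder data `x` and `t`, not your actual dataset) of validation-based early stopping with `trainlm`. The performance plot from `plotperform(tr)` is the usual graphic for spotting overfitting: if the validation curve rises while the training curve keeps falling, the network is overfitting. Note that `trainbr` (Bayesian Regularization) disables the validation split by default, so this applies to the Levenberg-Marquardt run.

```matlab
% Placeholder data standing in for the 6-input, 1-output problem
x = rand(6, 200);      % 6 features, 200 samples (replace with real data)
t = sum(x) / 6;        % dummy targets (replace with real data)

net = feedforwardnet(6, 'trainlm');   % 6 hidden neurons, Levenberg-Marquardt
net.divideFcn = 'dividerand';         % random train/validation/test split
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;
net.trainParam.max_fail = 6;    % stop after 6 consecutive validation increases
net.trainParam.showWindow = false;

[net, tr] = train(net, x, t);

% Training vs. validation vs. test error over epochs: the curve to watch.
% Training stops at the epoch where validation error was lowest.
plotperform(tr);
fprintf('Best epoch: %d, validation MSE: %g\n', tr.best_epoch, tr.best_vperf);
```

The training record `tr` also lets you compare `tr.perf` (training error) against `tr.vperf` (validation error) epoch by epoch if you prefer to inspect the numbers directly.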