Generalization in the ANN
A network was created with 2000 records, and 5 separate networks were created with 400 records each. (The input data with 2000 records was divided to build the 5 individual networks.) The performance of each subnetwork is better than that of the large network trained on 2000 records. Can we conclude that the large network learnt too much from the examples given during training, thus losing the capability to generalize to new examples (overfitting), while the small networks performed better because they had fewer training records? Thanks a lot for any advice.
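For concreteness, here is a minimal MATLAB sketch of the setup described above. The variables x (an I-by-2000 input matrix), t (the matching target matrix), the hidden layer size H, and the contiguous 400-record split are all placeholders for illustration only; the original post does not say how the division was done.

H = 10;                                 % hidden layer size (arbitrary choice)

netBig = patternnet(H);
netBig = train(netBig, x, t);           % single network, all 2000 records

nets = cell(1,5);
for k = 1:5
    idx = (k-1)*400 + (1:400);          % one possible split: contiguous blocks
    nets{k} = patternnet(H);
    nets{k} = train(nets{k}, x(:,idx), t(:,idx));
end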
0 comments
Answers (1)
Greg Heath
22 Oct 2016
> Can we conclude that the large network learnt too much from the examples given during training, thus losing the capability to generalize to new examples (overfitting), while the small networks performed better because they had fewer training records?
ABSOLUTELY NOT!
Results depend heavily on how the data is divided, for example randomly vs. in contiguous sections.
You apparently misunderstand the concepts of overfitting and overtraining:
OVERFITTING: There are more unknown weights than training equations. This allows an infinite number of minima for the training data (how many solutions {x1, x2} are there for the equation x1 + x2 = 1?!) that are not minima for nontraining (i.e., validation and testing) data.
OVERTRAINING: Training an overfit network beyond the point where performance on NONTRAINING data begins to deteriorate.
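A quick way to check the overfitting condition above, assuming a single-hidden-layer feedforward net with I inputs, H hidden nodes, O outputs, and Ntrn training cases (all placeholder names):

Nw     = (I+1)*H + (H+1)*O;   % number of unknown weights, including biases
Ntrneq = Ntrn*O;              % number of training equations (cases x outputs)
if Nw > Ntrneq
    disp('More unknown weights than training equations: the net is overfit.')
end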
As long as all data is representative of the general I/O mapping, the more data, the better. That is why random data division is the default in MATLAB NN training programs.
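The random division referred to here is the toolbox default 'dividerand'; the sketch below just makes that default explicit (0.70/0.15/0.15 are the documented default ratios):

net = patternnet(10);                   % 10 hidden nodes, arbitrary
net.divideFcn = 'dividerand';           % random train/val/test split (default)
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;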
Hope this helps.
Thank you for formally accepting my answer
Greg
3 comments
Greg Heath
26 Oct 2016
No: IF the data are representative of the general I/O mapping.
This is easy to check:
Compare the performance of each net on all of the data.
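A sketch of that check, reusing the hypothetical netBig, nets, x and t from the earlier sketch; perform() scores each net with its own performance function (e.g., cross-entropy for patternnet):

yBig = netBig(x);                               % large net on ALL 2000 records
fprintf('Large net, all data: %g\n', perform(netBig, t, yBig))
for k = 1:5
    yk = nets{k}(x);                            % each subnet on ALL records
    fprintf('Subnet %d, all data: %g\n', k, perform(nets{k}, t, yk))
end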