
problem with neural network training

2 views (last 30 days)
Mohamad on 15 Dec 2013
Edited: KAE on 15 May 2019
I have read in some references that adding small, different random noise to the neural network's input data at each epoch of training ("jitter") improves the net's generalization. I would like to implement this, but since I do not know the number of epochs beforehand, I would have to check the convergence of my net after every epoch, which makes the problem too complicated. Do you have any suggestions? Best

Accepted Answer

Greg Heath on 17 Dec 2013
Adding noise after each epoch does not sound like a very productive method.
Given the number of hidden nodes, design many nets in a double loop: the outer loop varies the level of noise added to the training data, and the inner loop designs Ntrials nets with different random initial weights.
One or more good designs can then be selected from the numlevel*Ntrials candidates using the validation-set error as the performance measure. Final unbiased estimates of generalization performance are obtained from the test-set performances.
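The double-loop selection procedure above can be sketched as follows. This is an illustration in Python with a tiny hand-rolled one-hidden-layer network on toy data, not the MATLAB toolbox workflow; the data, the `train_mlp` helper, and the specific noise levels are all assumptions made for the example:

```python
import numpy as np

def train_mlp(Xtr, ytr, hidden=5, epochs=200, lr=0.05, seed=0):
    # One-hidden-layer tanh net, squared loss, full-batch gradient descent.
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 0.5, (Xtr.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        H = np.tanh(Xtr @ W1 + b1)
        err = (H @ W2 + b2) - ytr
        gW2 = H.T @ err / len(Xtr); gb2 = err.mean(0)
        dH = (err @ W2.T) * (1 - H**2)
        gW1 = Xtr.T @ dH / len(Xtr); gb1 = dH.mean(0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    return lambda X: np.tanh(X @ W1 + b1) @ W2 + b2

def mse(f, X, y):
    return float(np.mean((f(X) - y) ** 2))

# Toy 1-D regression data, split into train / validation / test.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, (300, 1))
y = np.sin(2 * X) + 0.1 * rng.normal(size=(300, 1))
Xtr, ytr = X[:200], y[:200]
Xva, yva = X[200:250], y[200:250]
Xte, yte = X[250:], y[250:]

noise_levels = [0.0, 0.05, 0.1, 0.2]   # outer loop: jitter std added to inputs
Ntrials = 5                            # inner loop: different random initial weights

best_val, best_net = np.inf, None
for sigma in noise_levels:
    for trial in range(Ntrials):
        jitter = np.random.default_rng(trial).normal(size=Xtr.shape)
        net = train_mlp(Xtr + sigma * jitter, ytr, seed=trial)
        v = mse(net, Xva, yva)          # select using validation error only
        if v < best_val:
            best_val, best_net = v, net

# The test set is touched once, only to estimate generalization of the winner.
print("test MSE of selected net:", mse(best_net, Xte, yte))
```

The key design point is that the test set plays no role in choosing among the numlevel*Ntrials candidates, so the final test MSE remains an unbiased estimate of generalization.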
Hope this helps.
Thank you for formally accepting my answer
Greg
  1 comment
KAE on 15 May 2019
Edited: KAE on 15 May 2019
I have seen adding noise to the inputs listed as a data 'augmentation' technique, so I wanted to confirm: I will still have the same number N of inputs, just with a given level of noise added to them in each loop, correct? Rather than, for example, 2N inputs concatenating the data with and without the noise?


More Answers (0)

