
net reusing

Luca Cavazzana on 13 Dec 2011
Hi, I'm creating a set of nets using code more or less like this:
net = patternnet(10);
% net initialization
% ...
for ii = 1:10
    net = train(net, in, tar);
    netSet{ii} = net;
end
What I see is that the first training takes a relatively long time, while the following ones are usually far faster. So a question comes to mind: could it be that each new training starts from the previously trained net? If so, is there a function to re-randomize the initial weights (without needing to allocate a new net)?
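A quick way to poke at this (a sketch, not part of the original post; init() is the stock toolbox function for reinitializing a net's weights):

net = patternnet(10);
net = train(net, in, tar);   % first, slow training
w1 = net.IW{1,1};            % weights the next train call would start from
net = init(net);             % init() re-randomizes weights without allocating a new net
w2 = net.IW{1,1};
isequal(w1, w2)              % false: the weights were re-randomized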

Accepted Answer

the cyclist on 13 Dec 2011
Inside your loop, the input to train() is definitely the value of "net" that was just calculated in the previous iteration. I am guessing you want to do something like this instead:
net0 = patternnet(10);
% net0 initialization
% ...
for ii = 1:10
    net = train(net0, in, tar);
    netSet{ii} = net;
end
This way, you are starting from the initialized value "net0" each time. I don't know enough about neural nets to know whether train() has some randomness in it that will make each value of netSet{ii} different.
Also, note that you do not really need the intermediate value "net" inside your loop. You could just do
for ii = 1:10
    netSet{ii} = train(net0, in, tar);
end
1 comment
Luca Cavazzana on 16 Dec 2011
Yes, but the real code is a bit more complex than what I wrote here: each net has to be trained until a minimum performance level is reached, so the "intermediate net" is convenient.
Another side effect that comes to mind: every time I call |train|, a new combination of training, validation and test sets is chosen, so during the Nth call |train| probably validates on some of the data that the (N-k)th call used for training. After many iterations, probably all of the data will have been used to train the net at some point, causing overfitting.
(sorry for overusing the word "train"...)
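One way to rule that leakage out (a sketch; the index ranges below are illustrative, not from the thread) is to pin the data division with 'divideind' so the training, validation and test subsets never shift between calls to train:

net.divideFcn = 'divideind';          % fixed split instead of the random default
net.divideParam.trainInd = 1:700;     % illustrative indices
net.divideParam.valInd   = 701:850;
net.divideParam.testInd  = 851:1000;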


More Answers (2)

Greg Heath on 16 Dec 2011
Patternnet is self-initializing. Therefore:
1. Initialize the rand RNG.
2. Create an outer loop over nH candidate values for H, the number of hidden nodes.
3. Create an inner loop over Ntrials random weight initializations.
4. Bottom line: each of the nH*Ntrials designs begins with a different set of initial weights.
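A minimal sketch of that recipe (Hvec, Ntrials, in and tar are illustrative placeholders, not values from the thread):

rng(0)                               % 1. initialize the RNG for reproducibility
Hvec = [5 10 15];                    % candidate numbers of hidden nodes, H
Ntrials = 10;                        % random-initialization trials per H
nets = cell(numel(Hvec), Ntrials);
for ih = 1:numel(Hvec)               % 2. outer loop over H
    for it = 1:Ntrials               % 3. inner loop over random initializations
        net = patternnet(Hvec(ih));  % a fresh net per trial => fresh initial weights
        nets{ih, it} = train(net, in, tar);
    end
end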
Hope this helps.
Greg
3 comments
Greg Heath on 18 Dec 2011
No, weight randomization only occurs when the net is created (e.g., net = newff(...)).
Greg



Greg Heath on 17 Feb 2014
With the current set of net creation functions (e.g., fitnet, patternnet, feedforwardnet, ...), weight initialization does not occur at net creation.
If weights have not been assigned, they will be initialized automatically by train. Otherwise, train will just use the existing weights.
Therefore, when designing multiple nets in a loop, the function configure must be used to initialize each net at the top of the loop, before train is called:
net = configure(net, x, t);
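Applied to the loop from the original question, that looks something like this (a sketch; in and tar are the question's data):

net = patternnet(10);
for ii = 1:10
    net = configure(net, in, tar);   % re-initializes weights and biases each pass
    netSet{ii} = train(net, in, tar);
end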
