Neural Network: the results are much better when using a for loop than when constructing and training each net individually.
Hi everyone. In my work I need to try different ANN topologies, preprocessing methods, activation functions, numbers of layers, training functions, and so on, with the aim of improving the final results of a regression model. I used a for loop as shown below, and the results when using the for loop are much better than when I construct and train each network individually (I took one case from the output file, built that single network manually with the same parameters, and the result was very bad, with a high RMSE). My question is whether my looped result is trustworthy, and whether it is reasonable to use a for loop when you want to try hundreds of networks to find the best solution.
for i = 1:4                          % Network topology loop (FFANN, ...)
  for typePrepro = 1:7               % Preprocessing method loop (zscore, ...)
    for layer = 1:5                  % Max number of network layers
      NoNeurns = ones(1, layer);
      ActFn = {};
      for fn = 1:layer
        ActFn{fn} = ActivationFn_New{1};
      end
      for layerLoop = 1:layer        % Cycle over the chosen number of layers
        for matrix = 1:X             % Matrix of activation functions and layer sizes
          ActFn{layerLoop} = M1{matrix};
          if layerLoop ~= layer
            NoNeurns(layerLoop) = M2(matrix);
          else
            NoNeurns(layerLoop) = 1; % single neuron in the output layer
          end
          for TrainFn = 1:9          % Training functions (trainlm, ...)
            for TrainMthd = 1:2      % Training methods (adapt, train)
              % ... construct, train, and evaluate the network here ...
            end
          end
        end
      end
    end
  end
end
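For reference, here is a minimal, self-contained sketch of what the innermost loop body might look like, assuming feedforwardnet and regression data x (inputs) and t (targets). The candidate lists transferFns, trainFns, and hiddenSizes below are hypothetical stand-ins for ActivationFn_New, M1/M2, and the training-function loop, not the actual settings used above:

% Dummy regression data -- replace with your own.
x = rand(5, 200);                               % 5 inputs, 200 samples
t = sum(x) + 0.1*randn(1, 200);                 % 1 target per sample
transferFns = {'tansig', 'logsig', 'purelin'};  % hypothetical candidates
trainFns    = {'trainlm', 'trainbr', 'trainscg'};
hiddenSizes = [5 10 20];
bestRMSE = Inf;
for hs = hiddenSizes
    for tf = 1:numel(transferFns)
        for trf = 1:numel(trainFns)
            net = feedforwardnet(hs, trainFns{trf});     % one hidden layer of hs neurons
            net.layers{1}.transferFcn = transferFns{tf};
            net.trainParam.showWindow = false;
            [net, tr] = train(net, x, t);
            y = net(x);
            rmse = sqrt(mean((t - y).^2));               % training-set RMSE
            if rmse < bestRMSE
                bestRMSE = rmse;
                bestNet  = net;
            end
        end
    end
end
fprintf('Best training RMSE: %.4f\n', bestRMSE);

In a real run you would score each candidate on a held-out test set rather than on the training data, but the structure of the loop is the same.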
Thanks
Answers (1)
Greg Heath
on 23 Nov 2017
I have designed hundreds of nets in the NEWSGROUP and ANSWERS using a double loop approach.
The outer loop is over the number of hidden nodes:
j = 0
for h = Hmin:dH:Hmax
    j = j + 1
    ...
    net = fitnet(h);
    ...
The inner loop is over Ntrials sets of random initial weights:
for i = 1:Ntrials
    net = configure(net, x, t);      % fresh random initial weights each trial
    [net, tr, y, e] = train(net, x, t);
    ...
    NMSE(i, j) = mse(t - y) / mean(var(t', 1));   % normalized MSE
end
end
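A minimal self-contained version of this double loop, with assumed values for Hmin, dH, Hmax, and Ntrials and dummy data x/t standing in for your own, might look like this:

% Dummy data and assumed search settings -- substitute your own.
x = rand(3, 100);
t = sum(x) + 0.05*randn(1, 100);
Hmin = 2;  dH = 2;  Hmax = 10;  Ntrials = 5;
numH = numel(Hmin:dH:Hmax);
NMSE = zeros(Ntrials, numH);
bestNMSE = Inf;
j = 0;
for h = Hmin:dH:Hmax
    j = j + 1;
    net = fitnet(h);                     % one hidden layer with h nodes
    net.trainParam.showWindow = false;
    for i = 1:Ntrials
        net = configure(net, x, t);      % re-initialize weights for this trial
        [net, tr] = train(net, x, t);
        y = net(x);
        NMSE(i, j) = mean((t - y).^2) / mean(var(t', 1));   % normalized MSE
        if NMSE(i, j) < bestNMSE
            bestNMSE = NMSE(i, j);
            bestNet  = net;
        end
    end
end

The net with the smallest NMSE over all (h, trial) pairs is kept in bestNet.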
Searching on the keywords Hmin, Hmax, and Ntrials should dredge up most of them.
Hope this helps.
Thank you for formally accepting my answer
Greg