Training a neural network
Sam harris
on 29 Jun 2012
Answered: Greg Heath on 22 Apr 2014
Hi,
I am trying to develop a neural network which predicts an output based on 4 inputs, one of which is the output of the previous step. Currently I am just using a standard function fitting network (not a time-series prediction).
The neural network works really well (r squared approx. 0.98 - 0.99) when the output of the previous step is given independent of the neural network result.
However, when I use the neural network predicted output as the input to the next prediction, the neural network result is virtually worthless. Also, the results differ greatly every time I re-train the network - i.e. it seems the results are very dependent on the initial weights.
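To make this concrete, here is a simplified outline of what I am doing (X, T, the network size and the seeding value are placeholders; my real code and data differ):

% X is 3 x N (external inputs), T is 1 x N (target series)
x = [ X(:,2:end) ; T(:,1:end-1) ];   % 4th input row = true output of the previous step
t = T(:,2:end);

net = fitnet(10);                    % standard fitting net
[net, tr] = train(net, x, t);

% One-step-ahead: previous TRUE output supplied, fit is very good
yOpen = net(x);

% Multi-step: previous PREDICTED output fed back, results are poor
yClosed = zeros(1, size(t,2));
yPrev   = T(1);                      % seed with the first known output
for k = 1:size(t,2)
    yClosed(k) = net([ X(:,k+1) ; yPrev ]);
    yPrev      = yClosed(k);
end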
Could this be a problem of overtraining? Any help would be greatly appreciated.
Sam
5 comments
Greg Heath
on 4 Sep 2012
How many data points? How many hidden nodes? Is there a validation set for stopping? Do you get the same type of performance from a MATLAB demo data set?
Accepted Answer
Greg Heath
on 22 Apr 2014
Sam harris wrote on 2 Jul 2012:
% Create a Nonlinear Autoregressive Network with External Input
% inputDelays = 1:1; feedbackDelays = 1:1; hiddenLayerSize = 10;
1. What makes you think these are appropriate inputs??
% net = narxnet(inputDelays,feedbackDelays,hiddenLayerSize);
% net.inputs{1}.processFcns = {'removeconstantrows','mapminmax'};
% net.inputs{2}.processFcns = {'removeconstantrows','mapminmax'};
2. Why bother? The last 2 statements are defaults.
% [inputs,inputStates,layerStates,targets] = preparets(net,inputSeries,{},targetSeries);
% net.divideFcn = 'dividerand'; % Divide data randomly
% net.divideMode = 'value'; % Divide up every value
3. The last 2 statements are inappropriate for time series (see the sketch after this list).
% net.divideParam.trainRatio = 70/100; net.divideParam.valRatio = 15/100;
% net.divideParam.testRatio = 15/100;
% net.trainFcn = 'trainlm'; % Levenberg-Marquardt
% net.performFcn = 'mse'; % Mean squared error
% net.plotFcns = {'plotperform','plottrainstate','plotresponse', ...
% 'ploterrcorr', 'plotinerrcorr'};
4. Why bother? The last 6 statements are defaults.
% % Train the Network
% [net,tr] = train(net,inputs,targets,inputStates,layerStates);
% if true % code
% end
5. What is "if true ...etc... end" supposed to do?
6. You still have to close the loop and continue training (a sketch follows below).
7. See
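For points 3 and 6, a minimal sketch might look like the following, reusing inputSeries and targetSeries from the posted code. The delays and layer size are just the posted values, and divideblock is one common choice for time-series division, not the only possible fix:

net = narxnet(1:1, 1:1, 10);

% 3. Divide the data into contiguous blocks instead of random columns
net.divideFcn              = 'divideblock';
net.divideParam.trainRatio = 0.70;
net.divideParam.valRatio   = 0.15;
net.divideParam.testRatio  = 0.15;

% Open-loop training first, with the true targets used as feedback
[Xo, Xio, Aio, To] = preparets(net, inputSeries, {}, targetSeries);
[net, tr] = train(net, Xo, To, Xio, Aio);

% 6. Close the loop and continue training so the net learns with its own feedback
netc = closeloop(net);
[Xc, Xic, Aic, Tc] = preparets(netc, inputSeries, {}, targetSeries);
[netc, trc] = train(netc, Xc, Tc, Xic, Aic);

% Multi-step-ahead prediction with the closed-loop net
Yc = netc(Xc, Xic, Aic);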
0 comments
More Answers (1)
Greg Heath
on 30 Jun 2012
For the fitting net I assume you are using
x = [input(:,2:end); target(:,1:end-1)];
t = target(:,2:end);
size(input) = ?
size(target) = ?
numHidden = ?
net.divideParam = ?
R2trn ~ 0.985
R2val = ?
R2tst = ?
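For reference, the three R^2 values can be read off the training record tr returned by train; a minimal sketch, assuming the x and t defined above, with R^2 taken as 1 minus the normalized MSE:

y = net(x);   % network outputs for all columns of x
e = t - y;    % errors

R2trn = 1 - mean(e(tr.trainInd).^2) / var(t(tr.trainInd), 1);
R2val = 1 - mean(e(tr.valInd).^2)   / var(t(tr.valInd),   1);
R2tst = 1 - mean(e(tr.testInd).^2)  / var(t(tr.testInd),  1);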
How is the timeseries net configured? Please include code.
Hope this helps.
Greg
3 comments