Training network stopping automatically after 3 iterations without showing any error.

% Solve the ODE dy/dt = t^2 + 2 on [0, 10] and use the solution as training data
tspan = 0:0.001:10;
y0 = 0;
[t,y] = ode45(@(t,y) t^2 + 2, tspan, y0);

% Split the solution into training (first 90%) and held-out (last 10%) portions
idx = round(0.9*numel(t));
T = t(1:idx);
Y = y(1:idx);
x = t(idx+1:end);
v = y(idx+1:end);
% Standalone example of the custom activation (this variable is not used by the network below)
layer = functionLayer(@(X) X./(1 - X.^2));
% Network: scalar sequence input, two (fully connected -> tanh -> custom "softsign") blocks, regression output
layers = [
sequenceInputLayer(1)
fullyConnectedLayer(1)
tanhLayer
functionLayer(((@(t) t./(1 -t.^2))),Description="softsign")
fullyConnectedLayer(1)
tanhLayer
functionLayer(((@(t) t./(1 -t.^2))),Description="softsign")
regressionLayer]
% Training options: Adam with a piecewise learning-rate schedule
options = trainingOptions('adam', ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropFactor',0.2, ...
    'LearnRateDropPeriod',5, ...
    'MiniBatchSize',20, ...
    'VerboseFrequency',1, ...
    'ValidationPatience',Inf, ...
    'MaxEpochs',100, ...
    'Plots','training-progress')
% Train on the 90% split, then predict over the full time span
net = trainNetwork(T', Y', layers, options);
ypre = predict(net, tspan);

% Overlay the prediction and the ode45 solution for comparison
plot(tspan, ypre)
hold on
plot(t, y)
Answers (1)
Prateek Rai
22 Feb 2022
Hi,
Training of the network stopped because the training loss became NaN. This implies that the predictions of the network may contain NaN values.
On analyzing the network, I found that the size of all the layers is 1×1×1, which is why NaN values are appearing.
You might want to recheck the dimensions of the network's layers using:
analyzeNetwork(layers)
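Another thing worth checking (a sketch of my own, based on the activation definition in your code, not something analyzeNetwork reports) is the custom function layer itself. The preceding tanhLayer outputs values in (-1, 1), and t./(1 - t.^2) has poles at t = ±1, so its output can grow very large, or become Inf once tanh saturates to ±1 in double precision, which is one way a training loss ends up as NaN:
f = @(t) t./(1 - t.^2);                 % custom activation from the question
tProbe = tanh([-20 -5 -1 0 1 5 20]);    % typical tanh outputs, including saturated ones
aProbe = f(tProbe)                      % activation values: very large or Inf near ±1
hasNonFinite = any(~isfinite(aProbe))   % flags values that would poison the loss
If a softsign activation was intended, note that the usual definition is x./(1 + abs(x)), which stays bounded.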
1 comment
Image Analyst
5 Apr 2022
I get the same error trying to train on 448 images, and my layers are not 1×1×1 -- they're 227×227×3.
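A generic pre-training check may help here (my sketch; XTrain and YTrain are placeholder names for whatever numeric arrays are passed to trainNetwork): a single NaN or Inf in the inputs or responses is enough to drive the training loss to NaN regardless of the layer sizes.
% Replace XTrain/YTrain with your actual training arrays (placeholder names)
assert(all(isfinite(XTrain(:))), 'Input data contains NaN or Inf values.')
assert(all(isfinite(YTrain(:))), 'Response data contains NaN or Inf values.')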
