Hi,
When I train any neural network I get the warning below. It still trains usable networks, but I'd like to know what the warning means.
% Warning: 'trainRation' is not a legal parameter.
% > In nntest.param>do_test (line 63)
% In nntest.param (line 6)
% In network/subsasgn>setDivideParam (line 1838)
% In network/subsasgn>network_subsasgn (line 460)
% In network/subsasgn (line 14)
% In NN_Training (line 78)
I'm using the code below to train the networks, but I don't know why trainRation is causing the warning.
net = fitnet(current_neuron_count, TRAIN_FCN);
net.divideParam.trainRation = 70/100;
net.divideParam.valRation = 15/100;
net.divideParam.testRation = 15/100;
Thanks

Accepted Answer

Walter Roberson on 9 Mar 2019


The correct parameter names are trainRatio, valRatio, testRatio.
No final 'n': trainRatio, not trainRation.
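With the corrected spellings, the division parameters from the question would be set like this (a sketch reusing the question's own variables, current_neuron_count and TRAIN_FCN):

```matlab
% 'Ratio', not 'Ration' -- misspelled fields are rejected with the
% "not a legal parameter" warning and silently ignored
net = fitnet(current_neuron_count, TRAIN_FCN);
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio   = 15/100;
net.divideParam.testRatio  = 15/100;
```

Since 70/100, 15/100, and 15/100 match the toolbox defaults anyway, the trained networks behaved the same either way, which is why they were still usable.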

3 comments

Mohamed Abdelsamie on 9 Mar 2019
Thanks Walter!
Would that cause problems in how the data is divided?
Walter Roberson on 9 Mar 2019
The versions with 'Ration' would have been ignored, leaving you with the default ratios.
Mohamed Abdelsamie on 9 Mar 2019
Thanks a lot Walter!


More Answers (1)

alsharif taha on 5 Dec 2020


When I train this network I get warnings. Please help me.
clc
clear
close all
p=[1:10 10:10:100];
t= (p.^2);
net=newff(p,t,[3], {'logsig' 'purelin'});
net.divideParam.trainRatio=1;
net.divideParam.testRatio=0;
net.divideParam.valRatio=0;
net.divideParam.lr=0.01;
net.divideParam.min_grad=1e-20;
net.divideParam.goal=1e-30;
net.divideParam.epochs=300;
net=train(net,p,t);
plot([1:100] .^2,'x')
hold on
plot(round(net(1:100)),'o')
plot(p,t, '*g')
legend('real target', 'output of net', 'training samples', 'location', 'northwest')
The warning messages are:
Warning: 'min_grad' is not a legal parameter.
Warning: 'min_grad' is not a legal parameter.
Warning: 'min_grad' is not a legal parameter.
Also, although I set epochs to 300, training continues until it reaches 1000 epochs. I don't know why. Please help me.

1 comment

Walter Roberson on 5 Dec 2020
min_grad is a training parameter for https://www.mathworks.com/help/deeplearning/ref/traingdx.html, not for divideParam.
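Following that, a sketch of the question's code with the training options moved onto net.trainParam, where they belong (the data-division ratios stay on net.divideParam):

```matlab
% Training options (lr, min_grad, goal, epochs) live on net.trainParam;
% setting them on net.divideParam triggers "not a legal parameter" and
% leaves the defaults in effect -- hence training running to 1000 epochs.
net = newff(p, t, 3, {'logsig' 'purelin'});
net.divideParam.trainRatio = 1;     % use all samples for training
net.divideParam.valRatio   = 0;
net.divideParam.testRatio  = 0;
net.trainParam.lr       = 0.01;     % learning rate
net.trainParam.min_grad = 1e-20;    % minimum gradient stop criterion
net.trainParam.goal     = 1e-30;    % performance goal
net.trainParam.epochs   = 300;      % now actually limits training to 300 epochs
net = train(net, p, t);
```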


Version

R2018b
