validation error in neural network
Dear friends, I have tried to train an MLP using the newff and train functions, but after a few stages of training, validation stopping halts the training procedure. I would like to ask for any solution or alternative to prevent the validation stop.
Regards
Accepted Answer
Greg Heath
11 Feb 2015
Edited: Greg Heath, 11 Feb 2015
% 0. One hidden layer is sufficient provided there are enough hidden nodes.
% 1. Use t for target and y for output
% 2. For regression use the current FITNET instead of the obsolete NEWFF.
% 3. As indicated below, Validation Stopping is useful for preventing highly biased training data performance from influencing the design of a net which is ultimately created for use on NONDESIGN data. This is especially important when the number of training equations Ntrneq = Ntrn*O is not sufficiently greater than the number of unknown weights Nw = (I+1)*H+(H+1)*O.
% 4. Data divisions
%    DATA = DESIGN + NONDESIGN
%    DESIGN = TRAINING + VALIDATION
%    NONDESIGN = TEST + UNAVAILABLE
%    NONTRAINING = VALIDATION + NONDESIGN
% 5. Use the DESIGN TRAINING data to estimate weights. Using the same data to estimate performance can result in highly optimistic biased estimates.
% 6. Use the DESIGN VALIDATION data to obtain less biased performance estimates to
%    a. Prevent worse performance on NONDESIGN data.
%    b. Rank the performances of multiple designs.
%    c. Choose the multiple designs used to estimate summary performance statistics.
% 7. Assume the UNBIASED NONDESIGN TEST data performance is representative of the performance on UNAVAILABLE data.
% 8. Evaluate performance via the summary statistics of the UNBIASED performance on NONDESIGN (TEST + UNAVAILABLE) data chosen in 6c.
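The data divisions in point 4 map onto the toolbox's divide parameters. A minimal sketch of setting the split explicitly (the 0.70/0.15/0.15 ratios match the Ntrn/Nval/Ntst counts in the code below; `dividerand` is the toolbox default):

```matlab
% Sketch: explicit train/val/test ratios via the standard divideParam fields.
net = fitnet(10);                  % H = 10 is a placeholder, not a recommendation
net.divideFcn = 'dividerand';      % random division into three subsets
net.divideParam.trainRatio = 0.70; % DESIGN TRAINING data (weight estimation)
net.divideParam.valRatio   = 0.15; % DESIGN VALIDATION data (early stopping)
net.divideParam.testRatio  = 0.15; % NONDESIGN TEST data (unbiased estimate)
```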
close all, clear all, clc, plt=0
x = -5:.05:5;
t = (2*sin(x).*cos(3*x)+cos(10*x)).*sin(x);
[ I N ] = size(x) % [ 1 201]
[ O N ] = size(t) % [ 1 201]
Ntst = round(0.15*N) % 30
Nval = Ntst % 30
Ntrn = N-Nval-Ntst % 141
plt = plt+1, figure(plt)
plot( x, t, 'LineWidth', 2 )
% 17 local minima / 18 local maxima => 36 hidden nodes
H = 36
net = newff( x, t, H );
MSE00 = var(t',1) % 1.0041 Reference MSE
rng(4151941) % For duplicating the design
[ net tr y e ] = train (net, x, t );
hold on
plot( x, y, 'r', 'LineWidth', 2 )
NMSE = mse(e)/MSE00 % 2.5414e-3
R2 = 1-NMSE % 0.99746 Rsquared (See Wikipedia)
tr = tr % No semicolon
stopcrit = tr.stop % Validation stop
R2trn = 1-tr.best_perf/MSE00 % 0.99889
R2val = 1-tr.best_vperf/MSE00 % 0.99542
R2tst = 1-tr.best_tperf/MSE00 % 0.99277
% Using MSEtrn00 instead of MSE00 shouldn't make much difference (Use tr.trainInd to check if you don't believe me)
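Point 2 above recommends the current FITNET over the obsolete NEWFF. A hedged sketch of the same design with fitnet (note the different call signature: fitnet takes only the hidden-layer size, not x and t):

```matlab
% Sketch assuming the same x, t, H and rng seed defined above.
rng(4151941)
net = fitnet(H);                 % current regression net; replaces newff(x,t,H)
[net, tr, y, e] = train(net, x, t);
NMSE = mse(e)/var(t',1)          % normalized MSE, as with newff above
R2 = 1-NMSE                      % Rsquared
```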
Hope this helps.
Thank you for formally accepting my answer
Greg
More Answers (1)
Greg Heath
9 Feb 2015
The validation stopping occurs because the net is performing badly on nontraining design data. You don't want to overcome it; you want to start designing a better net that works well on BOTH design training data (train) AND design nontraining data (validation).
I typically use a double for-loop to train at least Ntrials = 10 different nets for each of ~10 candidate values of H = Hmin:dH:Hmax (<= Hub), the number of hidden nodes. The nets differ by the initial state of the random number generator, which determines both the initial weights AND the data division.
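The double for-loop described above can be sketched as follows (the values of Ntrials, Hmin, dH and Hmax are assumptions for illustration; see Greg's posted NEWSGROUP and ANSWERS examples for his exact versions):

```matlab
% Hedged sketch of the H-by-Ntrials search; not the exact posted code.
Ntrials = 10;
Hmin = 2; dH = 2; Hmax = 20;       % assumed candidate range for hidden nodes
MSE00 = var(t',1);                 % reference MSE
R2val = zeros(numel(Hmin:dH:Hmax), Ntrials);
j = 0;
for H = Hmin:dH:Hmax
    j = j+1;
    for i = 1:Ntrials
        rng(i)                     % sets initial weights AND data division
        net = fitnet(H);
        [net, tr] = train(net, x, t);
        R2val(j,i) = 1 - tr.best_vperf/MSE00;  % validation Rsquared for ranking
    end
end
```

The validation Rsquared matrix is then used to rank the candidate designs (point 6b above) before the unbiased test-set evaluation.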
I have posted many examples in the NEWSGROUP and ANSWERS. Search on subsets of
greg, Hmax or Hub, Ntrials, fitnet or patternnet
Hope this helps,
Thank you for formally accepting my answer
Greg