Neural network with limited datasets

rIznaldi
rIznaldi on 18 Jun 2014
Commented: rIznaldi on 11 Jul 2014
Hi all,
I am developing a back-propagation neural network to classify the incidence of crises (crisis = 1; non-crisis = 0) from 15 covariates (a set of macro and economic indicators). I have annual data for 1970-2012 (42 observations), which I consider quite small for this exercise.
My questions are:
1. Is it okay to proceed with the BP simulation given such a small dataset and a relatively high number of covariates?
2. When I run the simulation, the results keep changing from run to run, even though I use the same dataset and the same training and test data. I am curious why this happens.
3. Any idea what the most appropriate classification method is for handling small datasets?
Your responses are highly appreciated.
Thanks
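For concreteness, a minimal sketch of the setup being described, with random placeholder data and hypothetical variable names (the explicit 'logsig' output is a defensive choice for a single-row 0/1 target; nothing here is prescribed by the thread):

% 15 covariates x 42 annual observations, binary crisis target (placeholders)
inputs  = rand(15, 42);                  % stand-in for the macro/economic indicators
targets = double(rand(1, 42) > 0.8);     % stand-in for crisis (1) / non-crisis (0) labels
net = patternnet(10);                    % 10 hidden nodes, toolbox defaults
net.layers{2}.transferFcn = 'logsig';    % sigmoid output suits a single 0/1 target row
[net, tr] = train(net, inputs, targets); % default random 70/15/15 trn/val/tst division
predicted = net(inputs) > 0.5;           % threshold the network output to get labels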

Accepted Answer

Greg Heath
Greg Heath on 18 Jun 2014
Edited: Greg Heath on 18 Jun 2014
[ I N ] = size(inputs) % [ 15 42 ]
[ O N ] = size(targets) % [ 1 42 ]
Ntrn = N -2*round(0.15*N) % 30 default (6 val and 6 test)
Ntrneq = Ntrn*O % 30 training equations
%For an I-H-O node topology, the number of unknown weights is
Nw = (I+1)*H+(H+1)*O
% Therefore, Ntrneq > Nw <==> H <= Hub where
Hub = -1+ceil((Ntrneq-O)/(I+O+1)) % 1
Try to minimize H while achieving an adjusted R-squared >= 0.99. I have posted many examples. Search on
greg patternnet Ntrials R2a
You may also wish to use 10-fold cross-validation to obtain more precise estimates of the error rates.
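A minimal sketch of that search, assuming the inputs, targets, I, O, Ntrneq and Hub defined above (Ntrials, the rng seed, the 'logsig' output and the all-data error used here are simplifying choices, not a prescription):

% Search for the smallest H whose adjusted R^2 reaches 0.99 over Ntrials
% random-weight repetitions (hypothetical settings).
Ntrials = 10;                               % repetitions per candidate H
MSE00   = mean(var(targets', 1));           % reference MSE of the naive constant model
bestH = NaN;  bestR2a = NaN;
rng(0)                                      % repeatable splits and initial weights
for H = 1:Hub
    for trial = 1:Ntrials
        net = patternnet(H);
        net.layers{2}.transferFcn = 'logsig';   % sigmoid output for a 0/1 target row
        [net, tr] = train(net, inputs, targets);
        y    = net(inputs);
        Nw   = (I + 1)*H + (H + 1)*O;           % number of weights for this H
        MSE  = mean((targets - y).^2);          % plain MSE over all N cases, for simplicity
        MSEa = MSE*Ntrneq/(Ntrneq - Nw);        % degree-of-freedom adjusted MSE
        R2a  = 1 - MSEa/MSE00;                  % adjusted R^2
        if R2a >= 0.99                          % good enough: keep the smallest such H
            bestH = H;  bestR2a = R2a;
            break
        end
    end
    if ~isnan(bestH), break, end
end
bestH                                        % stays NaN if no candidate reached 0.99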
Hope this helps.
Thank you for formally accepting my answer
Greg
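Regarding the 10-fold cross-validation suggestion, a minimal sketch with plain index bookkeeping (no extra toolboxes assumed; H = 1 and the 0.5 threshold are placeholder choices):

% 10-fold cross-validated error rate (hypothetical settings; uses the
% inputs, targets and N defined above).
k = 10;
rng(0)                                   % repeatable fold assignment
fold = mod(randperm(N), k) + 1;          % fold label 1..k for each of the N cases
H = 1;                                   % e.g., the smallest adequate H found earlier
errRate = zeros(1, k);
for f = 1:k
    tst = (fold == f);                   % hold out one fold for testing
    trn = ~tst;
    net = patternnet(H);
    net.layers{2}.transferFcn = 'logsig';
    net.divideFcn = 'dividetrain';       % train on all supplied cases (no val/tst split)
    net = train(net, inputs(:, trn), targets(trn));
    yhat = net(inputs(:, tst)) > 0.5;    % thresholded predictions on the held-out fold
    errRate(f) = mean(yhat ~= (targets(tst) > 0.5));
end
cvErrorRate = mean(errRate)              % average held-out error rate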
2 comments
Greg Heath
Greg Heath on 18 Jun 2014
The variation in results that you experienced comes from the default random trn/val/tst data division and the random initial weights. Initializing the RNG to a specified state will yield repeatable results.
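For instance, a minimal illustration assuming the inputs/targets from the question (the seed value 0 is arbitrary):

rng(0)                                    % fix the global random-number generator
net = patternnet(10);
[net, tr] = train(net, inputs, targets);  % identical division and initial weights on every run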
rIznaldi
rIznaldi on 11 Jul 2014
Thank you Greg for your answer.


More Answers (0)
