View the weights of a neural network created using the GUI
I am new to the Neural Network Toolbox and have created a fitting network in the GUI with up to six inputs and one output. I want to see which inputs are the most important in the training of the network. Is there a way to view the individual weights applied to each input in the network? Thanks for any help.
Accepted Answer
Greg Heath
on 19 June 2012
If you want to rank input importance, looking at the input weights is not sufficient.
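They are easy to look at, though. A minimal sketch, assuming the trained net was exported from the GUI to the workspace as net:
IW = net.IW{1,1}   % hidden-layer input weights, one column per input variable
LW = net.LW{2,1}   % output-layer weights, one column per hidden node
b1 = net.b{1}      % hidden-layer biases
b2 = net.b{2}      % output-layer bias
The columns of net.IW{1,1} correspond to the inputs, but their magnitudes alone do not measure importance, which is why a ranking procedure like the one below is needed.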
1. Standardize input and target variables (zero-mean/unit-variance)
2. Train the net using all of the variables
3. The simplest way to obtain input rankings is to rank the individual increases in MSE when each input variable row is replaced by a zero-variance row whose values all equal the mean of the original row (see the sketch after this list).
replx(i,:) = repmat(mean(x(i,:),2), 1, N);
4. The next simplest way is to continue training the original net after the constant row replacement before measuring the increase in MSE.
5. A more convincing method is to replace each input variable row with a random permutation of its values. This can be repeated Ntrials times (e.g., Ntrials = 20) for each row.
replx(i,:) = x(i, randperm(N));
6. Same as 5 except that training is continued after each random permutation.
7. Each ranking procedure can be extended to a sequential reduction of variables by discarding the lowest ranked input after each step.
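Here is a rough sketch of steps 1-3 and 5 (the names xn, dMSEconst, dMSEperm and the hidden-layer size of 10 are placeholders, not part of the recipe):
[xn, xs] = mapstd(x);                        % step 1: zero-mean/unit-variance inputs
[tn, ts] = mapstd(t);                        % step 1: standardized targets
net = fitnet(10);                            % step 2: train on all variables
net = train(net, xn, tn);
mse0 = perform(net, tn, net(xn));            % reference MSE of the trained net
[I, N]  = size(xn);
Ntrials = 20;
dMSEconst = zeros(I,1);                      % step 3: constant-row replacement
dMSEperm  = zeros(I,1);                      % step 5: random-permutation replacement
for i = 1:I
    xrep = xn;
    xrep(i,:) = zeros(1,N);                  % rows are standardized, so the row mean is zero
    dMSEconst(i) = perform(net, tn, net(xrep)) - mse0;
    d = zeros(Ntrials,1);
    for k = 1:Ntrials
        xrep(i,:) = xn(i, randperm(N));      % one random permutation of row i
        d(k) = perform(net, tn, net(xrep)) - mse0;
    end
    dMSEperm(i) = mean(d);
end
[~, rankconst] = sort(dMSEconst, 'descend')  % most important inputs first
[~, rankperm]  = sort(dMSEperm,  'descend')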
Other toolboxes contain non-neural statistical methods for ranking and reducing variables. However, using a neural approach for a neural model usually yields the best results.
Hope this helps.
Greg
5 comments
Greg Heath
on 24 June 2012
If the input variables are standardized, the row means are zero. Therefore, just use
replx(i,:) = zeros( 1, N);
instead of
replx(i,:) = repmat(mean(x(i,:),2), 1, N);
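A quick way to see the equivalence, assuming x was standardized with mapstd:
xn = mapstd(x);              % every row now has (numerically) zero mean
max(abs(mean(xn, 2)))        % ~0, up to rounding error
replx = xn;
replx(i,:) = zeros(1, N);    % same as repmat(mean(xn(i,:),2), 1, N)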
More Answers (1)
Georgios Mavromatidis
on 30 June 2012
Hello Greg and Sara.
One quick question: if I have binary variables, i.e. 0 or 1, should I normalise them as well?
1 comment
Greg Heath
on 1 July 2012
If you have binary inputs, use {-1,1} and tansig hidden node activation units.
If you have independent binary outputs, use {0,1} and logsig hidden node activation units.
If you have unit-sum binary outputs (e.g., classification), use {0,1} and softmax hidden node activation units.
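A minimal sketch of how these choices could be set on a shallow net (the hidden-layer size of 10 and the variable x01 are placeholders; I am applying the logsig/softmax choices to the output layer, which is where target-side activations are normally placed):
xb = 2*x01 - 1;                           % assumes x01 holds the 0/1 binary inputs
net = fitnet(10);                         % hidden transferFcn is tansig by default
net.layers{1}.transferFcn = 'tansig';     % binary {-1,1} inputs
net.layers{2}.transferFcn = 'logsig';     % independent {0,1} binary targets
% net.layers{2}.transferFcn = 'softmax';  % unit-sum {0,1} class targets instead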