How to train a feedforward network to solve the XOR function
I'm new to MATLAB, so please excuse me if this is a stupid question (and sorry for my English).
I'm trying to train a feedforward network to solve the XOR function:
1 hidden layer with 2 neurons, other settings at their defaults: TANSIG, backprop, TRAINLM, LEARNGDM, MSE.
MATLAB version: R2012b.
close all, clear all, clc, format compact
p = [0 1 0 1 ; 0 0 1 1];            % XOR inputs (one pattern per column)
t = [0 1 1 0];                      % XOR targets
net = feedforwardnet(2,'trainlm');  % one hidden layer with 2 neurons
net = train(net,p,t);
a = net(p)
I've tried this code, and I've also tried 'nntool' and 'nnstart'. It always seems like the training algorithm splits the 'p' set into:
2 samples - training set,
1 sample - validation set,
1 sample - test set.
As a result, the network trains on partial data (2 pairs of digits instead of 4), training stops with 'Validation stop' or 'Minimum gradient reached (1.00e-010)' after very few iterations (1-10), and simulation shows that the network is untrained.
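I guess the split could be checked from the training record that train returns (something like this?):
[net,tr] = train(net,p,t);
tr.trainInd   % indices of samples used for training
tr.valInd     % indices of samples used for validation
tr.testInd    % indices of samples used for testing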
- Is my guess right (about splitting the 'p' set)?
- How can I manually give validation data (input and output sets) to the training algorithm?
- Should I somehow expand the 'p' and 't' sets and then use divideblock?
- Any other ideas?
Thanks!
0 comments
Accepted Answer
Greg Heath
on 16 Feb 2013
Edited: Greg Heath on 16 Feb 2013
1. [ I N ] = size(x) % [ 2 4 ]
[ O N ] = size(t) % [ 1 4 ]
Neq = prod(size(t)) % 4 = No. of training equations
2. For this small data set it doesn't make sense to use data division for validation stopping. So,
net.divideFcn = 'dividetrain'; % or, equivalently, = '';
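If you do want to supply the division manually instead (as asked above), the toolbox also accepts explicit index lists; a sketch, with the index choices being arbitrary:
net.divideFcn = 'divideind';            % divide by explicit indices
net.divideParam.trainInd = 1:4;         % columns used for training
net.divideParam.valInd   = [];          % no validation set
net.divideParam.testInd  = [];          % no test set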
3. Since the number of estimated weights for H hidden nodes is
Nw = (I+1)*O = 3 for H = 0
Nw = (I+1)*H + (H+1)*O for H > 0
the condition Neq >= Nw yields the following upper bound for H:
Hub = (Neq-O)/(I+O+1) % 3/4
which is only possible for H = 0 (no hidden layer). However, from a 2-dimensional plot we know that it will take at least 2 hidden nodes to separate the "0" class diagonal corners [ 0 1 ; 0 1 ] from the "1" class diagonal corners [ 1 0 ; 0 1 ].
Consequently, for H = 2, Nw = 9 > Neq = 4, so there will be an infinite number of solutions for the weights.
net = patternnet(2); % for classification
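As a quick numeric check of the formulas above (a worked sketch):
I = 2; O = 1; Neq = 4;           % sizes from step 1
Hub = (Neq-O)/(I+O+1)            % = 0.75, so Neq >= Nw holds only for H = 0
H = 2;
Nw  = (I+1)*H + (H+1)*O          % = 9 > Neq = 4: underdetermined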
4. Choose MSEgoal so that the coefficient of determination (R^2; see Wikipedia) is >= 0.99. Then the model will account for at least 99% of the biased target variance:
net.trainParam.goal = 0.01*var(t',1);
5. The success of the design depends on the placement of the random initial weights, so it may be necessary to make Ntrials >= 10 separate designs (use a for loop; see the sketch after these steps).
6. When training the net, use the extended output form
[ net tr y e ] = train(net,x,t);
Then, everything you need to know, besides the output y and error e, can be obtained directly from the training structure tr.
7. It is STRONGLY recommended that somewhere along the line you should investigate the contents of tr.
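Putting steps 2-6 together, here is a minimal sketch of such a multi-design loop (the bookkeeping names Ntrials, bestR2, bestnet are illustrative, not toolbox names):
x = [0 1 0 1; 0 0 1 1];                  % XOR inputs
t = [0 1 1 0];                           % XOR targets
Ntrials = 10;
bestR2  = -Inf;
for i = 1:Ntrials
    net = patternnet(2);                 % 2 hidden nodes, fresh random init
    net.divideFcn = 'dividetrain';       % no validation/test split (step 2)
    net.trainParam.goal = 0.01*var(t',1);% MSEgoal for R^2 >= 0.99 (step 4)
    [net, tr, y, e] = train(net,x,t);    % extended output form (step 6)
    R2 = 1 - mean(e.^2)/var(t',1);       % coefficient of determination
    if R2 > bestR2
        bestR2  = R2;
        bestnet = net;                   % keep the best design so far
    end
end
bestR2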
Hope this helps.
Greg
2 comments
Greg Heath
on 17 Feb 2013
Edited: Greg Heath on 17 Feb 2013
> net = feedforwardnet(2,'trainlm');
net = patternnet(2); % for classification
net = fitnet(2); % for regression
net = feedforwardnet(2); % NEVER
> net = train(net,p,t);
> a = net(p)
[ net, tr, a ] = train(net,p,t);
NMSE = tr.best_perf/var(t')
R2 = 1 - NMSE
> but in 1 of 10 experiments the network randomly falls into some local minimum and can't get out. The number of iterations goes to 250-500 and it breaks on 'minimum gradient reached', untrained.
No fault on your part; this is normal. That is precisely why you have to design multiple nets.
In general, you would rank the nets by their validation error and predict generalization error by using the test set error of the best net chosen by the validation set.
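For a data set large enough to split, that selection might look like the following sketch (the bookkeeping names are illustrative; tr.best_vperf and tr.best_tperf are the validation and test performances stored in the training record):
bestvperf = Inf;
for i = 1:Ntrials
    net = patternnet(H);             % fresh random initialization
    [net, tr] = train(net,x,t);      % default data division applies
    if tr.best_vperf < bestvperf     % rank designs by validation error
        bestvperf = tr.best_vperf;
        besttperf = tr.best_tperf;   % generalization estimate from test set
        bestnet   = net;
    end
end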
More Answers (4)
Albert
on 16 Feb 2013
2 comments
Greg Heath
on 17 Feb 2013
Edited: Greg Heath on 17 Feb 2013
Never use max_fail above 10.
Do you understand its function?
You don't need to change min_grad ... something else is wrong.
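For reference, max_fail is the number of consecutive epochs the validation error is allowed to rise before training stops, and it is set like this (a one-line sketch; 6 is the toolbox default):
net.trainParam.max_fail = 6;   % validation failures allowed before early stopping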
Sarita Ghadge
on 15 Sep 2017
clc; close all; clear all;
P = [0 0 1 1; 0 1 0 1]; T = [0 1 1 0];
net = feedforwardnet(200);        % one hidden layer with 200 neurons
net.trainFcn = 'trainbr';         % Bayesian regularization
net.divideFcn = 'dividetrain';    % train on all samples
[net, tr] = train(net,P,T)
a = net(P(:,1))
a = net(P(:,2))
a = net(P(:,3))
a = net(P(:,4))
It works for XOR using feedforwardnet with >= 150 hidden neurons.
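As a side note, the four simulations above can be collapsed into a single call:
a = net(P)   % outputs for all four XOR patterns at once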
0 comments
ga
on 21 May 2024
Train the neural network using a two-input XOR gate, given the initial values:
w1 = 0.9;
w2 = 1.8;
b = -0.9;
Requirements:
- Analyze the steps to train a perceptron neural network.
- Program the training using MATLAB software.
- Use nntool for investigation and analysis.
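For reference, the classic perceptron update rule with those initial values would look roughly like this (a sketch; note that a single perceptron cannot represent XOR, which is not linearly separable, so do not expect this loop to converge on XOR targets):
w = [0.9; 1.8]; b = -0.9;              % given initial weights and bias
P = [0 0 1 1; 0 1 0 1];                % two-input patterns
T = [0 1 1 0];                         % XOR targets
lr = 1;                                % learning rate (assumed)
for epoch = 1:20
    for k = 1:4
        y = double(w'*P(:,k) + b > 0); % hardlim activation
        e = T(k) - y;                  % perceptron error
        w = w + lr*e*P(:,k);           % weight update rule
        b = b + lr*e;                  % bias update rule
    end
end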
0 comments