Adapt (Neural Network Toolbox) - Error using .* Matrix dimensions must agree
I am using a feedforward neural network to represent the Q-function of a Q-learning application.
net = feedforwardnet([20 10]);
I am attempting to use incremental updates via the adapt function
net = adapt(net,s,t);
but get the following error message:
Error using .*
Matrix dimensions must agree.
Error in nn7.grad2 (line 104)
gN{i} = Fdot .* gA{i};
Error in adaptwb>adapt_network (line 100)
[gB,gIW,gLW] = nn7.grad2(net,[],PD(:,:,ts),BZ,IWZ,LWZ,N,Ac(:,ts+AcInd),gE,Q,1,hints);
Error in adaptwb (line 37)
[out1,out2,out3] = adapt_network(in1,in2,in3,in4);
Error in network/adapt (line 108)
[net,Ac,tr] = feval(net.adaptFcn,net,Pd,T,Ai);
Error in QLearningImplementation1_3_17 (line 88)
net = adapt(net,s,t);
My inputs and outputs are as follows:
Inputs (s): 14x20 matrix
Targets (t): 5x20 matrix
I believe that the gradient descent function is throwing the error because the output layer is linear, but all of the built-in functions in MATLAB are pretty cryptic and I'm having trouble figuring out where the error is actually coming from.
I've tried the following (without success):
using con2seq to convert the input and target data types to sequences
using 'train' instead (I'm not generating enough data per iteration, and the network just refuses to train).
Thanks in advance!
EDIT: RESOLVED. This issue was caused by initializing the network using:
net = configure(net,x_init,t_init);
prior to using adapt. I did this because I wanted to be able to get initial estimates (albeit inaccurate ones) on my first iteration. I changed the code so that for the first iteration the actions are entirely random (Epsilon = 0) and therefore the Q value estimates do not come from the neural network until the second iteration. This resolved the issue.
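For reference, here is a minimal sketch of the resolved flow described above. The helpers and names numIterations, selectRandomAction, selectEpsilonGreedyAction and buildBatch are hypothetical stand-ins for the actual Q-learning code, which is not shown here:
net = feedforwardnet([20 10]);  % note: no configure(net,x_init,t_init) call here
for iter = 1:numIterations
    if iter == 1
        % First iteration: act entirely at random, so no Q-value estimates
        % are requested from the not-yet-configured network.
        a = selectRandomAction();              % hypothetical helper
    else
        q = net(s);                            % Q-value estimates, from iteration 2 onward
        a = selectEpsilonGreedyAction(q);      % hypothetical helper
    end
    [s, t] = buildBatch(a);                    % hypothetical: 14x20 states, 5x20 targets
    net = adapt(net, s, t);                    % incremental update (works without a
end                                            % prior configure call, per the edit above)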
Answers (1)
Greg Heath
on 4 Jan 2017
0. NOTE: Quotation marks are only used for emphasis
1. Generally, N = 20 "N"umber of instances is not enough to adequately define an I = 14-dimensional "I"nput distribution. Typically, more than 3 instances per dimension are necessary and 10 to 30 per dimension are usually adequate. (e.g., How many 2-dimensional points are needed to adequately characterize a noisy parabola?)
2. N = 20 instances of an O=5 dimensional "O"utput target yields
Neq = N*O = 100 equations
3. The MATLAB default number of training instances is Ntrn = 0.7*N resulting in
Ntrneq = 0.7*Neq = 70 training equations
4. The number of unknown weights in an I-H1-H2-O feedforwardnet( [ H1, H2 ] ) net with H1 = 20, H2 = 10 "H"idden nodes trained with
[ I N ] = size( Input) = [ 14 20 ]
[ O N ] = size( OutputTarget) = [ 5 20 ]
is
Nw = (I+1)*H1 + (H1+1)*H2 + (H2+1)*O
= 15*20 + 21*10 + 11*5
= 300 + 210 + 55 = 565 unknowns
5. Obviously, 70 equations are insufficient for 565 unknowns.
6. Even using all of the data only yields 100 equations.
7. Bottom line: If you cannot get a lot more data, DRASTICALLY reduce the number of hidden nodes.
8. For example: use one hidden layer and reduce the number of hidden nodes so that there are at least as many equations as unknown weights.
9. Ordinarily that would mean
0.7*N*O >= (I+1)*H+(H+1)*O
or H <= Hub (upper bound), where
Hub = (0.7*N*O - O)/(I+O+1) = 65/20 = 3.25
(see the numeric check after this list).
10. Therefore, first try a single hidden layer with 3 hidden nodes.
11. There are various combinations of alternatives to consider, such as
a. Use TRAINBR with no validation set
b. Create additional data points by adding random noise to the input data
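For concreteness, the counting in items 2 through 9 can be checked with a few lines of MATLAB (plain arithmetic only, nothing toolbox-specific; H = 3 is the single-hidden-layer candidate from item 10):
I = 14; N = 20; O = 5;                        % input dim, number of instances, output dim
H1 = 20; H2 = 10;                             % hidden layer sizes from the question
H  = 3;                                       % single-hidden-layer candidate
Neq    = N*O                                  % 100 equations
Ntrneq = 0.7*Neq                              % 70 training equations
Nw2    = (I+1)*H1 + (H1+1)*H2 + (H2+1)*O      % 565 unknowns for [ 20 10 ]
Hub    = (0.7*N*O - O)/(I+O+1)                % 3.25, so H <= 3
Nw1    = (I+1)*H + (H+1)*O                    % 65 unknowns for H = 3, i.e. Nw1 <= Ntrneq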
Hope this helps.
Thank you for formally accepting my answer
Greg