Strange neural network output

minomic on 22 June 2015
Commented: minomic on 23 June 2015
Hi, I am trying to use the Neural Network Toolbox, but I am having trouble calculating the output of a network. I will try to explain my problem: I have defined a very simple ANN with one hidden layer and linear activation functions. So if I have an input x, I expect the output of the hidden layer to be
h = w * x + b
where w are the weights and b the biases. Then I expect my output to be
o = w' * h + b'
where w' are the weights between the hidden layer and the output and b' the biases.
Now the problem is that if I do
o = net(x)
this doesn't happen. Here is my code:
net = feedforwardnet(layer1, 'traincgp');   % layer1 = hidden layer size
net = configure(net, Dtrain, Dtrain);       % autoencoder: input = target
net.trainParam.epochs = 0;
net.IW{1,1} = weights12;                    % input -> hidden weights
net.LW{2,1} = my_weights;                   % hidden -> output weights
net.b{1} = bias12;                          % hidden layer biases
for ii = 1:numel(net.layers)
    net.layers{ii}.transferFcn = 'purelin'; % linear activations everywhere
end
net = train(net, Dtrain, Dtrain);
As you can see, I am training for 0 epochs since this is just a test, and I am using Dtrain as both input and target since I am training an autoencoder. As I said, the problem is that if I calculate the output manually as described above I get one result, while if I do
output = net(input)
I get a different one. What should I do to get the same result?
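For concreteness, a minimal sketch of the comparison described above, using the variable names from the code and taking a single sample column from Dtrain:

% Manual forward pass with linear (purelin) activations, using the weights
% assigned above; net.b{2} keeps its initialized value, since the code above
% never sets it.
x = Dtrain(:, 1);                        % any single sample
h = weights12*x + bias12;                % expected hidden-layer output
o_manual = my_weights*h + net.b{2};      % expected network output
o_net    = net(x);                       % what the toolbox actually returns
max(abs(o_manual - o_net))               % expected ~0, but it is not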

Accepted Answer

Greg Heath on 23 June 2015
Edited: Greg Heath on 23 June 2015
Just modify the following
close all, clear all, clc, tic
[x, t] = simplefit_dataset;
[I, N] = size(x), [O, N] = size(t)                 % I = 1, O = 1, N = 94
net = fitnet;
net.input.processFcns  = { 'removeconstantrows' }; % drop the default mapminmax
net.output.processFcns = { 'removeconstantrows' };
rng('default')
net = train(net, x, t);
NMSE1 = mse(t - net(x))/var(t)                     % 1.7057e-05
IW = net.IW{1,1}                                   % [10 1]
b1 = net.b{1}                                      % [10 1]
b2 = net.b{2}                                      % [1 1]
LW = net.LW{2,1}                                   % [1 10]
B1 = repmat(b1, I, N)                              % [10 94]
B2 = repmat(b2, O, N)                              % [1 94]
y = B2 + LW*tanh(B1 + IW*x);                       % [1 94]; tanh = tansig, the default hidden transferFcn
NMSE2 = mse(t - y)/var(t)                          % 1.7057e-05
Hope this helps.
Thank you for formally accepting my answer
I will let you figure out how to handle:
1. The default mapminmax normalization (see the sketch below)
2. Multiple inputs and outputs.
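For point 1, a minimal sketch of one way to handle the default mapminmax normalization, assuming the default processFcns ({'removeconstantrows','mapminmax'}) are left in place, so that the mapminmax settings sit in the second entry of processSettings (simplefit_dataset has no constant rows, so removeconstantrows changes nothing here):

% Replicate net(x) by hand while keeping the default processing.
[x, t] = simplefit_dataset;
N = size(x, 2);                             % number of samples
rng('default')
net = fitnet;                               % default processFcns kept
net = train(net, x, t);

psx = net.inputs{1}.processSettings{2};     % input mapminmax settings
pst = net.outputs{2}.processSettings{2};    % output mapminmax settings

xn = mapminmax('apply', x, psx);            % normalize the input as the net does
IW = net.IW{1,1};  LW = net.LW{2,1};
B1 = repmat(net.b{1}, 1, N);
B2 = repmat(net.b{2}, 1, N);
yn = B2 + LW*tanh(B1 + IW*xn);              % output in normalized space
y  = mapminmax('reverse', yn, pst);         % undo the output normalization

NMSE = mse(t - y)/var(t)                    % should match mse(t - net(x))/var(t)

Point 2 follows the same pattern, applying the corresponding processSettings entry and weight block to each input and output separately.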
1 comment
minomic on 23 June 2015
Thank you for your answer. In the meantime I managed to solve the problem by avoiding 'feedforwardnet' and building the network from scratch with the 'network' function. Anyway, I am going to accept this answer since I am sure it works as well.
Cheers,
minomic
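A minimal sketch of that from-scratch route, assuming a single input, one hidden layer, all-linear activations, and no processing functions; the sizes and weight values below are hypothetical placeholders, not the original Dtrain/weights12 values:

% Two-layer feedforward net built directly with
% network(numInputs, numLayers, biasConnect, inputConnect, layerConnect, outputConnect)
net = network(1, 2, [1; 1], [1; 0], [0 0; 1 0], [0 1]);
net.inputs{1}.size = 5;                     % hypothetical input dimension
net.layers{1}.size = 3;                     % hypothetical hidden layer size
net.layers{2}.size = 5;                     % autoencoder: output size = input size
net.layers{1}.transferFcn = 'purelin';      % linear activations, as in the question
net.layers{2}.transferFcn = 'purelin';
net.IW{1,1} = randn(3, 5);                  % stand-in for weights12
net.LW{2,1} = randn(5, 3);                  % stand-in for my_weights
net.b{1}    = randn(3, 1);                  % stand-in for bias12
net.b{2}    = zeros(5, 1);

% With no processFcns attached, net(x) matches the plain linear formula:
x = randn(5, 1);
max(abs(net(x) - (net.LW{2,1}*(net.IW{1,1}*x + net.b{1}) + net.b{2})))   % ~0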
