
How to adjust the derivatives of backpropagation according to a custom error function

I want to implement a custom error function in the backpropagation algorithm and force the weight updates to take the output-layer outputs into account.
The network consists of one hidden layer. The input is an (n,1) vector, and the output has the same dimensions.
Given an input and a target, I want to calculate the error function as
E = ((input*output - target)^2)/2
so that at each iteration the target is updated and should approach the initial target:
updatedTarget = input*output
According to this, the derivative is modified to
dE_dOutput(j) = sum(input .* netOutputLayerOutputs - initialTarget(j)) * input(j);
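As a cross-check: if the error is summed component-wise, E = sum over j of ((input(j)*output(j) - target(j))^2)/2 (an assumption on my part about the intended form), the chain rule gives
dE_dOutput(j) = (input(j)*netOutputLayerOutputs(j) - initialTarget(j)) * input(j);
with no inner sum over the other components; the summed expression above would only arise if every component fed a single shared residual.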
The relevant part of the backpropagation algorithm is provided below:
for i = 1:100
    for j = 1:length(input)
        % forward propagation
        netHiddenLayerValues(j) = sum(netHiddenLayerWeights(:,j) .* input(j)) + netBiasValue1 * 1;
        netHiddenLayerOutputs(j) = 1/(1 + exp(-netHiddenLayerValues(j)));
        netOutputLayerValues(j) = sum(netOutputLayerWeights(:,j) * netHiddenLayerOutputs(j)) + netBiasValue2 * 1;
        netOutputLayerOutputs(j) = 1/(1 + exp(-netOutputLayerValues(j)));
        % custom target function
        target(j) = input(j) * netOutputLayerOutputs(j);
        % backpropagation for the output layer:
        % customized error derivative with respect to the output layer outputs
        dE_dOutput(j) = sum(input .* netOutputLayerOutputs - initialTarget(j)) * input(j);
        %dE_dOutput(j) = -(target(j) - netOutputLayerOutputs(j)); % this is for standard MSE
        % partial derivative of the logistic function with respect to the network output value
        dOutput_dNetout(j) = netOutputLayerOutputs(j) * (1 - netOutputLayerOutputs(j));
        % partial derivative of the network output with respect to the weight
        dNetout_dw(j) = netHiddenLayerOutputs(j);
        % total error gradient for the output layer
        d_EtotalOut(j) = dE_dOutput(j) * dOutput_dNetout(j) * netHiddenLayerOutputs(j);
        % backpropagated error for the hidden layer
        d_EtotalHidden_dOut(j) = dE_dOutput(j) * dOutput_dNetout(j) * netOutputLayerWeights(j);
        dOut_dNetHidden(j) = netHiddenLayerOutputs(j) * (1 - netHiddenLayerOutputs(j));
        dNetHidden_dw(j) = input(j);
        d_EtotalHiddenOut(j) = d_EtotalHidden_dOut(j) * dOut_dNetHidden(j) * dNetHidden_dw(j);
        % update weights for the hidden layer
        netHiddenLayerWeights(:,j) = netHiddenLayerWeights(:,j) - eta * d_EtotalHiddenOut(j);
        % update weights for the output layer
        netOutputLayerWeights(:,j) = netOutputLayerWeights(:,j) - eta * d_EtotalOut(j);
    end
end
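For anyone who wants to run the loop above, here is a minimal setup sketch; the dimensions, signal shapes, initial weights, biases, and learning rate are all assumptions, not values from the original post:
n = 50;                                        % assumed vector length
input = linspace(0.1, 1, n)';                  % assumed (n,1) input vector
initialTarget = 0.5 + 0.4*sin(2*pi*(1:n)'/n);  % assumed (n,1) real target
target = initialTarget;                        % overwritten inside the loop
eta = 0.5;                                     % assumed learning rate
netBiasValue1 = 0.35; netBiasValue2 = 0.6;     % assumed bias values
netHiddenLayerWeights = 0.1*randn(n, n);       % assumed initial weights
netOutputLayerWeights = 0.1*randn(n, n);
% preallocate the per-component work vectors used in the loop
[netHiddenLayerValues, netHiddenLayerOutputs, netOutputLayerValues, ...
 netOutputLayerOutputs, dE_dOutput, dOutput_dNetout, dNetout_dw, ...
 d_EtotalOut, d_EtotalHidden_dOut, dOut_dNetHidden, dNetHidden_dw, ...
 d_EtotalHiddenOut] = deal(zeros(n, 1));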
[Figures: 5iter.png, 10iter.png, 20iter.png, 50iter.png, 100iter.png]
The figures above show how the curves change over the iterations (5, 10, 20, 50, and 100 iterations, respectively): the outputs of the net (orange) and the updated target (blue). The static curves are the inputs (yellow) and the initial, i.e. real, target (magenta).
The issue is that the target, although updated at each iteration, does not reach the real initial target.
Please let me know if you notice an error in the implementation or in the logic of the algorithm.
The backpropagation algorithm with MSE error is described here: step-by-step-backpropagation-example.
Thanks.
1 comment
Greg Heath on 16 Jan 2019
Why in the world do you think this is better than the current performance measures?
Greg


Accepted Answer

Greg Heath on 4 Feb 2019
Your error function is not at a minimum when output = target.
Why did you not use the standard
E = (output - target)^2
Thank you for formally accepting my correct answer
Greg
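A quick numeric check in the scalar case supports this; the values of x and t below are made up for illustration. Minimizing the custom error over the output lands at output = target/input rather than at output = target:
% hypothetical scalar check of the custom error from the question
x = 2; t = 0.8;              % assumed input value and initial target
y = linspace(0, 1, 101);     % candidate network outputs in (0,1)
E = ((x*y - t).^2) / 2;      % E = ((input*output - target)^2)/2
[~, k] = min(E);
fprintf('E is minimized at output = %.2f (= t/x), not at target = %.2f\n', y(k), t);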
1 comment
Sergey Ohanyan on 14 Feb 2019 (edited 14 Feb 2019)
Thank you for the answer; I'm currently trying another approach. I will write about the results after completion.
Thanks.


More Answers (1)

BERGHOUT Tarek on 3 Feb 2019
Try this code: https://www.mathworks.com/matlabcentral/fileexchange/69947-back-propagation-algorithm-for-training-an-mlp?s_tid=prof_contriblnk
