Replace a layer in an LSTM

Jesus Balado Frias on 17 June 2020
Dear all,
I am trying to create a weighted LSTM for sequence-to-sequence classification. So first I created the LSTM:
numFeatures = 1;
numHiddenUnits = 200;
numClasses = 11;

layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits,'OutputMode','sequence')
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer('Name','classoutput')];
Then I created the weighted classification layer, following the example I found:
classdef weightedClassificationLayer < nnet.layer.ClassificationLayer

    properties
        % (Optional) Layer properties.
        ClassWeights
    end

    methods
        function layer = weightedClassificationLayer(classWeights,name)
            % layer = weightedClassificationLayer(classWeights) creates a
            % weighted cross entropy loss layer. classWeights is a row
            % vector of weights corresponding to the classes in the order
            % that they appear in the training data.
            %
            % layer = weightedClassificationLayer(classWeights, name)
            % additionally specifies the layer name.

            % Set class weights
            layer.ClassWeights = classWeights;

            % Set layer name
            if nargin == 2
                layer.Name = name;
            end

            % Set layer description
            layer.Description = 'Weighted cross entropy';
        end

        function loss = forwardLoss(layer, Y, T)
            % loss = forwardLoss(layer, Y, T) returns the weighted cross
            % entropy loss between the predictions Y and the training
            % targets T.

            % Find observation and sequence dimensions of Y
            [~, N, S] = size(Y);

            % Reshape ClassWeights to KxNxS
            W = repmat(layer.ClassWeights(:), 1, N, S);

            % Compute the loss
            loss = -sum( W(:).*T(:).*log(Y(:)) )/N;
        end

        function dLdY = backwardLoss(layer, Y, T)
            % dLdY = backwardLoss(layer, Y, T) returns the derivatives of
            % the weighted cross entropy loss with respect to the
            % predictions Y.

            % Find observation and sequence dimensions of Y
            [~, N, S] = size(Y);

            % Reshape ClassWeights to KxNxS
            W = repmat(layer.ClassWeights(:), 1, N, S);

            % Compute the derivative
            dLdY = -(W.*T./Y)/N;
        end
    end
end
And instantiated it with:
classWeights =[0.05 0.05 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1 0.1];
wLayer = weightedClassificationLayer(classWeights);
But when I try to replace the classification layer following the CNN example:
layers = replaceLayer(layers,"classoutput",wLayer);
The following error appears:
Check for missing argument or incorrect argument data type in call to function 'replaceLayer'
Can anyone help me?

Accepted Answer

Ayush Laddha on 19 June 2020
From your explanation, I infer that you want to replace the layer using the replaceLayer function. The error message states the reason: the first argument to replaceLayer must be a layerGraph, whereas you are passing it a Layer array. You should convert your Layer array to a layerGraph object and then call replaceLayer.
You can follow the documentation for further reference.
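A minimal sketch of that conversion, assuming the layers array and wLayer defined in the question:
lgraph = layerGraph(layers);                          % convert the Layer array to a layerGraph
lgraph = replaceLayer(lgraph,'classoutput',wLayer);   % replace the default classification layer by name
The resulting lgraph can then be passed to trainNetwork in place of the original layers array.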
1 comment
Jesus Balado Frias on 25 June 2020
Yes, thanks


More Answers (0)
