Custom Weighted Classification Layer: Change in valid input size
I am trying to change the valid input size from [1 1 x] to [50 50 x] (where x = 1, 2, 3, and so on) in weightedClassificationLayer, but it gives an error. I need to know what changes to make so that this layer accepts the different input sizes.
classdef weightedClassificationLayer < nnet.layer.ClassificationLayer

    properties
        % Row vector of weights corresponding to the classes in the
        % training data.
        ClassWeights
    end

    methods
        function layer = weightedClassificationLayer(classWeights, name)
            % layer = weightedClassificationLayer(classWeights) creates a
            % weighted cross entropy loss layer. classWeights is a row
            % vector of weights corresponding to the classes in the order
            % that they appear in the training data.
            %
            % layer = weightedClassificationLayer(classWeights, name)
            % additionally specifies the layer name.

            % Set class weights.
            layer.ClassWeights = classWeights;

            % Set layer name.
            if nargin == 2
                layer.Name = name;
            end

            % Set layer description.
            layer.Description = 'Weighted cross entropy';
        end

        function loss = forwardLoss(layer, Y, T)
            % loss = forwardLoss(layer, Y, T) returns the weighted cross
            % entropy loss between the predictions Y and the training
            % targets T.
            N = size(Y,4);
            Y = squeeze(Y);
            T = squeeze(T);
            W = layer.ClassWeights;

            loss = -sum(W*(T.*log(Y)))/N;
        end

        function dLdY = backwardLoss(layer, Y, T)
            % dLdY = backwardLoss(layer, Y, T) returns the derivatives of
            % the weighted cross entropy loss with respect to the
            % predictions Y.
            [H,Wi,K,N] = size(Y);
            Y = squeeze(Y);
            T = squeeze(T);
            W = layer.ClassWeights;

            dLdY = -(W'.*T./Y)/N;
            dLdY = reshape(dLdY,[H Wi K N]);
        end
    end
end
To check the validity of this layer:
classWeights = [0 1]
Size=size(classWeights)
layer = weightedClassificationLayer(classWeights);
numClasses = numel(classWeights)
validInputSize = [1 1 numClasses]
checkLayer(layer,validInputSize, 'ObservationDimension',4);
This works with validInputSize = [1 1 numClasses], but I am trying to change it to [x x numClasses] (x any number greater than 1).
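For example, for x = 50 (the size in the title), the check I would like to pass is the same script with only the valid input size changed:

classWeights = [0 1];
layer = weightedClassificationLayer(classWeights);
numClasses = numel(classWeights);
validInputSize = [50 50 numClasses];
checkLayer(layer, validInputSize, 'ObservationDimension', 4);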
Accepted Answer
Divya Gaddipati
on 22 Oct 2019
As I understand, you want to change the validInputSize to “[x x numClasses]”, which implies a single prediction will be of size [x x 1]. Hence, you also need to reshape your classWeights (which is W in your code) to [x x numClasses].
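For instance, with the two-element classWeights from your question and x = 2 (a small illustrative size, not from your post), repelem and reshape turn the 1-by-2 weight vector into a 2-by-2-by-2 array whose first page is all zeros and whose second page is all ones, so it can be multiplied element-wise with T:

classWeights = [0 1];
T = zeros(2, 2, 2);                      % dummy squeezed target: x = 2, 2 classes
n = numel(T)/2;                          % 4 elements per class page
W = repelem(classWeights, 1, [n n]);     % [0 0 0 0 1 1 1 1]
W = reshape(W, size(T));                 % W(:,:,1) is all 0, W(:,:,2) is all 1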
I have modified your code below:
classdef weightedClassificationLayer < nnet.layer.ClassificationLayer

    properties
        % Row vector of weights corresponding to the classes in the
        % training data.
        ClassWeights
    end

    methods
        function layer = weightedClassificationLayer(classWeights, name)
            % layer = weightedClassificationLayer(classWeights) creates a
            % weighted cross entropy loss layer. classWeights is a row
            % vector of weights corresponding to the classes in the order
            % that they appear in the training data.
            %
            % layer = weightedClassificationLayer(classWeights, name)
            % additionally specifies the layer name.

            % Set class weights.
            layer.ClassWeights = classWeights;

            % Set layer name.
            if nargin == 2
                layer.Name = name;
            end

            % Set layer description.
            layer.Description = 'Weighted cross entropy';
        end

        function loss = forwardLoss(layer, Y, T)
            % loss = forwardLoss(layer, Y, T) returns the weighted cross
            % entropy loss between the predictions Y and the training
            % targets T.
            N = size(Y,4);
            Y = squeeze(Y);
            T = squeeze(T);
            W = layer.ClassWeights;

            %% Modified %%
            % Expand the 1-by-2 weight vector (the two classes in your
            % classWeights) so each element of T is scaled by its class weight.
            n = length(T(:))/2;              % elements per class
            W = repelem(W, 1, [n n]);        % repeat each weight n times
            W = reshape(W, size(T));         % match the shape of T
            weightedCE = W.*(T.*log(Y));
            loss = -sum(weightedCE(:))/N;
            %% Modified %%
        end

        function dLdY = backwardLoss(layer, Y, T)
            % dLdY = backwardLoss(layer, Y, T) returns the derivatives of
            % the weighted cross entropy loss with respect to the
            % predictions Y.
            [H,Wi,K,N] = size(Y);
            Y = squeeze(Y);
            T = squeeze(T);
            W = layer.ClassWeights;

            %% Modified %%
            n = length(T(:))/2;
            W = repelem(W, 1, [n n]);
            W = reshape(W, size(T));
            %% Modified %%

            dLdY = -(W.*T./Y)/N;
            dLdY = reshape(dLdY,[H Wi K N]);
        end
    end
end
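If you ever need more than two classes, note that the repelem call above repeats exactly two weight blocks. One possible generalization (just a sketch, not something I have run on your data) is to reshape ClassWeights to 1-by-1-by-numClasses and let implicit expansion match it against the H-by-W-by-K-by-N arrays, so no squeeze or repelem is needed. These would be drop-in replacements for the two methods above:

function loss = forwardLoss(layer, Y, T)
    % Y and T are H-by-W-by-K-by-N; the 1-by-1-by-K weight array
    % expands implicitly over the spatial and observation dimensions.
    N = size(Y,4);
    W = reshape(layer.ClassWeights, 1, 1, []);
    weightedCE = W .* T .* log(Y);
    loss = -sum(weightedCE(:))/N;
end

function dLdY = backwardLoss(layer, Y, T)
    % Derivative of the weighted cross entropy; dLdY keeps the
    % H-by-W-by-K-by-N shape of Y automatically.
    N = size(Y,4);
    W = reshape(layer.ClassWeights, 1, 1, []);
    dLdY = -(W .* T ./ Y)/N;
end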
5 comments
Divya Gaddipati
on 22 Oct 2019
Generally, such large sizes (like 256 or 512) are not recommended for checkLayer. To speed up the tests, specify a smaller valid input size.
You can find more information here: https://www.mathworks.com/help/deeplearning/ref/checklayer.html#d117e116206
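For example, a check with a small spatial size (x = 4 here, chosen arbitrarily) exercises the same code paths and finishes much faster:

classWeights = [0 1];
layer = weightedClassificationLayer(classWeights);
numClasses = numel(classWeights);
checkLayer(layer, [4 4 numClasses], 'ObservationDimension', 4);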