How to implement weighted classification for 1D CNN?

Ioana Cretu on 20 May 2021
I developed a 1D CNN for arrhythmia classification. I first segmented the long ECG signals from the MIT-BIH Arrhythmia Database into segments of 300 data points. Then, taking into account the comment made by Joss Knight on this question, I reshaped the signals along the 4th dimension as follows:
  • Xtraining: 1-by-300-by-1-by-91147, where 300 is the length of each signal and 91147 is the number of signals
  • Ytraining: 91147-by-1, which contains the categorical labels (5 classes) for all 91147 samples
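For illustration, the reshape described above can be sketched in NumPy (not MATLAB; the array names and the small segment count are made up, the real data has 91147 segments):

```python
import numpy as np

# Hypothetical small example: the real data is 91147 segments of 300 samples.
seg_len, n_seg = 300, 8
rng = np.random.default_rng(0)
segments = rng.normal(size=(n_seg, seg_len))   # one ECG segment per row

# Reshape into the 1-by-300-by-1-by-N "image" layout expected by the CNN,
# so that X[0, :, 0, j] is the j-th segment.
X = segments.T.reshape(1, seg_len, 1, n_seg)

print(X.shape)  # (1, 300, 1, 8)
```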
The problem is that the dataset is highly imbalanced, and I think that is why I cannot obtain a training accuracy higher than 80%. The normal class represents approximately 84% of the dataset, whereas the other 4 classes are in the minority. Thus, I want to apply a weightedClassificationLayer. I read the documentation on custom layers, but I do not understand how I should set the dimensions of the input (N). Since I treat my signals as images, I used the examples given in the documentation, but the results became worse when I applied the layer.
Can you please explain how these dimensions are chosen? How could I make it work?
Thank you in advance!
I calculated the weights as:
classWeights = 1./countcats(Ytrain);
classWeights = classWeights'/mean(classWeights);
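The same inverse-frequency weighting can be checked numerically in NumPy (a sketch; the per-class counts below are hypothetical, chosen so the normal class holds roughly 84% of 91147 segments):

```python
import numpy as np

# Hypothetical per-class counts for 5 classes (total 91147 segments,
# normal class at ~84%).
counts = np.array([76563, 8000, 4000, 2000, 584])

# Inverse-frequency weights, normalized so they average to 1.
# Mirrors: classWeights = 1./countcats(Ytrain); classWeights'/mean(classWeights)
w = 1.0 / counts
w = w / w.mean()

print(w.round(3))
```

The rarest class receives the largest weight, so its misclassifications contribute more to the loss.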
And the weightedClassificationLayer that I used is:
classdef weightedClassificationLayer < nnet.layer.ClassificationLayer

    properties
        % Row vector of weights corresponding to the classes in the
        % training data.
        ClassWeights
    end

    methods
        function layer = weightedClassificationLayer(classWeights, name)
            % layer = weightedClassificationLayer(classWeights) creates a
            % weighted cross-entropy loss layer. classWeights is a row
            % vector of weights corresponding to the classes in the order
            % that they appear in the training data.
            %
            % layer = weightedClassificationLayer(classWeights, name)
            % additionally specifies the layer name.

            % Set class weights.
            layer.ClassWeights = classWeights;

            % Set layer name.
            if nargin == 2
                layer.Name = name;
            end

            % Set layer description.
            layer.Description = 'Weighted cross entropy';
        end

        function loss = forwardLoss(layer, Y, T)
            % loss = forwardLoss(layer, Y, T) returns the weighted
            % cross-entropy loss between the predictions Y and the
            % training targets T. For image-style inputs, Y and T are
            % 1-by-1-by-K-by-N, where K is the number of classes and N
            % is the mini-batch size.
            N = size(Y,4);
            Y = squeeze(Y);            % K-by-N
            T = squeeze(T);            % K-by-N
            W = layer.ClassWeights;    % 1-by-K
            loss = -sum(W*(T.*log(Y)))/N;
        end

        % backwardLoss can be omitted in recent MATLAB releases, in which
        % case automatic differentiation is used. If needed:
        %
        % function dLdY = backwardLoss(layer, Y, T)
        %     % dLdY = backwardLoss(layer, Y, T) returns the derivatives
        %     % of the weighted cross-entropy loss with respect to the
        %     % predictions Y.
        %     N = size(Y,4);
        %     K = size(Y,3);
        %     W = reshape(layer.ClassWeights, [1 1 K]);
        %     dLdY = -(W.*T./Y)/N;
        % end
    end
end
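As a numerical sanity check on the loss (a NumPy sketch, not MATLAB): after the squeeze, Y and T are K-by-N, so W*(T.*log(Y)) is a 1-by-N row of per-observation weighted losses, and the sum is divided by the mini-batch size N. With all weights equal to 1 this reduces to plain cross entropy:

```python
import numpy as np

K, N = 5, 4                         # classes, mini-batch size
rng = np.random.default_rng(0)

# Softmax outputs Y (columns sum to 1) and one-hot targets T, both K-by-N.
logits = rng.normal(size=(K, N))
Y = np.exp(logits) / np.exp(logits).sum(axis=0)
T = np.eye(K)[:, rng.integers(0, K, size=N)]

W = np.ones((1, K))                 # unit weights: plain cross entropy

# Mirrors loss = -sum(W*(T.*log(Y)))/N after the squeeze.
loss = -(W @ (T * np.log(Y))).sum() / N

# Plain (unweighted) cross entropy for comparison.
plain = -np.log(Y[T.astype(bool)]).mean()

print(loss, plain)                  # equal when all weights are 1
```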

Answers (0)

