LSTM high training loss not decreasing and remains constant
I need to classify my signals into 4 classes. I am feeding 600 .mat files per class, each containing 360x1 data points, and I split the data 70-30 into training and test sets.
The four classes are "surface", "floating", "corona", and "void".
%% Define the network architecture
layers = [ ...
    sequenceInputLayer(360)
    bilstmLayer(200,'OutputMode','last')   % increased from 100
    fullyConnectedLayer(4)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', ...
    'MaxEpochs',10, ...
    'MiniBatchSize',150, ...
    'InitialLearnRate',0.001, ...
    'SequenceLength',1000, ...
    'GradientThreshold',1, ...
    'ExecutionEnvironment','auto', ...
    'Plots','training-progress', ...
    'Shuffle','every-epoch', ...
    'Verbose',false);
The code I used as a reference is from this MathWorks example: Classify ECG Signals Using Long Short-Term Memory Networks.
Despite training many times, my accuracy will not exceed 75%, and the class "corona" always shows the lowest accuracy compared to the other classes. These are some of the training options I have tuned, one by one:
- LSTM vs. BiLSTM layer: with an LSTM layer the accuracy stays flat at 25%; only with a BiLSTM layer do I see an increase.
- NumHiddenUnits: increasing from 100 to 200 gives only a slight accuracy gain, with longer training time.
- MiniBatchSize: changing to 128/256 decreases accuracy.
- InitialLearnRate: lowering 0.01 -> 0.001 gave a huge increase, from 25% to 75%.
- SequenceLength: 1000 -> 100 just makes training take longer.
- Shuffle every epoch: training takes longer.
I suspect the persistently high loss is the cause. Are there any ways I can make the training loss drop?
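The per-class gap described above ("corona" weakest) can be quantified with a confusion matrix over the test set. A minimal sketch, assuming net is the trained network and XTest/YTest hold the 30% hold-out split (variable names are assumptions, not from the original post):

```matlab
% Sketch: inspect per-class test accuracy with a confusion matrix.
% 'net' is the trained network; XTest/YTest are the assumed 30% hold-out
% split (cell array of sequences and categorical labels, respectively).
YPred = classify(net,XTest);      % predicted class labels
confusionchart(YTest,YPred);      % rows: true class, columns: predicted
testAcc = mean(YPred == YTest);   % overall test accuracy
```

The off-diagonal row for "corona" shows which class it is most often confused with, which is more informative than the overall accuracy alone.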
Answers (1)
Varun Sai Alaparthi
on 13 Jan 2023
Hello Elaine,
There can be many reasons for a constant, high training loss that never drops. More information on your validation loss and accuracy would help trace down the issue. Some possible solutions to reduce the loss and increase the training accuracy:
- If the model is underfitting, it may be too simple for the data, or the data may be noisy or insufficiently cleaned; in either case the training loss stays high. Increasing the model's capacity, trying a stronger architecture, or cleaning the data further may improve accuracy.
- Decreasing the learning rate further also sometimes helps the loss keep dropping.
- Tune more hyperparameters: try different optimizers, add regularization to your loss, etc.
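Several of these knobs (a lower or decaying learning rate, regularization, validation monitoring) are exposed directly by trainingOptions. A minimal sketch, where XVal/YVal are an assumed held-out validation split not present in the original post, and the chosen values are illustrative rather than tuned:

```matlab
% Sketch only: combines a decaying learning rate, mild L2 weight decay,
% and validation monitoring. XVal/YVal are assumed to be a held-out
% validation split prepared the same way as the training data.
options = trainingOptions('adam', ...
    'MaxEpochs',30, ...
    'MiniBatchSize',150, ...
    'InitialLearnRate',1e-3, ...
    'LearnRateSchedule','piecewise', ...  % halve the rate every 10 epochs
    'LearnRateDropFactor',0.5, ...
    'LearnRateDropPeriod',10, ...
    'L2Regularization',1e-4, ...          % light penalty on the weights
    'ValidationData',{XVal,YVal}, ...     % exposes validation loss/accuracy
    'Shuffle','every-epoch', ...
    'Plots','training-progress', ...
    'Verbose',false);
```

With 'ValidationData' set, the training-progress plot shows training and validation curves together, which makes it clear whether the 75% plateau is underfitting (both losses high) or overfitting (validation loss diverging).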
That said, ~75% can sometimes simply be the upper limit of this architecture on the given data.
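If that ceiling is the architecture itself, one option is a slightly deeper, regularized variant of the posted network. This is a sketch under assumptions, not a verified fix; stacked BiLSTM layers, dropoutLayer, and z-score input normalization are standard Deep Learning Toolbox building blocks:

```matlab
% Sketch: stacked BiLSTM layers with dropout and input standardization.
layers = [ ...
    sequenceInputLayer(360,'Normalization','zscore')  % z-score each feature
    bilstmLayer(200,'OutputMode','sequence')          % pass full sequence down
    dropoutLayer(0.2)                                 % regularize between layers
    bilstmLayer(100,'OutputMode','last')              % keep only final state
    fullyConnectedLayer(4)
    softmaxLayer
    classificationLayer];
```

Note the first BiLSTM must use 'OutputMode','sequence' so the second layer still receives a sequence; only the last recurrent layer uses 'OutputMode','last'.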
I hope this information helps; please reach out with any further issues.
Sincerely,
Varun