C-LSTM Input of 4D to predict 12X1 values
I am trying to use a C-LSTM to predict 12 values for each time step with regression.
I'm happy to provide as much information as I can. However, to me, it seems clear that I'm providing the correct inputs for the C-LSTM architecture...
Please help! This is important medical research!
Thank you all.
I tried two other data formats. One sort of worked (RMSE = 0.54).
Then I tried wrapping each variable in a cell array, so that the training and validation inputs are each a single cell array holding a 528-by-n time series and the kinematics are 12-by-n (sketched below, after the parameter values), and somehow that worked.
layers = [ ...
    sequenceInputLayer(inputSize,'Name','input')
    sequenceFoldingLayer('Name','fold')
    convolution2dLayer(filterSize,numFilters,'Name','conv')
    batchNormalizationLayer('Name','bn')
    reluLayer('Name','relu')
    sequenceUnfoldingLayer('Name','unfold')
    flattenLayer('Name','flatten')
    lstmLayer(numHiddenUnits,'OutputMode','sequence','Name','lstm')
    reluLayer('Name','relu2')
    fullyConnectedLayer(numFeatures*2, 'Name','fc1')
    reluLayer('Name','relu3')
    fullyConnectedLayer(numFeatures*2, 'Name','fc2')
    reluLayer('Name','relu4')
    fullyConnectedLayer(numFeatures*2, 'Name','fc3')
    fullyConnectedLayer(numClasses, 'Name','fcl')
    regressionLayer('Name','regression')];
lgraph = layerGraph(layers);
lgraph = connectLayers(lgraph,'fold/miniBatchSize','unfold/miniBatchSize');
Those are my layers, with:
inputSize = [528 1 1]
filterSize = [64 1]
numFilters = 50
numClasses = 12
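Concretely, the cell-array arrangement described above looks roughly like this (a sketch with random placeholder data and an arbitrary number of time steps n; it only illustrates the shapes, not the real recordings):
n = 1000;                  % placeholder number of time steps
XTrain = {rand(528, n)};   % one cell holding the full 528-by-n input time series
YTrain = {rand(12, n)};    % matching 12-by-n kinematics (12 values per time step)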
Answers (1)
Srivardhan Gadila
on 28 Mar 2021
Refer to the documentation of the sequences and responses input arguments of the trainNetwork function, for the syntax
net = trainNetwork(sequences,responses,layers,options), to see the required format of the training data.
You can run the following code to understand the data format:
inputSize = [528 1 1];      % size of one time step: 528-by-1-by-1
filterSize = [64 1];        % convolution filter spans 64 consecutive samples
numFilters = 50;
numClasses = 12;            % number of regression outputs
numHiddenUnits = 200;
numResponses = numClasses;
numFeatures = 10;           % numFeatures*2 units in the intermediate FC layers
layers = [ ...
    sequenceInputLayer(inputSize,'Name','input')
    sequenceFoldingLayer('Name','fold')
    convolution2dLayer(filterSize,numFilters,'Name','conv')
    batchNormalizationLayer('Name','bn')
    reluLayer('Name','relu')
    sequenceUnfoldingLayer('Name','unfold')
    flattenLayer('Name','flatten')
    lstmLayer(numHiddenUnits,'OutputMode','sequence','Name','lstm')
    reluLayer('Name','relu2')
    fullyConnectedLayer(numFeatures*2, 'Name','fc1')
    reluLayer('Name','relu3')
    fullyConnectedLayer(numFeatures*2, 'Name','fc2')
    reluLayer('Name','relu4')
    fullyConnectedLayer(numFeatures*2, 'Name','fc3')
    fullyConnectedLayer(numClasses, 'Name','fcl')
    regressionLayer('Name','regression')];
lgraph = layerGraph(layers);
lgraph = connectLayers(lgraph,'fold/miniBatchSize','unfold/miniBatchSize');
analyzeNetwork(lgraph)
%% Synthetic data in the format trainNetwork expects
numTrainSamples = 50;
% Each observation: a 528-by-1-by-1-by-1 array (effectively one time step per sequence)
trainData = arrayfun(@(x)rand([inputSize(:)' 1]),1:numTrainSamples,'UniformOutput',false)';
% Each response: a 12-by-1 vector (one value per regression output)
trainLabels = arrayfun(@(x)rand(numResponses,1),1:numTrainSamples,'UniformOutput',false)';
size(trainData)
size(trainLabels)
%%
options = trainingOptions('adam', ...
'InitialLearnRate',0.005, ...
'LearnRateSchedule','piecewise',...
'Verbose',1, ...
'Plots','training-progress');
net = trainNetwork(trainData,trainLabels,lgraph,options);
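If each observation contains more than one time step (the 12-values-per-time-step case in the question), the same call generalizes to sequence-to-sequence regression: each training sequence becomes a 528-by-1-by-1-by-T array and each response a 12-by-T matrix, as described for the responses argument in the trainNetwork documentation. A minimal sketch of that arrangement, with placeholder dimensions and reusing lgraph and options from above:
%% Sketch: sequence-to-sequence format (placeholder sizes)
numObservations = 20;   % placeholder number of training sequences
T = 100;                % placeholder sequence length (time steps)
% Each sequence: h-by-w-by-c-by-T array matching sequenceInputLayer([528 1 1])
seqData = arrayfun(@(x)rand([inputSize(:)' T]),1:numObservations,'UniformOutput',false)';
% Each response: 12-by-T matrix, i.e. one 12x1 target per time step
seqResponses = arrayfun(@(x)rand(numResponses,T),1:numObservations,'UniformOutput',false)';
% net = trainNetwork(seqData,seqResponses,lgraph,options);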