How can I solve the mini-batch size issue in my LSTM network?
12 views (last 30 days)
Tan Wei Xian
on 25 Jan 2021
Answered: Rohit Pappu
on 29 Jan 2021
Hi, currently I am trying to develop a weather prediction neural network. My plan is to let the system take 12 hours of historical data to forecast 1 hour into the future. I used an LSTM to predict the weather, but one issue keeps bothering me: my LSTM keeps complaining about the mini-batch size, and I fail to understand the issue here. Furthermore, I want to know how to make the LSTM take 12 hours of historical data to forecast one hour (I assume the time step is the key here, so I set the time step to 12, but I am not certain). The data is already provided. Thanks in advance!
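For context, one common way to frame "12 hours in, 1 hour ahead out" is a sliding window over the hourly data. This is only a sketch; `predictors` (a numSamples-by-4 matrix) and `target` (a column vector) are hypothetical names, not variables from the snippet below.

```matlab
% Sketch only: build 12-step input sequences and 1-step-ahead targets.
lookback = 12;                          % hours of history per sample
N = size(predictors, 1) - lookback;     % number of usable windows
Xseq = cell(N, 1);
Yseq = zeros(N, 1);
for k = 1:N
    % Each cell holds a numFeatures-by-12 sequence (features in rows,
    % time steps in columns, as trainNetwork expects for sequence data).
    Xseq{k} = predictors(k:k+lookback-1, :)';
    Yseq(k) = target(k + lookback);     % the hour right after the window
end
```

With this framing, `sequenceInputLayer` would take the number of predictor variables as its input size, and each training sample is one 12-step sequence.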
I already tried this solution, but the mini-batch size issue still appears.
Here's the code snippet:
% Read the table
data = readtable('hourly_data.csv');
% Extract the hourly data in 2016
data_2016 = data(65761:74544, :);
% Plot each feature
stackedplot(data_2016, {'tempC', 'windspeedKmph', 'humidity', 'cloudcover','precipMM'})
trainingset = data_2016(:,{'tempC', 'windspeedKmph', 'humidity', 'cloudcover','precipMM'});
numTimeStepsTrain = floor(0.8*height(trainingset));
dataTrain = trainingset(1:numTimeStepsTrain,:);
dataTest = trainingset(numTimeStepsTrain+1:end,:);
XTrain = dataTrain(1:end-1,1:4);
YTrain = dataTrain(2:end,5);
XTest = dataTest(1:end-1,1:4);
YTest = dataTest(2:end,5);
XTrain = table2array(XTrain);
YTrain = table2array(YTrain);
XTest = table2array(XTest);
YTest = table2array(YTest);
mu = mean(XTrain);
sig = std(XTrain);
muY = mean(YTrain);
sigY = std(YTrain);
% Use elementwise ./ (not matrix /) and normalize Y with its own statistics
XTrain = (XTrain - mu) ./ sig;
YTrain = (YTrain - muY) ./ sigY;
XTest = (XTest - mu) ./ sig;
YTest = (YTest - muY) ./ sigY;
[r,c] = size(XTrain);
[m,n] = size(XTest);
Xcell = cell(r,1);
for i = 1:r
Xcell{i} = transpose(XTrain(i,1:end));
end
Ycell = cell(r,1);
for i = 1:r
Ycell{i} = YTrain(i,1:end);
end
XTestcell = cell(m,1);
for i = 1:m
XTestcell{i} = XTest(i,1:end);
end
YTestcell = cell(m,1);
for i = 1:m
YTestcell{i} = YTest(i,1:end);
end
numFeatures = 1;
numResponses = 1;
numHiddenUnits = 50;
layers = [ ...
sequenceInputLayer(numFeatures)
lstmLayer(numHiddenUnits)
fullyConnectedLayer(numResponses)
regressionLayer];
options = trainingOptions('adam', ...
'MaxEpochs',10, ...
'GradientThreshold',1, ...
'InitialLearnRate',0.005, ...
'LearnRateSchedule','piecewise', ...
'MiniBatchSize',12,...
'LearnRateDropPeriod',125, ...
'LearnRateDropFactor',0.2, ...
'Verbose',0, ...
'Plots','training-progress');
net = trainNetwork(Xcell,Ycell,layers,options);
YPred = [];
net = predictAndUpdateState(net,Xcell);
stepsAhead = 12; % you can use 1, 2, 3, 4, or any other number of steps ahead
for i = 2:stepsAhead+1
[net,YPred(:,i)] = predictAndUpdateState(net,XTestcell(:,i-1),"SequenceLength",114);
end
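As a side note, the four cell-building loops in the snippet above can be written without explicit counters, which avoids hard-coding the test-set length. A sketch using the same variables:

```matlab
% Equivalent cell construction without loops.
Xcell     = num2cell(XTrain', 1)';  % each cell: a 4-by-1 column of features
Ycell     = num2cell(YTrain, 2);    % each cell: one scalar response
XTestcell = num2cell(XTest, 2);     % each cell: one 1-by-4 row
YTestcell = num2cell(YTest, 2);
```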
0 comments
Accepted Answer
Rohit Pappu
on 29 Jan 2021
A plausible solution would be to:
- Change the MiniBatchSize to 114
- Convert Xcell, Ycell, and XTestcell to double type
Xcell = [Xcell{:}];
Ycell = [Ycell{:}];
XTestcell = [XTestcell{:}];
YPred = [];
net = predictAndUpdateState(net,Xcell);
stepsAhead = 12; % you can use 1, 2, 3, 4, or any other number of steps ahead
for i = 2:stepsAhead+1
[net,YPred(:,i)] = predictAndUpdateState(net,XTestcell(:,i-1),"SequenceLength",114,"MiniBatchSize",114);
end
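For reference, the closed-loop forecasting pattern from the Deep Learning Toolbox time-series examples looks roughly like this. It is only a sketch: `XTrainMat` and `XTestMat` are hypothetical numFeatures-by-numTimeSteps double arrays, not variables from the thread.

```matlab
% Sketch: open-loop warm-up on training data, then step-ahead forecasting.
net = resetState(net);
[net, ~] = predictAndUpdateState(net, XTrainMat); % warm up the hidden state
stepsAhead = 12;
YPred = zeros(1, stepsAhead);
for k = 1:stepsAhead
    % Feed one time step (a features-by-1 column) and keep the updated state.
    [net, YPred(k)] = predictAndUpdateState(net, XTestMat(:, k));
end
```

Warming up the state on known data before stepping forward is what lets the network carry recent history into each one-step prediction.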
0 comments