How to change batch size during training

sheyda Ghanbaralizadeh on 27 Dec 2021
Answered: Srivardhan Gadila on 31 Dec 2021
Hi,
I have data of size (224,224,3,4) in 'SSCB' format. During training, the data needs to be resized to (7,7,3,4*1024): each image is divided into smaller chunks (using a 7x7 window) and the chunks are stacked along the batch dimension.
I have tested resize2dLayer and also designed a custom layer for reshaping the data, but it seems that MATLAB layers do not include the batch dimension in the data passed to the layer,
i.e. the size of the input X inside DimChangeLayer is (224,224,3), not (224,224,3,4). How can I solve this and get an output of size (7,7,3,4*1024)? Thanks.
my code:
Input_name = 'MSA_input';
input_size= [224 , 224 , 3];
C = input_size(3) ;
H = input_size(1);
W = input_size(2);
win_s=7;
n_win1 = round(H/win_s);
n_win2 = round(W/win_s);
num_Win = n_win1*n_win2;
l = [imageInputLayer(input_size, 'Name',Input_name, 'Normalization', 'none', 'NormalizationDimension', 'auto', 'DataAugmentation', 'none')];
l = [l DimChangeLayer( 'dimchanger4' , [win_s, win_s, C, num_Win] )];
net =dlnetwork(l);
And this is my DimChangeLayer code:
classdef DimChangeLayer < nnet.layer.Layer
    properties
        Output_size
    end
    methods
        function layer = DimChangeLayer(name, out_size)
            % Set layer name.
            layer.Name = name;
            % Set layer description.
            layer.Description = "change dim";
            % Set layer output size.
            layer.Output_size = out_size;
        end
        function Z = predict(layer, X)
            Z = reshape(X, layer.Output_size);
        end
        function Z = forward(layer, X)
            Z = reshape(X, layer.Output_size);
        end
    end
end
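For reference, here is how I understand the intended size change, reproduced outside a network (a sketch of my assumption about the goal, not part of the network code). Note that a plain column-major reshape only reinterprets memory; it does not cut out spatially contiguous 7x7 tiles. If the windows must be spatial tiles, a reshape-permute-reshape is needed:

```matlab
X = randn(224, 224, 3, 4, 'single');

% Plain reshape: correct output size, but windows are not spatial tiles.
Z = reshape(X, 7, 7, 3, []);              % size(Z) is [7 7 3 4096]

% Spatially contiguous 7x7 windows: split each spatial dim, then permute.
Xw = reshape(X, 7, 32, 7, 32, 3, 4);      % 224 = 7*32 along each spatial dim
Xw = permute(Xw, [1 3 5 2 4 6]);          % [7 7 3 32 32 4]
Zw = reshape(Xw, 7, 7, 3, []);            % [7 7 3 4096], tiles kept intact
```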
  2 Comments
sheyda Ghanbaralizadeh on 28 Dec 2021
Thanks @yanqi liu, but this is a dlnetwork, so you can't call reshape between layers directly; you must use a layer that performs the reshape. As you can see, I designed such a layer, but the batch dimension is not considered part of the data size, so it is not altered by the reshape in DimChangeLayer.
Input_name = 'MSA_input';
input_size= [224 , 224 , 3];
C = input_size(3) ;
H = input_size(1);
W = input_size(2);
win_s=7;
n_win1 = round(H/win_s);
n_win2 = round(W/win_s);
num_Win = n_win1*n_win2;
l = [imageInputLayer(input_size, 'Name',Input_name, 'Normalization', 'none', 'NormalizationDimension', 'auto', 'DataAugmentation', 'none')];
% % l = [l resize2dLayer('OutputSize',[H*W , 1])];
% l = [l DimChangeLayer( 'dimchanger1' , [H*W , C] )];
l = [l DimChangeLayer( 'dimchanger4' , [win_s, win_s , C, 4*num_Win] )];
net =dlnetwork(l);
%% input and train
inp = randn(224, 224, 3);
X = repmat(inp, 1, 1, 1, 4);   % replicate along the batch dimension
dlX = dlarray(single(X),'SSCB');
size(dlX)
outt = randn(224, 224, 3);
Y = repmat(outt, 1, 1, 1, 4);
dlY= dlarray(single(Y),'SSCB');
size(dlY)
loss = dlfeval(@modelGradients,net,dlX,dlY);
function loss = modelGradients(net,dlX,dlY)
dlYPred = forward(net,dlX);
size(dlYPred)
loss = My_mse(dlYPred,dlY);
end
function my_m = My_mse(x,z)
my_m = mean((x(:)-z(:)).^2)/2;
end
This is my whole code with DimChangeLayer and the input data.


Answers (1)

Srivardhan Gadila on 31 Dec 2021
When you call dlnetwork to create a dlnetwork object, it validates every layer in the layer array. During this validation, sample inputs are passed through the network, including some with a batch size of one and some with a batch size greater than one. Your custom layer fails this check because its output size hard-codes the batch size (the constant 4 multiplied into the last dimension). Leave the batch dimension flexible instead:
classdef DimChangeLayer < nnet.layer.Layer
    properties
        Output_size
    end
    methods
        function layer = DimChangeLayer(name, out_size)
            % Set layer name.
            layer.Name = name;
            % Set layer description.
            layer.Description = "change dim";
            % Set layer output size.
            layer.Output_size = out_size;
        end
        function Z = predict(layer, X)
            sz = layer.Output_size;
            % Leave the last dimension as [] so reshape infers it from
            % whatever batch size is passed in.
            Z = reshape(X, sz(1), sz(2), sz(3), []);
        end
    end
end
You can refer to this example for more information: Define Custom Deep Learning Layer with Formatted Inputs. Although in that example the layer also inherits from the nnet.layer.Formattable superclass, that is not needed in this case.
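As a quick sanity check (a sketch under my assumptions, with the corrected DimChangeLayer above on the path), the network should now validate and run with any batch size, since reshape infers the last dimension:

```matlab
% Sketch: the corrected layer accepts any batch size because the last
% reshape dimension is inferred with [].
layers = [
    imageInputLayer([224 224 3], 'Name', 'MSA_input', 'Normalization', 'none')
    DimChangeLayer('dimchanger4', [7, 7, 3, NaN])  % 4th entry unused by predict
    ];
net = dlnetwork(layers);          % validation now passes for any batch size
dlX = dlarray(single(randn(224, 224, 3, 4)), 'SSCB');
size(forward(net, dlX))           % expected: [7 7 3 4096]
```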
