How to create a non-learnable layer for upsampling in MATLAB?
Hello MATLAB community,
I am currently trying to deploy a U-Net model, created and trained in Keras, in MATLAB using the function importKerasLayers.
However, the MATLAB version I am using (R2018b) has no equivalent of the Keras UpSampling2D layer (see https://keras.io/layers/convolutional/), so these layers are imported as placeholder layers.
The function replaceLayer can replace such placeholder layers in the network graph with any previously defined layers. As far as I can tell, MATLAB does not offer a simple upsampling layer that just repeats the existing values (no interpolation, no learnable parameters).
Does anybody know how to define such an upsampling layer (as done in Keras by UpSampling2D(size = (2,2))) and include it in the network graph?
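For reference, the replacement step itself can be sketched like this, assuming `lgraph` comes from importKerasLayers and `upsample` is a previously defined layer; the file name 'unet_model.h5' is hypothetical:

```matlab
% Import the Keras model; unsupported layers become placeholder layers.
lgraph = importKerasLayers('unet_model.h5', 'ImportWeights', true);

% Locate the placeholder layers and replace each one by name.
placeholders = findPlaceholderLayers(lgraph);
for k = 1:numel(placeholders)
    lgraph = replaceLayer(lgraph, placeholders(k).Name, upsample);
end
```

Note that each placeholder would need a replacement layer with a unique Name before assembling the network.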
Thanks in advance for your help.
Best regards
Adidue
Answers (1)
Hazar Benan Unal
16 Nov 2019
Edited: 4 Sep 2020
Since I myself had a hard time figuring out how to implement an upsampling layer in MATLAB, below is code for 3D and 2D upsampling in case someone needs it in the future. Hopefully the 2020 release will include this commonly used layer.
%3D Upsampling layer.
%%Example: Input: 6x8x10x7 ---> Output: 12x16x20x7
%Number of filters needs to be the same as number of channels
%in the previous layer.
num_filters = 7;
%Set learning rate factors to zero since upsampling layer does not learn any parameters.
upsample = transposedConv3dLayer([2,2,2],num_filters,'Stride',[2,2,2],...
'WeightLearnRateFactor',0,'BiasLearnRateFactor',0, 'Name', 'upsampling_3D');
upsample.Weights = zeros(2,2,2,num_filters,num_filters);
upsample.Bias = zeros(1,1,1,num_filters);
%First filter should act only on the first channel in the previous layer.
%Second filter should act only on the second channel in the previous layer and so on.
%So we need to 'turn on' only the corresponding filter for the channel of interest.
for j = 1:num_filters
upsample.Weights(:,:,:,j,j) = 1;
end
%%If you don't do this for loop and instead set all weights to 1 with
%upsample.Weights = ones(2,2,2,num_filters,num_filters);
%upsample.Bias = zeros(1,1,1,num_filters);
%then each filter acts on ALL of the channels in the previous layer and gives the wrong result.
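Per channel, the effect of these frozen weights is plain value repetition. A minimal single-channel sketch of the same operation, using repelem:

```matlab
% Per-channel equivalent of the frozen transposed convolution:
X = magic(4);             % one 4x4 channel
Xup = repelem(X, 2, 2);   % each element repeated into a 2x2 block -> 8x8
% Every 2x2 block of Xup is a constant copy of the corresponding element of X,
% e.g. Xup(1:2,1:2) == X(1,1).
```

The transposed-convolution construction above does exactly this independently for every channel, which is why each filter must be "on" for only its own channel.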
Similarly, for 2D Upsampling:
%2D Upsampling layer.
%%Example: Input: 6x8x7 ---> Output: 12x16x7
%Number of filters needs to be the same as number of channels
%in the previous layer.
num_filters = 7;
%Set learning rate factors to zero since upsampling layer does not learn any parameters.
upsample = transposedConv2dLayer([2,2],num_filters,'Stride',[2,2],...
'WeightLearnRateFactor',0,'BiasLearnRateFactor',0, 'Name', 'upsampling_2D');
upsample.Weights = zeros(2,2,num_filters,num_filters);
upsample.Bias = zeros(1,1,num_filters);
%First filter should act only on the first channel in the previous layer.
%Second filter should act only on the second channel in the previous layer and so on.
%So we need to 'turn on' only the corresponding filter for the channel of interest.
for j = 1:num_filters
upsample.Weights(:,:,j,j) = 1;
end
And a toy example for visual verification:
%3D toy example
input_size = [6,6,4,3];
num_filters = input_size(4);
upsample = transposedConv3dLayer([2,2,2],num_filters,'Stride',[2,2,2],...
'WeightLearnRateFactor',0,'BiasLearnRateFactor',0, 'Name', 'ups_3d');
upsample.Weights = zeros(2,2,2,num_filters,num_filters);
upsample.Bias = zeros(1,1,1,num_filters);
for j = 1:num_filters
upsample.Weights(:,:,:,j,j) = 1;
end
layers = [image3dInputLayer(input_size, 'Name', 'input'),
upsample,
regressionLayer('Name', 'reg')];
layers(1).Mean = 0; %zerocenter input
lgraph = layerGraph(layers);
net = assembleNetwork(lgraph);
input_image = rand(input_size);
size(input_image) %Size before upsampling
predict(net,input_image);
act = activations(net, input_image, 'ups_3d');
size(act) %Size after upsampling
figure
subplot(3,2,1)
imshow(input_image(:,:,1,1),[])
title('Input(6x6)')
subplot(3,2,3)
imshow(input_image(:,:,2,1),[])
subplot(3,2,5)
imshow(input_image(:,:,3,1),[])
subplot(3,2,2)
imshow(act(:,:,2,1),[])
title('Upsampled(12x12)')
subplot(3,2,4)
imshow(act(:,:,4,1),[])
subplot(3,2,6)
imshow(act(:,:,6,1),[])
%Images in the same row should show the same content; the ones on the right are at twice the resolution.
2D case:
%2D toy example
input_size = [6,6,3];
num_filters = input_size(3);
upsample = transposedConv2dLayer([2,2],num_filters,'Stride',[2,2],...
'WeightLearnRateFactor',0,'BiasLearnRateFactor',0, 'Name', 'ups_2d');
upsample.Weights = zeros(2,2,num_filters,num_filters);
upsample.Bias = zeros(1,1,num_filters);
for j = 1:num_filters
upsample.Weights(:,:,j,j) = 1;
end
layers = [imageInputLayer(input_size, 'Name', 'input'),
upsample,
regressionLayer('Name', 'reg')];
layers(1).Mean = 0; %zerocenter input
lgraph = layerGraph(layers);
net = assembleNetwork(lgraph);
input_image = rand(input_size);
size(input_image) %Size before upsampling
predict(net,input_image);
act = activations(net, input_image, 'ups_2d');
size(act) %Size after upsampling
figure
subplot(3,2,1)
imshow(input_image(:,:,1),[])
title('Input(6x6)')
subplot(3,2,3)
imshow(input_image(:,:,2),[])
subplot(3,2,5)
imshow(input_image(:,:,3),[])
subplot(3,2,2)
imshow(act(:,:,1),[])
title('Upsampled(12x12)')
subplot(3,2,4)
imshow(act(:,:,2),[])
subplot(3,2,6)
imshow(act(:,:,3),[])
%Images in the same row should show the same content; the ones on the right are at twice the resolution.
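Beyond the visual check, the 2D result can also be verified numerically, continuing with the `input_image` and `act` variables from the toy example above (this assumes the layer should reproduce plain per-channel value repetition):

```matlab
% Nearest-neighbor reference: repeat each pixel into a 2x2 block per channel.
expected = repelem(input_image, 2, 2, 1);
% The network activations may be single precision, so compare with a cast;
% the maximum difference should be (near) zero.
max(abs(act(:) - single(expected(:))))
```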
I hope this helps.