Neural Network Softmax Layer Activating as All 1s.

Sebastian Detering on 1 August 2021
Context:
Using assembleNetwork() I have converted a layerGraph into a DAGNetwork.
This network is a U-Net with a 512x512x1 input and a 512x512x1 output. The output should be two classes, something like 'main' and 'background'.
analyzeNetwork( assembledNetDAGform ) works as expected, and it correctly identifies the output size as 512x512x1.
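For context, the workflow was roughly the following (a minimal sketch; the file and variable names are assumptions, and the error trace further down suggests the layer graph came from an ONNX import):
% Minimal sketch of the steps described above; only assembleNetwork and
% analyzeNetwork are stated in the post, the import line is an assumption.
lgraph = importONNXLayers('fullGuassianUnet.onnx', ...   % hypothetical file name
    'OutputLayerType', 'pixelclassification');
assembledNetDAGform = assembleNetwork(lgraph);           % layerGraph -> DAGNetwork
analyzeNetwork(assembledNetDAGform)                      % reports a 512x512x1 output size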
Problem:
When testing the network, the output is a 512x512 categorical that is all 1s.
The second-to-last layer, which is the softmax layer, is also all 1s.
Luckily, the layer before that seems to activate correctly and is not all 1s, but rather shows the expected activations.
Question:
Is there anything I should know about softmax, such as it needing a specific kind of input?
One thing that baffles me is that the softmax function is supposed to balance the entire output so it adds up to a probability of 1. So having all 1s makes no sense for an output of softmax.
Any more details about how I should approach this would be helpful.
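For reference on how softmax should behave: it normalizes across the channel dimension, so the channel values at each pixel sum to 1. One consequence is that a softmax applied to a single channel returns exactly 1 everywhere, which would be consistent with both the 512x512x1 output size and the all-1s map. A minimal numeric sketch (plain MATLAB, not tied to the network above):
% Softmax along the channel (3rd) dimension: values at each pixel sum to 1.
z = randn(4, 4, 2);               % toy 4x4 map with 2 channels
p = exp(z) ./ sum(exp(z), 3);     % channel-wise softmax
sum(p, 3)                         % all 1s: the per-pixel probabilities sum to 1

% Degenerate case: with a single channel, exp(z)./exp(z) is 1 everywhere.
z1 = randn(4, 4, 1);
p1 = exp(z1) ./ sum(exp(z1), 3)   % all 1s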
I debugged the activations of the layer before the softmax (a max pooling layer) and of the softmax itself with this code:
>> poolactive = activations(assembledNetDAGform, II, 'Sig35MaxPool2d'); imshow(poolactive);
>> softmaxactivations = activations(assembledNetDAGform, II, 'PixelClassificationSig35Softmax'); imshow(softmaxactivations);
These gave the following images (the second one is clearly all 1s).
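To make the comparison more quantitative, the shapes and the channel-wise sum can be checked on the same variables as above (a sketch, not output from an actual run):
size(poolactive)                  % how many channels feed into the softmax?
size(softmaxactivations)          % a 512x512x1 result would explain the all-1s map

% The per-pixel channel sums should be ~1 for a working softmax.
max(abs(sum(softmaxactivations, 3) - 1), [], 'all')

% imshow is more informative with auto-scaling when values are not in [0, 1].
imshow(poolactive(:, :, 1), [])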
Here's the end of the layer graph for more info:
The error I get is not really the problem (I wanted to show the output as an image, but imshow() didn't expect my output, which was a categorical). I don't really care about this error right now; I simply want to understand why the softmax layer is failing, and then I can figure out how to translate categorical outputs to bitmap images (see the conversion sketch after the error trace).
Error using imageDisplayValidateParams
Expected input number 1, I, to be one of these types:
double, single, uint8, uint16, uint32, uint64, int8, int16, int32, int64, logical
Instead its type was categorical.
Error in images.internal.imageDisplayValidateParams (line 11)
validateattributes(common_args.CData, {'numeric','logical'},...
Error in images.internal.imageDisplayParseInputs (line 79)
common_args = images.internal.imageDisplayValidateParams(common_args);
Error in imshow (line 253)
images.internal.imageDisplayParseInputs({'Parent','Border','Reduce'},preparsed_varargin{:});
Error in fullGuassianONNXtoMATLAB (line 60)
imshow(out_img)
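As a side note on the display problem: converting the categorical label map to numeric class indices (or rendering it as an overlay) makes it displayable. A sketch, assuming out_img is the 512x512 categorical returned by the network:
% out_img is assumed to be the 512x512 categorical label map from the network.
labels = double(out_img);                        % categorical -> class indices (1, 2, ...)
imshow(labels, [])                               % quick look at the raw label map

% Or render the labels in color / as an overlay on the input image II.
figure; imshow(label2rgb(labels))
figure; imshow(labeloverlay(mat2gray(II), out_img))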

Answers (1)

Sebastian Detering on 12 August 2021
It turns out the last two layers were automatically added by MATLAB during the import process; they were unrelated to the original model, and therefore were not activating correctly.
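For anyone who runs into the same thing, here is a sketch of how to confirm this and work around it; the variable name lgraph and the 0.5 threshold are assumptions, and the layer name comes from the activations calls in the question.
% Inspect the tail of the imported layer graph to see what the importer appended.
lgraph.Layers(end-2:end)

% Bypass the appended layers and read the last layer of the original model directly.
% If the original model ends in a single-channel sigmoid, thresholding that map
% gives the binary segmentation (the 0.5 threshold is an assumption).
realOut = activations(assembledNetDAGform, II, 'Sig35MaxPool2d');
mask = realOut(:, :, 1) > 0.5;
imshow(mask)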

Version

R2020b
