MATLAB Answers


Regarding Multi-label transfer learning with googlenet

Asked by Balakrishnan Rajan on 22 Aug 2018
Latest activity: commented on by SC P on 12 Oct 2019
I have a dataset of pictures containing objects of different classes. I want to perform multilabel classification, meaning I need to classify the pictures into different classes where a picture can belong to more than one class at the same time. That is, for pictures with objects of type A and type B, the net should output both the labels A and B.
If I were designing a CNN for this from scratch, I would have a sigmoid activation at the last layer. The number of output neurons would equal the number of classes, with each neuron outputting 1 if the picture belongs to that class and 0 if not. However, there seems to be no provision for adding a sigmoid function, and the imageDatastore cannot hold binary vectors as labels. How do I overcome this?
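The sigmoid head described above can be sketched as follows (a minimal sketch; the `scores` vector stands in for the raw outputs of a hypothetical final fully connected layer and its values are made up):

```matlab
% Hypothetical raw outputs (logits) of a final fully connected layer
% with one neuron per class, here for three classes A, B, C.
scores = [2.1, 0.7, -1.5];

% Independent sigmoid per neuron: each value lies in (0,1) and there is
% no sum-to-1 constraint, unlike softmax.
probs = 1 ./ (1 + exp(-scores));

% Threshold at 0.5 to get a binary label vector; a picture containing
% objects of types A and B would ideally yield [1 1 0].
labels = probs > 0.5;
```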

  1 Comment

@Balakrishnan Rajan, how have you resolved this problem? (How did you do this: defining classes which are unique combinations of the previous class occurrences?) Is there any code for it?


4 Answers

Answer by Shounak Mitra on 24 Aug 2018
 Accepted Answer

We do not support sigmoid activation. You can use the softmax activation function instead. You don't need to specify the number of neurons in the softmaxLayer; define the number of neurons (= the number of classes) in the fullyConnectedLayer. So your network structure would be:
inputLayer -- ... -- fullyConnectedLayer -- softmaxLayer -- classificationLayer
HTH,
Shounak
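A minimal sketch of that structure in Deep Learning Toolbox syntax (the input size, the middle layers, and the class count are placeholders, not from the answer):

```matlab
numClasses = 5;  % placeholder: number of mutually exclusive classes

layers = [
    imageInputLayer([224 224 3])                 % input size is an assumption
    convolution2dLayer(3, 16, 'Padding', 'same') % stand-in for the middle layers
    reluLayer
    fullyConnectedLayer(numClasses)              % one neuron per class
    softmaxLayer                                 % takes no size argument
    classificationLayer];
```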

  1 Comment

But softmaxLayer makes the sum of all probabilities equal to 1 (and therefore the classes mutually exclusive), whereas with a "logsig" layer every class could independently take probability 1 (and therefore the classes would not be mutually exclusive). Is there any workaround for this?



Answer by Greg Heath on 22 Dec 2018

Decades-old solution: divide each output by the sum to obtain the relative probability of each class.
Hope this helps. Thank you for formally accepting my answer.
Greg
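As a numeric illustration of that normalization (the scores are made up and stand in for independent per-class outputs from a non-softmax output layer):

```matlab
% Made-up independent per-class scores, e.g. from sigmoid-like outputs;
% their sum is not constrained to be 1.
y = [0.9, 0.8, 0.1];

% Divide by the sum to obtain relative class probabilities that sum to 1.
p = y / sum(y);   % p = [0.5000 0.4444 0.0556]
```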

  3 Comments

But the sum is 1, so ...
I resolved it by defining classes which are unique combinations of the previous class occurrences. That way they are now all mutually exclusive.
To Kira:
My point was:
If you do not use softmax, the sum is not constrained to be 1!
Greg
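The combined-class workaround mentioned in the comments above can be sketched like this (the label names and the mapping are hypothetical, not from the thread):

```matlab
% Hypothetical per-image label sets from a multi-label dataset.
labelSets = { {'A','B'}, {'A'}, {'B','C'} };

% Map each sorted combination to a single composite class name,
% e.g. {'A','B'} -> 'A_B'; the composite classes are mutually exclusive,
% so a standard softmax/classification head can be used.
combined = cellfun(@(s) strjoin(sort(s), '_'), labelSets, ...
                   'UniformOutput', false);
combinedLabels = categorical(combined);   % usable as imageDatastore labels
```

Note that the number of composite classes grows with the number of label combinations actually present in the data, so this only scales to a modest number of co-occurring labels.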



Answer by cui on 14 May 2019

Can I define multiple softmaxLayers at the end of the network, each independent of the others and each used to classify one label, so that there are multiple loss functions sharing the preceding convolutional layers? But how do you specify the network targets?

  0 Comments



Answer by Antonio Quvera on 21 May 2019
Edited by Antonio Quvera on 21 May 2019

I'm also interested in this application (i.e. multi-label classification using CNN/LSTM). Any news? Does the latest deep learning toolbox resolve this issue?

  0 Comments
