Error when trying to generate a confusion matrix from a trained convolutional neural network

2 views (in the last 30 days)
I keep getting this error but am not sure why:
Error using == (line 25)
Invalid types for comparison.
Error in confusion (line 45)
if ~all((targets==0) | (targets==1) | isnan(targets))
Error in cnn11 (line 89)
[c,cm,ind,per] = confusion(TTest,YTest)
I am using a CNN in the Neural Network Toolbox and trying to generate a confusion matrix for it. This is my code (nfoldTest1 is my testing set):
convnet = trainNetwork(Train1,layers,options)
save convnet
YTest = classify(convnet,nfoldTest1);
TTest = nfoldTest1.Labels;
accuracy = sum(YTest == TTest)/numel(TTest)
[c,cm,ind,per] = confusion(TTest,YTest)

Accepted Answer

Kris Fedorenko on 10 Aug 2017
Hi Pierre!
I think you might want to use "confusionmat" instead, where the arguments are grouping variables with the same number of observations and each element of the vector denotes the class that observation belongs to. This should correspond to the format of your TTest and YTest.
In contrast, "confusion" requires that the arguments are in the following form:
  • targets is a (number of classes)-by-(number of observations) array of zeros and ones, where each column/observation marks its class membership with a 1 in the corresponding row and zeros elsewhere.
  • outputs is also (number of classes)-by-(number of observations), but its values can range over [0,1]; the largest value in each column sits in the row of the class to which that observation is predicted to belong.
For example, if you had four observations with the following classes [1 1 2 3] and only three classes total, your targets array would look like this:
1 1 0 0
0 0 1 0
0 0 0 1
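As a side note, if your labels are numeric class indices, one way to build such a targets array is ind2vec from the same toolbox (the index vector below is just the example above, written out as a minimal sketch):
classIdx = [1 1 2 3];                % example: class index of each observation
targets = full(ind2vec(classIdx))    % ind2vec returns a sparse one-hot matrix; full() makes it dense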
Hope this helps!
Kris
  2 comments
Pierre Pook on 10 Aug 2017
Hi Kris!
Thank you for your answer and help. Looking into confusionmat, an online source, and the transfer learning example, I found this code:
predictedLabels = predict(trainedNet, testData);
testLabels = testData.Labels;
confMat = confusionmat(testLabels, predictedLabels);
% Convert confusion matrix into percentage form
confMat = bsxfun(@rdivide,confMat,sum(confMat,2))
% Display the mean accuracy
mean(diag(conflate))
I was just checking: would this code logically work? I am also trying to edit the code for 5-fold cross-validation. If I trained my network 5 times, could I add all the predictedLabels from each fold into one variable and the test labels from each fold into another variable, and then give them as arguments to confusionmat? Or would that give me the wrong answer logically? Apologies for the inconvenience caused and thanks for your help!
Pierre
Kris Fedorenko on 11 Aug 2017
Hi Pierre!
Let's examine the code sample that you posted.
predictedLabels = predict(trainedNet, testData);
Here I assume that "trainedNet" was created using trainNetwork and that "testData" is akin to your "nfoldTest1" from your earlier code (i.e. it is a MATLAB ImageDatastore). According to the documentation for predict, it outputs an estimate of the probability that each observation belongs to each class. For example, if I had two observations/images in "testData" and 5 possible classes, I might see something like this:
0.9987 0.0000 0.0013 0.0000 0.0001
0.0000 0.7721 0.2262 0.0007 0.0010
which would mean that the network is pretty certain that the first observation belongs to class 1, but it is less sure about observation 2: most likely (~77%) the second observation is from class 2.
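As an aside, if you wanted to turn this score matrix into a single predicted class per observation yourself, a minimal sketch (reusing the "trainedNet" and "testData" names from above) would be:
scores = predict(trainedNet, testData);   % (observations)-by-(classes) probability estimates
[~, predIdx] = max(scores, [], 2)         % row-wise maximum: index of the most likely class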
testLabels = testData.Labels;
If "testData" is an ImageDatastore object, we can see from the documentation that it does in fact has a "Labels" property that contains one label for each observation/image specified as a vector or cell array of label names. So if you had 2 observations/images in your "testData" and 5 possible classes, "testLabels" might look like this:
1
4
This would mean that the first observation belongs to class 1, and the second observation belongs to class 4.
confMat = confusionmat(testLabels, predictedLabels);
Now here we will run into problems. As I explained in my first post, "testLabels" and "predictedLabels" need to be in the same format. Using the same example:
  • if we want to use confusion(testLabels, predictedLabels), we need "testLabels" as
1 0
0 0
0 0
0 1
0 0
and "predictedLabels" as
0.9987 0
0 0.7721
0.0013 0.2262
0 0.0007
0.0001 0.0010
  • if we want to use confusionmat(testLabels, predictedLabels), we need "testLabels" as
1
4
and "predictedLabels" as
1
2
According to the documentation, you can get "predictedLabels" in this form if you use classify instead of predict.
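Putting this together, a minimal sketch of the confusionmat route (keeping your variable names) would be:
predictedLabels = classify(trainedNet, testData);    % one categorical label per image
testLabels = testData.Labels;                        % true categorical labels from the datastore
confMat = confusionmat(testLabels, predictedLabels)  % rows = actual class, columns = predicted class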
Assume we figured out which way we want to go and obtained a confusion matrix (stored in variable "confMat"). Using the same example it would look like this:
1 0 0 0 0
0 0 0 0 0
0 0 0 0 0
0 1 0 0 0
0 0 0 0 0
i.e. we have one observation that was actually class 1 and was predicted as class 1, and one observation that was actually class 4 but was predicted as class 2.
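You can reproduce this tiny example directly; here the class indices are written out as categorical arrays over the five possible classes:
testLabels = categorical([1; 4], 1:5);        % actual classes of the two observations
predictedLabels = categorical([1; 2], 1:5);   % predicted classes
confusionmat(testLabels, predictedLabels)     % gives the 5-by-5 matrix above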
Now, let's see what happens next.
confMat = bsxfun(@rdivide,confMat,sum(confMat,2))
This part, sum(confMat,2), sums the rows of our confusion matrix. For our example it would produce
1
0
0
1
0
It is the number of actual observations in each class (in "testData"). Now, looking at rdivide and bsxfun, we see that we are effectively dividing each element of confMat by the total number of observations in the corresponding actual class. Intuitively, this gives the proportion of each class that was (mis)classified as each class. In our example, we have zero observations in classes 2, 3, and 5, so we would be dividing by 0 in those cases (which produces NaN rows).
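If those NaN entries are a problem, one sketch is to guard the empty classes before dividing:
rowTotals = sum(confMat, 2);                     % number of observations per actual class
rowTotals(rowTotals == 0) = 1;                   % avoid 0/0 = NaN for classes with no observations
confMat = bsxfun(@rdivide, confMat, rowTotals)   % each row now holds proportions instead of counts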
mean(diag(conflate))
I don't see where the "conflate" variable is defined, but I am sure that if you refer back to the original example and to the documentation for diag and mean, you will be able to see what this line does.
In my experience, this is the best way to understand any example code - execute it line by line and examine the outputs and the workspace variables, referring to the documentation for the functions as you go.
Regarding the confusion matrix for multiple folds: computing a confusion matrix from the predicted labels and actual labels for each fold should be fine. It is up to you how/if you want to aggregate the confusion matrices across folds (average counts? average percentages? sum of counts? There are different ways to do it).
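For the pooled variant you describe, a sketch along these lines should work (the cell arrays "predsByFold" and "labelsByFold" are hypothetical names for each fold's predicted and true categorical labels):
allPred = vertcat(predsByFold{:});               % stack predicted labels from all 5 folds
allTrue = vertcat(labelsByFold{:});              % stack the matching true labels
confMatPooled = confusionmat(allTrue, allPred)   % pooled counts across folds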
Hope this helps!
Kris
