Evaluation metrics for a deep learning model

9 views (last 30 days)
Sushma TV on 18 Nov 2021
Commented: Pranjal Kaura on 26 Nov 2021
What is the command for computing evaluation metrics such as precision, recall, specificity, and F1 score for a deep learning model?
Should these be computed explicitly from the confusion matrix using the standard formulas, or can they be computed directly in the code and displayed?
Also, are these metrics computed on the validation dataset?
Kindly provide inputs regarding the above.

Accepted Answer

Pranjal Kaura on 23 Nov 2021
Edited: Pranjal Kaura on 23 Nov 2021
Hey Sushma,
Thank you for bringing this up. The concerned team is looking into this issue and will try to roll it out in a future release.
For now, you can compute these metrics from the confusion matrix. You can refer to this link.
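As a rough sketch (my own illustration, not the exact code behind that link), assuming trueLabels and predLabels are vectors of the true and predicted class labels, the per-class metrics can be derived from the confusion matrix like this:

% Sketch: per-class metrics from a confusion matrix.
% Assumes trueLabels and predLabels are vectors of the same length.
C = confusionmat(trueLabels, predLabels);   % rows = true class, columns = predicted class
TP = diag(C);                               % true positives per class
FP = sum(C, 1)' - TP;                       % false positives per class
FN = sum(C, 2) - TP;                        % false negatives per class
TN = sum(C(:)) - TP - FP - FN;              % true negatives per class
precision   = TP ./ (TP + FP);
recall      = TP ./ (TP + FN);              % also called sensitivity
specificity = TN ./ (TN + FP);
f1          = 2 * (precision .* recall) ./ (precision + recall);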
Hope this helps!
  2 comments
Sushma TV on 25 Nov 2021
Thanks, Pranjal. I went through the link that you sent, but I have a doubt about plotting precision and recall. Computing the values from the confusion matrix was possible, but I could not figure out the plots. What are the arguments of the function perfcurve for plotting a Precision-Recall curve?
Pranjal Kaura on 26 Nov 2021
'perfcurve' is used to plot performance curves from classifier outputs. To plot a Precision-Recall curve, you can set 'XCrit' (the criterion used to compute X) to 'reca' and 'YCrit' to 'prec', so that X is recall and Y is precision. You can refer to the following code snippet:
% X is recall and Y is precision at each score threshold
[X, Y] = perfcurve(labels, scores, posclass, 'XCrit', 'reca', 'YCrit', 'prec');
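For example (assuming labels holds the true class labels, scores the classifier scores for the positive class, and posclass the name of the positive class), the returned X and Y can then be plotted directly:

plot(X, Y)
xlabel('Recall')
ylabel('Precision')
title('Precision-Recall curve')
grid on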



Version

R2020b
