Use of different metrics for Classification Toolbox

I am using the classification toolbox. I have multiple candidate models, but they are compared only on accuracy. I have access to the confusion matrix, so it seems I have all the data needed for precision and recall. Is there a way to either get those metrics directly, or to get the data from the confusion matrix so I can calculate them myself?
Thanks in advance for your help.
  2 Comments
Image Analyst on 2 Mar 2025
Not sure where you got the classification toolbox. Who wrote it? Do you mean the Classification Learner app on the Apps tab of the tool ribbon? The Classification Learner app requires the Statistics and Machine Learning Toolbox. Is that what you mean? There is a confusion matrix function: confusionmat
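For example (a minimal sketch, assuming trueLabels and predictedLabels are vectors of true and predicted class labels; the names are illustrative):
C = confusionmat(trueLabels, predictedLabels)   % C(i,j) = observations of true class i predicted as class j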
Fatih on 6 Mar 2025
Sorry, my mistake. You are right, I mean the app. I can retrieve the accuracy and the confusion matrix in the Test tab, but I don't have metrics such as recall or precision.


Accepted Answer

Drew on 3 Mar 2025
Edited: Drew on 3 Mar 2025
There are many ways to obtain precision and recall scores when working with classification and the Statistics and Machine Learning Toolbox. Consider also whether you are interested in per-class precision and recall, or in one of the types of averaged precision and recall.
Two approaches:
(1) Use the Classification Learner app
(2) If using the command line, one can use the confusionchart or rocmetrics functions.
(1) Using the Classification Learner app:
In R2024b or later, the Classification Learner app includes Precision, Recall, and F1 Score metrics, calculated on the validation data, in the "Additional Training Results" section of the Model Summary. Here is an example for the default tree model built on the Fisher iris data.
The per-class Recall (also called True Positive Rate or "TPR") can be seen in the confusion matrix view in the app, where it is shown in the TPR summary column on the right:
The per-class Precision (also called Positive Predictive Value, or "PPV") can be seen in the confusion matrix view in the app, where it is shown in the PPV summary row on the bottom:
The Classification Learner app includes a Precision-Recall plot. The rocmetrics object, which contains this information, can be exported from the app using "Export Plot Data".
In the Classification Learner app Results Table, additional metrics can be added, including various types of averaged precision, recall, and F1 score. This table and the associated visualizations (see the "Compare Results" plot) can be used to compare across models.
If you would like to output the confusion matrix directly from the app, use the "Export Plot Data" option while viewing the confusion matrix. You can then calculate additional metrics yourself from the confusion matrix, or use confusionchart to get a visual view with summaries.
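For example, per-class and macro-averaged metrics can be computed directly from the exported matrix. A minimal sketch, assuming C is the square confusion matrix (rows = true classes, columns = predicted classes); the variable names are illustrative:
recall    = diag(C) ./ sum(C,2);    % per-class recall (TPR): correct / actual count per class
precision = diag(C) ./ sum(C,1)';   % per-class precision (PPV): correct / predicted count per class
f1        = 2*(precision.*recall) ./ (precision + recall);   % per-class F1 score
macroRecall    = mean(recall);      % macro-averaged recall
macroPrecision = mean(precision);   % macro-averaged precision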
(2) Using the command line:
If using the command line, one can use the confusionchart or rocmetrics functions. If only interested in the model operating point, use confusionchart on the confusion matrix. See the images above for where the per-class Recall=TPR and Precision=PPV appear in the confusionchart with column-normalized and row-normalized summaries:
% The row-normalized row summary (right) shows per-class recall (TPR); the column-normalized column summary (bottom) shows per-class precision (PPV)
confusionchart(ConfusionMatrix, ClassLabels, ColumnSummary='column-normalized', RowSummary='row-normalized')
The rocmetrics function can be used to get precision, recall, and F1 information across a range of operating points.
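As a rough sketch of that workflow (assuming a trained classifier mdl and test data XTest with true labels YTest; these names are illustrative, not something exported by the app):
[~,scores] = predict(mdl, XTest);                 % class scores for each test observation
rocObj = rocmetrics(YTest, scores, mdl.ClassNames, ...
    AdditionalMetrics="PositivePredictiveValue"); % add precision (PPV) alongside the default TPR/FPR
rocObj.Metrics                                    % per-class metrics table across thresholds
plot(rocObj, XAxisMetric="TruePositiveRate", ...  % precision-recall style curve (recall on x, precision on y)
    YAxisMetric="PositivePredictiveValue")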
If this answer helps you, please remember to accept the answer.
