A confusion matrix is a table used to evaluate the performance of a classification model. It visualizes that performance by comparing predicted labels with actual labels.
| Info |
|---|
| Predicted represents the application results. Actual represents the user-defined correct results. |
In Pekat, most of the AI modules provide a Confusion Matrix for the trained model (if that model was trained using Pekat version 3.18 or newer). However, the Confusion Matrix varies quite a lot between the individual modules, so below we go through how it works in each of them.
There is also a Confusion Matrix available in the Report display, where you can consolidate the results of the evaluation flow and see how well the overall flow detects defects. The Confusion Matrix from the Report display is explained first.
Report Confusion Matrix
- True positive (TP) - A user classified the image as OK and the application evaluated the image as OK.
- False positive (FP) - A user classified the image as NG but the application evaluated the image as OK.
- False negative (FN) - A user classified the image as OK but the application evaluated the image as NG.
- True negative (TN) - A user classified the image as NG and the application evaluated the image as NG.
The matrix shows how many images ended up in each of those categories.
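To make the mapping concrete, here is a minimal Python sketch (not part of Pekat; the statuses and data are purely illustrative) that counts the four categories from lists of actual and predicted OK/NG statuses:

```python
from collections import Counter

def report_confusion_matrix(actual, predicted):
    """Count TP/FP/FN/TN for OK/NG statuses.

    Following the definitions above, 'positive' means OK:
    TP = actual OK / predicted OK,   FP = actual NG / predicted OK,
    FN = actual OK / predicted NG,   TN = actual NG / predicted NG.
    """
    counts = Counter()
    for a, p in zip(actual, predicted):
        if a == "OK" and p == "OK":
            counts["TP"] += 1
        elif a == "NG" and p == "OK":
            counts["FP"] += 1
        elif a == "OK" and p == "NG":
            counts["FN"] += 1
        else:  # actual NG, predicted NG
            counts["TN"] += 1
    return counts

# Illustrative data: user-defined (actual) vs. application (predicted) statuses
actual = ["OK", "OK", "NG", "NG", "OK"]
predicted = ["OK", "NG", "OK", "NG", "OK"]
print(report_confusion_matrix(actual, predicted))
# Counter({'TP': 2, 'FN': 1, 'FP': 1, 'TN': 1})
```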
| Info |
|---|
| When you click on a specific cell of the confusion matrix, the images that ended up in that category are filtered in the image list on the right. |
Module Confusion Matrix
The module matrix works the same way, but it can contain multiple different classes (as defined in the current classifier). As with the report matrix, you can click on a cell to filter the corresponding images in the image list on the right.
The number in a matrix cell might not correspond to the number of filtered images. The cells count annotations rather than images, so if an image has more than one annotation, the number in the cell will be higher than the number of images filtered.
Again, the table compares the ground truth (user-defined, called Actual in the Report display) with the predicted (application evaluation) results and shows the corresponding counts.
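The per-annotation counting can be sketched in a few lines of Python. This is not Pekat's actual implementation; the class names and data below are hypothetical and only illustrate why a cell value can exceed the number of filtered images:

```python
from collections import defaultdict

def module_confusion_matrix(annotation_pairs):
    """Count (actual class, predicted class) pairs per annotation.

    Each element of annotation_pairs is one annotation, so an image with
    several annotations contributes several counts -- which is why a cell
    value can be higher than the number of images it filters.
    """
    matrix = defaultdict(int)
    for actual_cls, predicted_cls in annotation_pairs:
        matrix[(actual_cls, predicted_cls)] += 1
    return dict(matrix)

# Illustrative classes; the first two annotations belong to the same image
annotations = [
    ("scratch", "scratch"),
    ("scratch", "dent"),
    ("dent", "dent"),
]
print(module_confusion_matrix(annotations))
# {('scratch', 'scratch'): 1, ('scratch', 'dent'): 1, ('dent', 'dent'): 1}
```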
Clicking on Export creates and downloads an .xlsx file with the same values as shown in the table.
You can also add annotations in the LABELING section and then click the RECALC MATRIX button. This recalculates the confusion matrix without the need to retrain the model. You can find more in the Recalculating Matrix section of this page.
Recalculating Matrix
If you want to better see how a trained model detects or classifies the defects, you can go back into labeling, annotate more images, and press the RECALC STATISTICS button. This updates the confusion matrix with the data from the new annotations without the need to retrain the model.
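Conceptually, no retraining is needed because the model's predictions are already stored from the last evaluation; adding annotations only changes the ground-truth side of each comparison, so the matrix just has to be recounted. The following Python sketch illustrates the idea (the image IDs, class names, and data structures are hypothetical, not Pekat's internal format):

```python
def recalc_matrix(stored_predictions, updated_annotations):
    """Rebuild the confusion matrix from stored predictions and the
    current (possibly newly added) ground-truth annotations.

    The model is not involved: its predictions were already computed
    during evaluation, so adding annotations only changes the 'actual'
    side of each pair and the matrix can simply be recounted.
    """
    matrix = {}
    for image_id, actual_cls in updated_annotations.items():
        predicted_cls = stored_predictions.get(image_id)
        if predicted_cls is None:
            continue  # no stored prediction for this image yet
        key = (actual_cls, predicted_cls)
        matrix[key] = matrix.get(key, 0) + 1
    return matrix

# Illustrative data with hypothetical image IDs
predictions = {"img_001": "OK", "img_002": "NG"}
annotations = {"img_001": "OK", "img_002": "OK", "img_003": "NG"}
print(recalc_matrix(predictions, annotations))
# {('OK', 'OK'): 1, ('OK', 'NG'): 1}   (img_003 is skipped: no prediction)
```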