Confusion Matrix

A confusion matrix is a table to evaluate the performance of a classification model. It allows visualization of the performance by comparing predicted labels with actual labels.

  • Predicted represents the application results

  • Actual represents the user-defined correct results

In Pekat, most of the AI modules provide a Confusion Matrix for the trained model (if that model was trained using Pekat version 3.18 or newer). The modules where the Confusion Matrix is available are:

  • Anomaly Detector

  • Classifier

  • Detector

  • Surface Detector

  • OCR

However, the Confusion Matrix varies quite a lot between the different modules, so here we will go through how it works in each of them.

There is also a Confusion Matrix available in the Report display, where you can consolidate the results of the evaluation flow and see how well the overall flow detects the defects. The Confusion Matrix from the Report display will be explained first.

Report Confusion Matrix

[Report confusion matrix example: TP=1; FP=1; FN=2; TN=1]
  • True positive (TP) - The user classified the image as OK and the application evaluated the image as OK.

  • False positive (FP) - The user classified the image as NG but the application evaluated the image as OK.

  • False negative (FN) - The user classified the image as OK but the application evaluated the image as NG.

  • True negative (TN) - The user classified the image as NG and the application evaluated the image as NG.

The matrix shows how many images ended up in each of those categories.
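As a minimal sketch (not Pekat code; the labels below are made up), the four counts can be derived from paired actual and predicted labels, with OK treated as the positive class:

```python
# Minimal sketch: counting TP/FP/FN/TN for a two-class (OK/NG) evaluation.
# "actual" are the user-defined labels, "predicted" the application results.
actual    = ["OK", "NG", "OK", "OK", "NG"]
predicted = ["OK", "OK", "NG", "NG", "NG"]

tp = sum(a == "OK" and p == "OK" for a, p in zip(actual, predicted))  # both OK
fp = sum(a == "NG" and p == "OK" for a, p in zip(actual, predicted))  # evaluated OK, labeled NG
fn = sum(a == "OK" and p == "NG" for a, p in zip(actual, predicted))  # evaluated NG, labeled OK
tn = sum(a == "NG" and p == "NG" for a, p in zip(actual, predicted))  # both NG

print(tp, fp, fn, tn)  # 1 1 2 1 - the same counts as in the example above
```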

When you click on a specific cell of the confusion matrix, the images that ended up in that category will be filtered in the image list on the right.

Module confusion matrix


The module confusion matrix works the same way but can contain multiple different classes (as defined in the current classifier). As with the report matrix, you can click on a cell to filter the corresponding images in the image list on the right.

The number in a matrix cell might not match the number of filtered images. The cells count annotations, not images, so if some images contain more than one annotation, the number in the cell will be higher than the number of filtered images.
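A small sketch (with hypothetical data, not Pekat code) of why the two numbers can differ:

```python
# The cell counts annotations; the image list shows distinct images.
annotations = [
    {"image": "img_01.png", "actual": "scratch", "predicted": "scratch"},
    {"image": "img_01.png", "actual": "scratch", "predicted": "scratch"},  # second annotation on the same image
    {"image": "img_02.png", "actual": "scratch", "predicted": "scratch"},
]

cell = [a for a in annotations
        if a["actual"] == "scratch" and a["predicted"] == "scratch"]

print(len(cell))                         # 3 -> number shown in the matrix cell
print(len({a["image"] for a in cell}))   # 2 -> number of images after filtering
```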

As before, the table compares the ground truth (user-defined, called Actual in the Report display) with the predicted (application evaluation) results and shows the corresponding counts.

Clicking on Export creates and downloads a .xlsm file with the same values as shown in the table.

You can also add annotations in the LABELING section and then click the “RECALC MATRIX” button. This recalculates the confusion matrix without retraining the model. You can find more in the Recalculating Matrix section of this page.

The sections below explain the differences in how the confusion matrix works in each module where it is available.

Recalculating Matrix

If you want to see in more detail how a trained model detects or classifies defects, you can go back into labeling, annotate more images, and press the “RECALC STATISTICS” button. This updates the confusion matrix with the data from the new annotations without retraining the model.
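Conceptually (this is only a sketch of the idea, not Pekat’s internals), recalculation is possible because the model’s predictions are already known, so only the counting has to be redone against the updated annotations:

```python
from collections import Counter

def recalc_matrix(stored_predictions, updated_annotations):
    """Re-count (actual, predicted) pairs without retraining the model."""
    matrix = Counter()
    for image, predicted in stored_predictions.items():
        actual = updated_annotations.get(image)
        if actual is not None:              # only annotated images contribute
            matrix[(actual, predicted)] += 1
    return matrix

# Hypothetical data: img_03 was annotated only after training.
predictions = {"img_01.png": "OK", "img_02.png": "NG", "img_03.png": "NG"}
annotations = {"img_01.png": "OK", "img_03.png": "OK"}

print(recalc_matrix(predictions, annotations))
# Counter({('OK', 'OK'): 1, ('OK', 'NG'): 1})
```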

 

 

Anomaly Detector confusion matrix

In the Anomaly Detector, the confusion matrix is directly visible without clicking any additional buttons.

Since the Anomaly Detector only splits images between OK and NG, the confusion matrix only has those two classes.

Classifier confusion matrix

The classifier confusion matrix can be accessed by clicking the “CONFUSION MATRIX” button.

The classifier confusion matrix works exactly as described in the Module confusion matrix section.
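For illustration only (the class names are hypothetical, not taken from Pekat), a multi-class confusion matrix is simply the two-class idea extended to every combination of actual and predicted class:

```python
classes   = ["scratch", "dent", "ok"]
actual    = ["scratch", "dent", "ok", "scratch", "dent"]   # user annotations
predicted = ["scratch", "ok",   "ok", "dent",    "dent"]   # model output

# rows = actual class, columns = predicted class
matrix = {a: {p: 0 for p in classes} for a in classes}
for a, p in zip(actual, predicted):
    matrix[a][p] += 1

for a in classes:
    print(a, [matrix[a][p] for p in classes])
# scratch [1, 1, 0]
# dent    [0, 1, 1]
# ok      [0, 0, 1]
```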

Detector confusion matrix

The detector confusion matrix works much like the classifier confusion matrix, but it has one extra category, misdetections: cases where the model detects a defect where none is annotated. If classification is turned off, the confusion matrix behaves like a matrix with one unnamed class.

A detection is considered correct (it corresponds to one of the green cells in the confusion matrix) if at least half of the detected rectangle overlaps with the annotated rectangle (and, if classification is turned on, the detected rectangle must also have the same class as the annotation to be considered correct).
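A minimal sketch of this overlap rule (an illustration of the rule as described above, not Pekat’s actual implementation; rectangles are given as (x1, y1, x2, y2)):

```python
def detection_is_correct(detected, annotated, det_class=None, ann_class=None):
    """Correct if at least 50 % of the detected rectangle lies inside the
    annotated one and, when classification is enabled, the classes match."""
    dx1, dy1, dx2, dy2 = detected
    ax1, ay1, ax2, ay2 = annotated

    inter_w = max(0, min(dx2, ax2) - max(dx1, ax1))
    inter_h = max(0, min(dy2, ay2) - max(dy1, ay1))
    inter_area = inter_w * inter_h
    det_area = (dx2 - dx1) * (dy2 - dy1)

    overlap_ok = det_area > 0 and inter_area / det_area >= 0.5
    class_ok = det_class is None or det_class == ann_class
    return overlap_ok and class_ok

print(detection_is_correct((0, 0, 10, 10), (5, 0, 20, 10)))  # True  - exactly half overlaps
print(detection_is_correct((0, 0, 10, 10), (8, 0, 20, 10)))  # False - only 20 % overlaps
```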

You can compare the annotated rectangles with the detected ones in the Inspection tab by ticking the Show Annotations button (or in the Labeling tab by switching Active Annotations on).

For more details on how the confusion matrix works, see the Module confusion matrix section.

Surface Detector confusion matrix

Again, the Surface Detector confusion matrix can be accessed by clicking the “CONFUSION MATRIX” button.

It works just like the detector confusion matrix, except that the Surface Detector doesn’t provide the “RECALC MATRIX” function in the Labeling tab.

OCR confusion matrix

The OCR confusion matrix is very different from the rest of the modules. First, it is accessed by clicking the “MODEL STATISTICS” button.

Second, it is more of a table than a matrix, with one row per type of annotated text and three columns (illustrated in the sketch after the list):

  • Detection column - shows the text that is supposed to be detected (annotated by the user)

  • OK column - shows how many times the text was detected correctly

  • NG column - shows how many times the text was not detected correctly
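A small sketch (with made-up texts, not Pekat code) of how such a table can be tallied:

```python
from collections import defaultdict

# (annotated text, detected text) pairs from the evaluated images
results = [
    ("LOT 1234", "LOT 1234"),
    ("LOT 1234", "L0T 1234"),   # misread: zero instead of the letter O
    ("EXP 2025", "EXP 2025"),
]

table = defaultdict(lambda: {"OK": 0, "NG": 0})
for annotated, detected in results:
    table[annotated]["OK" if detected == annotated else "NG"] += 1

for text, counts in table.items():
    print(f"{text}: OK={counts['OK']} NG={counts['NG']}")
# LOT 1234: OK=1 NG=1
# EXP 2025: OK=1 NG=0
```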

 
