A confusion matrix is a table used to evaluate the performance of a classification model. It visualizes performance by comparing predicted labels with actual labels.

In Pekat there are two variations:

  • Overall evaluation confusion matrix

  • Classifier-specific confusion matrix

Info

Predicted represents the application results

Actual represents the user-defined correct results

...

In Pekat, most of the AI modules provide a Confusion Matrix for the trained model (if the model was trained with Pekat version 3.18 or newer). The modules where the Confusion Matrix is available are:

However, the Confusion Matrix varies considerably between modules, and here we will go through how it works in each of them.

There is also a Confusion Matrix available in the Report display, where you can consolidate the results of the evaluation flow and see how well the overall flow detects defects. The Confusion Matrix from the Report display is explained first.

Report Confusion Matrix

...

  • True positive (TP) - The user classified the image as OK and the application evaluated it as OK.

  • False positive (FP) - The user classified the image as NG but the application evaluated it as OK.

  • False negative (FN) - The user classified the image as OK but the application evaluated it as NG.

  • True negative (TN) - The user classified the image as NG and the application evaluated it as NG.
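
Pekat's internal implementation isn't public; the following is a minimal sketch of how the four categories above could be tallied, assuming OK/NG string labels where OK plays the role of the positive class:

```python
from collections import Counter

# Hypothetical evaluation data: (actual, predicted) pairs, where "actual"
# is the user annotation and "predicted" is the application result.
results = [
    ("OK", "OK"), ("OK", "NG"), ("NG", "NG"),
    ("NG", "OK"), ("OK", "OK"), ("NG", "NG"),
]

counts = Counter()
for actual, predicted in results:
    if actual == "OK" and predicted == "OK":
        counts["TP"] += 1  # true positive: both agree on OK
    elif actual == "NG" and predicted == "OK":
        counts["FP"] += 1  # false positive: application wrongly says OK
    elif actual == "OK" and predicted == "NG":
        counts["FN"] += 1  # false negative: application wrongly says NG
    else:
        counts["TN"] += 1  # true negative: both agree on NG

print(dict(counts))  # → {'TP': 2, 'FN': 1, 'TN': 2, 'FP': 1}
```

The four counts are exactly the four cells of the 2×2 report matrix.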

...

Info

When you click on a specific cell of the confusion matrix, the images that ended up in that category will be shown filtered in the image list on the right.

...

Module confusion matrix

...

The module matrix works the same way but can contain multiple classes (as defined by the current classifier). As with the report matrix, you can click a cell to filter the corresponding images in the image list on the right. In the image above you can see 4 classes: Screw, Nut, Spring, Pad

The number in a matrix cell might not match the number of filtered images. The cells count annotations, not images, so if you have more than one annotation per image, the number in the cell will be higher than the number of filtered images.
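
The annotation-vs-image distinction can be illustrated with a small sketch (the data and cell key below are hypothetical, not Pekat's actual data model):

```python
# Hypothetical data: per-image lists of (actual, predicted) class pairs,
# one pair per annotation.
images = {
    "img_001.png": [("Screw", "Screw"), ("Screw", "Screw")],  # 2 annotations
    "img_002.png": [("Screw", "Nut")],                        # 1 annotation
    "img_003.png": [("Screw", "Screw")],                      # 1 annotation
}

cell = ("Screw", "Screw")  # actual = Screw, predicted = Screw

# The matrix cell counts annotations...
annotation_count = sum(anns.count(cell) for anns in images.values())
# ...while the filtered image list counts images containing at least one
# annotation that falls into that cell.
image_count = sum(1 for anns in images.values() if cell in anns)

print(annotation_count, image_count)  # → 3 2
```

Here the cell shows 3, but clicking it filters only 2 images, because one image contributes two annotations.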

As before, the table compares the actual results (the user-defined ground truth, called Actual in the Report display) with the predicted results (the application evaluation) and shows the corresponding counts.

Clicking on Export creates and downloads an .xlsm file with the same values as shown in the table.

You can also add annotations in the LABELING section and then click the “RECALC MATRIX” button. This recalculates the confusion matrix without retraining the model. You can find more details in the Recalculating Matrix section of this page.

The sections below explain the differences in how the confusion matrix works in each module where it is available.

Recalculating Matrix

If you want to see in more detail how a trained model detects or classifies defects, you can go back into Labeling, annotate more images, and press the “RECALC STATISTICS“ button. This updates the confusion matrix with the data from the new annotations without retraining the model.

...

Note

The “RECALC MATRIX“ function isn’t available in the Surface Detection and OCR modules.

Anomaly Detector confusion matrix

In the Anomaly Detector module, the confusion matrix is directly visible without clicking any additional buttons:

...

Since the Anomaly Detector only splits images between OK and NG, the confusion matrix has only those two classes.

Info

In the Anomaly Detector module, the “RECALC STATISTICS” button updates not only the confusion matrix, but also the threshold value.

Classifier confusion matrix

The classifier confusion matrix can be accessed by clicking on the “CONFUSION MATRIX“ button:

...

The classifier confusion matrix looks like this and works exactly as described in the Module confusion matrix section.

...

Detector confusion matrix

The detector confusion matrix works much like the classifier confusion matrix, but it has one extra category: misdetections, where the model detects a defect even though there is none in the annotations. If classification is turned off, the confusion matrix behaves like a matrix with a single unnamed class.

...

A detection is considered correct (it corresponds to one of the green cells in the confusion matrix) if at least half of the detected rectangle overlaps the annotated rectangle (and, if classification is turned on, the detected rectangle must also have the same class as the annotation to be considered correct).
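
The half-overlap rule can be sketched as follows; the rectangle format (x1, y1, x2, y2) and the function names are assumptions for illustration, not Pekat's actual API:

```python
def overlap_area(a, b):
    """Intersection area of two axis-aligned rectangles (x1, y1, x2, y2)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)

def rect_area(r):
    return (r[2] - r[0]) * (r[3] - r[1])

def detection_correct(detected, annotated, det_class=None, ann_class=None):
    """At least half of the detected rectangle must overlap the annotation;
    with classification turned on, the classes must match as well."""
    if det_class is not None and det_class != ann_class:
        return False
    return overlap_area(detected, annotated) >= 0.5 * rect_area(detected)

# Detected rectangle fully inside the annotation -> correct.
print(detection_correct((10, 10, 20, 20), (0, 0, 30, 30)))  # → True
# Only a fifth of the detected rectangle overlaps -> not correct.
print(detection_correct((0, 0, 20, 20), (16, 0, 40, 20)))   # → False
```

Note that the rule is asymmetric: the overlap is measured against the detected rectangle's area, not the annotation's.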

You can compare the annotated rectangles with the detected ones in the Inspection tab by ticking the Show Annotations button (or in the Labeling tab by switching Active Annotations on).

...

For more details on how the confusion matrix works, see the Module confusion matrix section.

Surface Detector confusion matrix

Again, the Surface Detector confusion matrix can be accessed by clicking on the “CONFUSION MATRIX“ button:

...

It looks like this and works just like the detector confusion matrix, except that the Surface Detector doesn’t provide the “RECALC MATRIX” function in the Labeling tab:

...

OCR confusion matrix

The OCR confusion matrix is very different from the rest of the modules. First, it is accessed by clicking the “MODEL STATISTICS” button:

...

Second, it is more of a table than a matrix: each row corresponds to one type of annotated text, and there are three columns:

  • Detection column - shows the text that is supposed to be detected (annotated by the user)

  • OK column - shows how many times the text was detected correctly

  • NG column - shows how many times the text was not detected correctly
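
A rough sketch of how such a table could be built up; the sample texts and the comparison rule (exact string match) are assumptions for illustration:

```python
from collections import defaultdict

# Hypothetical OCR results: (annotated_text, detected_text) pairs.
reads = [
    ("LOT-123", "LOT-123"),   # correct read
    ("LOT-123", "LOT-128"),   # misread
    ("EXP 2025", "EXP 2025"),
    ("EXP 2025", "EXP 2025"),
]

# One row per annotated text, with OK/NG counts.
table = defaultdict(lambda: {"OK": 0, "NG": 0})
for annotated, detected in reads:
    table[annotated]["OK" if detected == annotated else "NG"] += 1

for text, row in table.items():
    print(f"{text:10s} OK={row['OK']} NG={row['NG']}")
```

Each row accumulates how often its annotated text was read correctly versus incorrectly, mirroring the Detection / OK / NG columns above.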

Info

By clicking a number in either the OK or the NG column, the images with those annotations are filtered in the image list on the right side of the screen.

Note

The OCR module doesn’t have the recalculate matrix functionality.

...