A confusion matrix is a table used to evaluate the performance of a classification model. It visualizes the model's performance by comparing predicted labels with actual labels.

Info

Predicted represents the application's evaluation results.

Actual represents the user-defined correct results.

In Pekat, most of the AI modules provide a Confusion Matrix for the trained model (if the model was trained with Pekat version 3.18 or newer). The modules where the Confusion Matrix is available are:

However, the Confusion Matrix varies quite a lot between the different modules, so below we go through how it works in each of them.

There is also a Confusion Matrix available in the Report display, where you can consolidate the results of the evaluation flow and see how well the overall flow detects defects. The Confusion Matrix from the Report display is explained first.

Report Confusion Matrix

[Image: Report confusion matrix]

  • True positive (TP) - A user classified the image as OK and the application evaluated the image as OK.

  • False positive (FP) - A user classified the image as NG but the application evaluated the image as OK.

  • False negative (FN) - A user classified the image as OK but the application evaluated the image as NG.

  • True negative (TN) - A user classified the image as NG and the application evaluated the image as NG.

The matrix shows how many images ended up in each of those categories.
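
As a rough illustration of that counting, here is a minimal sketch; the "ok"/"ng" label strings and the function name are assumptions for this example, not Pekat's API:

```python
# Minimal sketch: count TP/FP/FN/TN for a binary OK/NG evaluation.
# The "ok"/"ng" labels and this function are illustrative assumptions;
# Pekat computes these counts internally.

def report_confusion_counts(actual, predicted):
    """Count the four categories from parallel lists of 'ok'/'ng' labels."""
    counts = {"TP": 0, "FP": 0, "FN": 0, "TN": 0}
    for a, p in zip(actual, predicted):
        if a == "ok" and p == "ok":
            counts["TP"] += 1  # user OK, application OK
        elif a == "ng" and p == "ok":
            counts["FP"] += 1  # user NG, application OK
        elif a == "ok" and p == "ng":
            counts["FN"] += 1  # user OK, application NG
        else:
            counts["TN"] += 1  # user NG, application NG
    return counts

# Example: one defective image is evaluated as OK (a false positive)
print(report_confusion_counts(
    actual=["ok", "ng", "ok", "ng"],
    predicted=["ok", "ok", "ok", "ng"],
))  # {'TP': 2, 'FP': 1, 'FN': 0, 'TN': 1}
```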

Info

When you click on a specific cell of the confusion matrix, the images that ended up in that category will be filtered in the image list on the right.
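
In other words, a cell's filter simply selects the images whose (Actual, Predicted) pair matches that cell. A minimal sketch of the idea (the file names and labels are made up for illustration):

```python
# Minimal sketch: filtering the images that belong to one matrix cell.
# The file names and labels are illustrative assumptions.
results = [
    ("img1.png", "ok", "ok"),  # (image, actual, predicted)
    ("img2.png", "ng", "ok"),
    ("img3.png", "ng", "ng"),
]

def images_in_cell(actual, predicted):
    """Return the images whose (actual, predicted) pair matches the cell."""
    return [name for name, a, p in results if (a, p) == (actual, predicted)]

print(images_in_cell("ng", "ok"))  # ['img2.png'] - the false positive
```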

Module Confusion Matrix

[Image: Module confusion matrix]

The module matrix works the same way but can contain multiple classes (as defined by the current classifier). As with the report matrix, you can click a cell to filter the corresponding images in the image list on the right.

The number in a matrix cell might not match the number of filtered images. The cells count annotations, not images, so if an image contains more than one annotation, the number in the cell will be higher than the number of filtered images.

As before, the table compares the ground truth (user-defined, called Actual in the Report display) with the predicted (application evaluation) results and shows the corresponding counts, as sketched below.
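
A minimal sketch of that multi-class, per-annotation counting (the class names and data structures are assumptions for illustration, not Pekat internals):

```python
# Minimal sketch: build a multi-class confusion matrix from annotations.
# Each annotation contributes one (actual, predicted) pair, so an image
# with several annotations adds to several cells - cells count annotations,
# not images. Class names and data are illustrative assumptions.
from collections import Counter

def module_confusion_matrix(annotations, classes):
    """annotations: iterable of (actual_class, predicted_class) pairs."""
    counts = Counter(annotations)
    # rows = actual (ground truth), columns = predicted
    return [[counts[(a, p)] for p in classes] for a in classes]

classes = ["scratch", "dent", "background"]
annotations = [
    ("scratch", "scratch"),  # image 1, annotation 1
    ("scratch", "dent"),     # image 1, annotation 2 (same image, second cell)
    ("dent", "dent"),        # image 2
]
for cls, row in zip(classes, module_confusion_matrix(annotations, classes)):
    print(cls, row)
# scratch [1, 1, 0]
# dent [0, 1, 0]
# background [0, 0, 0]
```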

Clicking Export creates and downloads an .xlsx file with the same values as shown in the table.

You can also add annotations in the LABELING section and then click the “RECALC MATRIX” button. This recalculates the confusion matrix without retraining the model. You can find more in the Recalculating Matrix section of this page.

Recalculating Matrix

If you want to see in more detail how a trained model detects or classifies defects, you can go back into labeling, annotate more images, and press the “RECALC STATISTICS” button. This updates the confusion matrix with the data from the new annotations without the need to retrain the model.
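
Conceptually, recalculating only re-counts the stored predictions against the updated user annotations; the model itself is untouched. A minimal sketch of that idea (all names and labels are assumptions for illustration):

```python
# Minimal sketch: recalculating reuses cached predictions and re-counts
# them against updated annotations, so no retraining is needed. All names
# and labels are illustrative assumptions.
from collections import Counter

stored_predictions = {"img1.png": "ok", "img2.png": "ng", "img3.png": "ok"}

def recalc_matrix(updated_actuals):
    """Re-count (actual, predicted) pairs using the cached predictions."""
    return Counter(
        (updated_actuals[name], pred)
        for name, pred in stored_predictions.items()
    )

# After the user re-annotates img3.png as NG, only the counts change:
print(recalc_matrix({"img1.png": "ok", "img2.png": "ng", "img3.png": "ng"}))
# Counter({('ok', 'ok'): 1, ('ng', 'ng'): 1, ('ng', 'ok'): 1})
```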
