A confusion matrix is a table used to evaluate the performance of a classification model. It visualizes that performance by comparing predicted labels with actual labels.

In Pekat there are two variations:

  • Overall evaluation confusion matrix

  • Classifier-specific confusion matrix

Info

Predicted represents the application's results.

Actual represents the user-defined correct results.

Evaluation confusion matrix

(Image: evaluation confusion matrix, confusion show.png)

  • True positive (TP) - A user classified the image as OK and the application evaluated the image as OK.

  • False positive (FP) - A user classified the image as NG but the application evaluated the image as OK.

  • False negative (FN) - A user classified the image as OK but the application evaluated the image as NG.

  • True negative (TN) - A user classified the image as NG and the application evaluated the image as NG.

The matrix shows how many images ended up in each of those categories.
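The four categories above can be tallied with a short sketch. This is not PEKAT code, just an illustration assuming each image carries a user-defined ("actual") and an application-produced ("predicted") OK/NG label:

```python
def confusion_counts(actual, predicted):
    """Return (TP, FP, FN, TN) for paired lists of "OK"/"NG" labels.

    actual    -- user-defined correct results
    predicted -- application results
    """
    tp = fp = fn = tn = 0
    for a, p in zip(actual, predicted):
        if a == "OK" and p == "OK":
            tp += 1          # true positive
        elif a == "NG" and p == "OK":
            fp += 1          # false positive
        elif a == "OK" and p == "NG":
            fn += 1          # false negative
        else:
            tn += 1          # true negative (NG / NG)
    return tp, fp, fn, tn

actual    = ["OK", "OK", "NG", "NG", "OK"]
predicted = ["OK", "NG", "OK", "NG", "OK"]
print(confusion_counts(actual, predicted))  # (2, 1, 1, 1)
```

Each cell of the evaluation confusion matrix is simply one of these four counts.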

Info

When you click on a specific cell of the confusion matrix, the images that ended up in that category will be shown in the image list on the right.

Classifier confusion matrix

(Image: classifier confusion matrix)

This matrix works the same way but can contain multiple classes (those of the current classifier). In the image above there are four classes: Screw, Nut, Spring, Pad.

Again, the table relates the actual (user-defined) results to the predicted (application-evaluated) results and shows the number of images for each combination.
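The same tally generalizes to any number of classes. A minimal sketch (again, not PEKAT code), using the example class names from the screenshot:

```python
from collections import Counter

def confusion_matrix(actual, predicted, classes):
    """Return counts[a][p]: the number of images with actual label a
    that the application predicted as p."""
    pairs = Counter(zip(actual, predicted))
    return {a: {p: pairs[(a, p)] for p in classes} for a in classes}

classes   = ["Screw", "Nut", "Spring", "Pad"]
actual    = ["Screw", "Nut", "Nut", "Spring", "Pad"]
predicted = ["Screw", "Nut", "Spring", "Spring", "Pad"]
m = confusion_matrix(actual, predicted, classes)
print(m["Nut"]["Spring"])  # 1 -- one Nut was predicted as a Spring
```

Diagonal cells (`m[c][c]`) are correct classifications; off-diagonal cells are misclassifications.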

Clicking on Export creates and downloads an .xlsx file with the same values as shown in the table.