Confusion Matrix

A confusion matrix is a table used to evaluate the performance of a classification model. It visualizes that performance by comparing the predicted labels with the actual labels.

In Pekat there are two variations:

  • Overall evaluation confusion matrix

  • Classifier-specific confusion matrix

Predicted represents the results produced by the application.

Actual represents the correct results defined by the user.

Evaluation confusion matrix


[Image: confusion show.png — an evaluation confusion matrix with TP = 1, FP = 1, FN = 2, TN = 1]
  • True positive (TP) - The user classified the image as OK and the application also evaluated it as OK.

  • False positive (FP) - The user classified the image as NG but the application evaluated it as OK.

  • False negative (FN) - The user classified the image as OK but the application evaluated it as NG.

  • True negative (TN) - The user classified the image as NG and the application also evaluated it as NG.

The matrix shows how many images ended up in each of those categories.
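To make the four categories concrete, here is a minimal sketch in plain Python (not Pekat code) that counts the cells from paired labels; the example pairs are made up so that the counts match the figure above (TP = 1, FP = 1, FN = 2, TN = 1):

```python
from collections import Counter

# Hypothetical (actual, predicted) label pairs; OK is treated as the positive class.
pairs = [
    ("OK", "OK"),  # TP: user said OK, application said OK
    ("NG", "OK"),  # FP: user said NG, application said OK
    ("OK", "NG"),  # FN: user said OK, application said NG
    ("OK", "NG"),  # FN
    ("NG", "NG"),  # TN: user said NG, application said NG
]

def category(actual, predicted):
    """Map one (actual, predicted) pair to its confusion matrix cell."""
    if predicted == "OK":
        return "TP" if actual == "OK" else "FP"
    return "FN" if actual == "OK" else "TN"

counts = Counter(category(a, p) for a, p in pairs)
print(counts["TP"], counts["FP"], counts["FN"], counts["TN"])  # 1 1 2 1
```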

When you click on a specific cell of the confusion matrix, the images that ended up in that category will be shown in the image list on the right.
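Conceptually, selecting a cell filters the images by their (actual, predicted) pair; a rough sketch of that grouping, with made-up image names:

```python
from collections import defaultdict

# Hypothetical records: (image name, actual label, predicted label).
records = [
    ("img_001.png", "OK", "OK"),
    ("img_002.png", "NG", "OK"),
    ("img_003.png", "OK", "NG"),
]

images_by_cell = defaultdict(list)
for name, actual, predicted in records:
    images_by_cell[(actual, predicted)].append(name)

# Selecting the false-negative cell (actual OK, predicted NG):
print(images_by_cell[("OK", "NG")])  # ['img_003.png']
```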

Classifier confusion matrix

[Image: confusion matrix.png — a classifier confusion matrix with the classes Screw, Nut, Spring and Pad]

This matrix works the same way, but it can contain several classes (as defined by the current classifier). In the image above there are four classes: Screw, Nut, Spring and Pad.

Again, the table compares the actual (user-defined) results with the predicted (application-evaluated) results and shows the corresponding counts.
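As a sketch of how such a multi-class table is tallied (plain Python; the class names come from the example above, the pairs are made up):

```python
# Hypothetical (actual, predicted) pairs over the four example classes.
pairs = [
    ("Screw", "Screw"),
    ("Nut", "Nut"),
    ("Spring", "Pad"),  # one misclassified image
    ("Pad", "Pad"),
]

classes = ["Screw", "Nut", "Spring", "Pad"]
# Rows are actual (user-defined) classes, columns are predicted classes.
matrix = {actual: {predicted: 0 for predicted in classes} for actual in classes}
for actual, predicted in pairs:
    matrix[actual][predicted] += 1

for actual in classes:
    print(actual, [matrix[actual][predicted] for predicted in classes])
# Screw [1, 0, 0, 0]
# Nut [0, 1, 0, 0]
# Spring [0, 0, 0, 1]
# Pad [0, 0, 0, 1]
```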

Clicking Export creates and downloads an .xlsx file with the same values as shown in the table.
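Pekat produces this file itself; purely for reference, an equivalent spreadsheet could be written with pandas (a sketch assuming the openpyxl package is installed, not Pekat's own implementation):

```python
import pandas as pd

# Hypothetical counts laid out like the table:
# rows = actual classes, columns = predicted classes.
classes = ["Screw", "Nut", "Spring", "Pad"]
counts = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 0, 1],
]

pd.DataFrame(counts, index=classes, columns=classes).to_excel("confusion_matrix.xlsx")
```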