Creates a new instance of a confusion matrix.
Confusion matrix labels.
Confusion matrix values.
The rows represent the true classes and the columns the predicted classes.
Normalization history values.
Accuracy gives the fraction of total predictions which were correctly classified.
Formula:
accuracy = (TP + TN) / (TP + TN + FP + FN)
Set of configurations used on accuracy calculations.
[[configuration.label]]: The label name used to calculate the accuracy value. If undefined or null, the accuracy value will be calculated for the whole confusion matrix.
[[configuration.average]]: Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).
[[configuration.average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the accuracy formula.
[[configuration.average.Macro]]: Calculates and sums the accuracy for each individual label and divides by the number of labels.
[[configuration.average.Weighted]]: Defines if the accuracy calculations should be weighted. This means the labels with more predictions will weigh more in the final accuracy value compared with labels with fewer predictions.
The accuracy value.
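For illustration, a minimal TypeScript sketch of the accuracy formula above; the MatrixCounts interface and function name are illustrative, not the library's API.

// Prediction counts for a single label; names are illustrative.
interface MatrixCounts {
    truePositive: number;
    trueNegative: number;
    falsePositive: number;
    falseNegative: number;
}

// accuracy = (TP + TN) / (TP + TN + FP + FN)
function accuracy({ truePositive: tp, trueNegative: tn,
                    falsePositive: fp, falseNegative: fn }: MatrixCounts): number {
    const total = tp + tn + fp + fn;
    return total === 0 ? 0 : (tp + tn) / total;
}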
Applies the F1 Score formula.
The precision value.
The recall value.
Deep clones and returns the current confusion matrix.
The deep cloned confusion matrix object.
Deep copies a given object.
The object to be deep cloned.
The deep cloned object.
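One common way to implement such a deep copy for plain data (a labels array and a numeric matrix) is a JSON round-trip; this sketch is illustrative and assumes the object holds only JSON-serializable values.

// Deep copies a plain data object via a JSON round-trip.
// Functions, Dates and cyclic references would require structuredClone instead.
function deepCopy<T>(object: T): T {
    return JSON.parse(JSON.stringify(object)) as T;
}

const original = { labels: ['Apple', 'Orange'], matrix: [[7, 1], [2, 8]] };
const clone = deepCopy(original);
clone.matrix[0][0] = 99; // the original matrix is unaffected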
F1 Score is the harmonic mean of precision and recall.
Formula:
f1Score = 2 * (precision * recall) / (precision + recall)
Set of configurations used on F1 Score calculations.
[[configuration.label]]: The label name used to calculate the F1 Score value. If undefined or null, the value will be calculated for the whole confusion matrix.
[[configuration.average]]: Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).
[[configuration.average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the F1 Score formula.
[[configuration.average.Macro]]: Calculates and sums the F1 Score for each individual label and divides by the number of labels.
[[configuration.average.Weighted]]: Defines if the F1 Score calculations should be weighted. This means the labels with more predictions will weigh more in the final value compared with labels with fewer predictions.
The F1 Score value.
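A minimal TypeScript sketch of the harmonic mean above, built from the precision and recall formulas given later in this document; the function is illustrative, not the library's API.

// f1Score = 2 * (precision * recall) / (precision + recall)
function f1Score(tp: number, fp: number, fn: number): number {
    const precision = tp + fp === 0 ? 0 : tp / (tp + fp);
    const recall = tp + fn === 0 ? 0 : tp / (tp + fn);
    return precision + recall === 0 ? 0 : (2 * precision * recall) / (precision + recall);
}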
Gets all matrix classes, containing information about true positives, true negatives, false positives and false negatives, as well as the label associated with each.
An array of matrix classes containing information about true positives, true negatives, false positives and false negatives, as well as the label associated with it.
For one given label, returns the matrix classes (true positives, true negatives, false positives and false negatives).
The matrix classes (true positives, true negatives, false positives and false negatives).
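As a sketch of how these classes can be derived, assuming rows hold the true classes and columns the predicted classes (names are illustrative, not the library's API):

// Derives TP, TN, FP and FN for one label from an N x N confusion matrix.
function matrixClassesFor(matrix: number[][], labelIndex: number) {
    let tp = 0, tn = 0, fp = 0, fn = 0;
    for (let row = 0; row < matrix.length; row++) {
        for (let col = 0; col < matrix[row].length; col++) {
            const value = matrix[row][col];
            if (row === labelIndex && col === labelIndex) tp += value; // correctly predicted as label
            else if (col === labelIndex) fp += value; // predicted as label, true class differs
            else if (row === labelIndex) fn += value; // true class is label, predicted otherwise
            else tn += value; // neither true nor predicted label
        }
    }
    return { truePositive: tp, trueNegative: tn, falsePositive: fp, falseNegative: fn };
}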
Returns the sum of predictions for all labels.
Array with the sum of predictions for all labels. The position of the array corresponds to the label position on the confusion matrix.
Gets the confusion matrix min and max value.
The min and max confusion matrix values, or null if they do not exist.
Gets the total number of predictions (samples) for a given label or for the whole confusion matrix.
Number of predictions for the given label. If the label is null, undefined or empty, returns the number of predictions for the whole confusion matrix.
Gets the sum of all confusion matrix predictions, grouped by class.
The sum of all predictions, grouped by class.
Gives the accuracy value for a given matrix label.
Accuracy gives the fraction of total predictions which were correctly classified.
Formula:
accuracy = (TP + TN) / (TP + TN + FP + FN)
The label used to get the accuracy value.
Accuracy value for a given label.
Gives the F1 Score value for a given matrix label.
F1 Score is the harmonic mean of precision and recall.
Formula:
f1Score = 2 * (precision * recall) / (precision + recall)
The label used to get the F1 Score value.
F1 Score value for a given label.
Gives the miss classification rate for a given matrix label.
Misclassification rate, also known as classification error and 1-Accuracy, calculates the fraction of predictions that were incorrect.
Formula:
missClassification = (FP + FN) / (TP + TN + FP + FN)
The label used to get the miss classification rate value.
Miss classification rate for a given label.
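A minimal sketch of the rate, equivalent to 1 - accuracy (illustrative function, not the library's API):

// missClassification = (FP + FN) / (TP + TN + FP + FN)
function missClassificationRate(tp: number, tn: number, fp: number, fn: number): number {
    const total = tp + tn + fp + fn;
    return total === 0 ? 0 : (fp + fn) / total;
}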
Gives the precision value for a given matrix label.
Precision gives the fraction of predictions as a positive class that were actually positive.
Formula:
precision = (TP) / (TP + FP)
The label used to get the precision value.
Precision value for a given label.
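A minimal sketch of the precision formula (illustrative, not the library's API):

// precision = TP / (TP + FP)
function precision(tp: number, fp: number): number {
    return tp + fp === 0 ? 0 : tp / (tp + fp);
}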
Gives the recall value for a given matrix label.
Recall, also known as true positive rate, sensitivity, hit rate and probability of detection, gives the fraction of all positive samples that were correctly predicted as positive.
Formula:
recall = TP / (TP + FN)
The label used to get the recall value.
Recall value for a given label.
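A minimal sketch of the recall formula (illustrative, not the library's API):

// recall = TP / (TP + FN)
function recall(tp: number, fn: number): number {
    return tp + fn === 0 ? 0 : tp / (tp + fn);
}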
Gives the specificity value for a given matrix label.
Specificity, also known as selectivity or true negative rate, gives the fraction of all negative samples that were correctly classified as negative.
Formula:
specificity = TN / (TN + FP)
The label used to get the specificity value.
Specificity value for a given label.
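A minimal sketch of the specificity formula (illustrative, not the library's API):

// specificity = TN / (TN + FP)
function specificity(tn: number, fp: number): number {
    return tn + fp === 0 ? 0 : tn / (tn + fp);
}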
Gives the accuracy value for the whole confusion matrix, taking into account the macro average method.
Accuracy gives the fraction of total predictions which were correctly classified.
The macro average method calculates and sums the accuracy for each individual label and divides by the number of labels.
The macro accuracy value.
Gives the F1 Score value for the whole confusion matrix, taking into account the macro average method.
F1 Score is the harmonic mean of precision and recall.
The macro average method calculates and sums the F1 Score for each individual label and divides by the number of labels.
The macro F1 Score value.
Gives the miss classification value for the whole confusion matrix, taking into account the macro average method.
Misclassification rate, also known as classification error and 1-Accuracy, calculates the fraction of predictions that were incorrect.
The macro average method calculates and sums the miss classification rate for each individual label and divides by the number of labels.
The macro miss classification value.
Gives the precision value for the whole confusion matrix, taking into account the macro average method.
Precision gives the fraction of predictions as a positive class that were actually positive.
The macro average method calculates and sums the precision for each individual label and divides by the number of labels.
The macro precision value.
Gives the recall value for the whole confusion matrix, taking into account the macro average method.
Recall, also known as true positive rate, sensitivity, hit rate and probability of detection, gives the fraction of all positive samples that were correctly predicted as positive.
The macro average method calculates and sums the recall for each individual label and divides by the number of labels.
The macro recall value.
Gives the specificity value for the whole confusion matrix, taking into account the macro average method.
Specificity, also known as selectivity or true negative rate, gives the fraction of all negative samples that were correctly classified as negative.
The macro average method calculates and sums the specificity for each individual label and divides by the number of labels.
The macro specificity value.
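To make the macro method concrete, a sketch that averages a per-label metric uniformly over labels (illustrative, not the library's API):

// Macro average: the unweighted mean of a metric computed per label.
function macroAverage(perLabelValues: number[]): number {
    if (perLabelValues.length === 0) return 0;
    const sum = perLabelValues.reduce((acc, value) => acc + value, 0);
    return sum / perLabelValues.length;
}

// e.g. macroAverage([precisionApple, precisionOrange, precisionMango])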
Gives the accuracy value for the whole confusion matrix, taking into account a given average method of calculation.
Accuracy gives the fraction of total predictions which were correctly classified.
Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).
[[average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the accuracy formula.
[[average.Macro]]: Calculates and sums the accuracy for each individual label and divides by the number of labels.
[[average.Weighted]]: Defines if the accuracy calculations should be weighted. This means the labels with more predictions will weigh more in the final accuracy value compared with labels with fewer predictions.
The accuracy value.
Gives the F1 Score value for the whole confusion matrix, taking into account a given average method of calculation.
F1 Score is the harmonic mean of precision and recall.
Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).
[[average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the F1 Score formula.
[[average.Macro]]: Calculates and sums the F1 Score for each individual label and divides by the number of labels.
[[average.Weighted]]: Defines if the F1 Score calculations should be weighted. This means the labels with more predictions will weigh more in the final F1 Score value compared with labels with fewer predictions.
The F1 Score value.
Gives the miss classification rate value for the whole confusion matrix, taking into account a given average method of calculation.
Misclassification rate, also known as classification error and 1-Accuracy, calculates the fraction of predictions that were incorrect.
Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).
[[average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the miss classification formula.
[[average.Macro]]: Calculates and sums the miss classification rate for each individual label and divides by the number of labels.
[[average.Weighted]]: Defines if the miss classification calculations should be weighted. This means the labels with more predictions will weigh more in the final miss classification value compared with labels with fewer predictions.
The miss classification value.
Gives the precision value for the whole confusion matrix, taking into account a given average method of calculation.
Precision gives the fraction of predictions as a positive class that were actually positive.
Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).
[[average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the precision formula.
[[average.Macro]]: Calculates and sums the precision for each individual label and divides by the number of labels.
[[average.Weighted]]: Defines if the precision calculations should be weighted. This means the labels with more predictions will weigh more in the final precision value compared with labels with fewer predictions.
The precision value.
Gives the recall value for the whole confusion matrix, taking into account a given average method of calculation.
Recall, also known as true positive rate, sensitivity, hit rate and probability of detection, gives the fraction of all positive samples that were correctly predicted as positive.
Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).
[[average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the recall formula.
[[average.Macro]]: Calculates and sums the recall for each individual label and divides by the number of labels.
[[average.Weighted]]: Defines if the recall calculations should be weighted. This means the labels with more predictions will weigh more in the final recall value compared with labels with fewer predictions.
The recall value.
Gives the specificity value for the whole confusion matrix, taking into account a given average method of calculation.
Specificity, also known as selectivity or true negative rate, gives the fraction of all negative samples that were correctly classified as negative.
Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).
[[average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the specificity formula.
[[average.Macro]]: Calculates and sums the specificity for each individual label and divides by the number of labels.
[[average.Weighted]]: Defines if the specificity calculations should be weighted. This means the labels with more predictions will weigh more in the final specificity value compared with labels with fewer predictions.
The specificity value.
Gives the accuracy value for the whole confusion matrix, taking into account the micro average method.
Accuracy gives the fraction of total predictions which were correctly classified.
The micro average method calculates the TP, TN, FP and FN for the matrix globally and then applies the accuracy formula.
The micro accuracy value.
Gives the F1 Score value for the whole confusion matrix, taking into account the micro average method.
F1 Score is the harmonic mean of precision and recall.
The micro average method calculates the TP, TN, FP and FN for the matrix globally and then applies the F1 Score formula.
The micro F1 Score value.
Gives the miss classification value for the whole confusion matrix, taking into account the micro average method.
Misclassification rate, also known as classification error and 1-Accuracy, calculates the fraction of predictions that were incorrect.
The micro average method calculates the TP, TN, FP and FN for the matrix globally and then applies the miss classification formula.
The micro miss classification value.
Gives the precision value for the whole confusion matrix, taking into account the micro average method.
Precision gives the fraction of predictions as a positive class that were actually positive.
The micro average method calculates the TP, TN, FP and FN for the matrix globally and then applies the precision formula.
The micro precision value.
Gives the recall value for the whole confusion matrix, taking into account the micro average method.
Recall, also known as true positive rate, sensitivity, hit rate and probability of detection, gives the fraction of all positive samples that were correctly predicted as positive.
The micro average method calculates the TP, TN, FP and FN for the matrix globally and then applies the recall formula.
The micro recall value.
Gives the specificity value for the whole confusion matrix, taking into account the micro average method.
Specificity, also known as selectivity or true negative rate, gives the fraction of all negative samples that were correctly classified as negative.
The micro average method calculates the TP, TN, FP and FN for the matrix globally and then applies the specificity formula.
The micro specificity value.
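To make the micro method concrete, a sketch using precision (illustrative, not the library's API): the per-label counts are summed globally before the formula is applied once.

interface Counts { tp: number; tn: number; fp: number; fn: number; }

// Micro average: sum TP/TN/FP/FN over all labels, then apply the formula.
function microPrecision(perLabel: Counts[]): number {
    const tp = perLabel.reduce((acc, c) => acc + c.tp, 0);
    const fp = perLabel.reduce((acc, c) => acc + c.fp, 0);
    return tp + fp === 0 ? 0 : tp / (tp + fp);
}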
Misclassification rate, also known as classification error and 1-Accuracy, calculates the fraction of predictions that were incorrect.
Formula:
missClassification = (FP + FN) / (TP + TN + FP + FN)
Set of configurations used on miss classification rate calculations.
[[configuration.label]]: The label name used to calculate the miss classification rate. If undefined or null, the value will be calculated for the whole confusion matrix.
[[configuration.average]]: Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).
[[configuration.average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the miss classification formula.
[[configuration.average.Macro]]: Calculates and sums the miss classification rate for each individual label and divides by the number of labels.
[[configuration.average.Weighted]]: Defines if the miss classification calculations should be weighted. This means the labels with more predictions will weigh more in the final rate value compared with labels with fewer predictions.
The miss classification rate value.
Normalizes all values of the matrix between two given values.
All normalizations will be saved in history, and it is possible to revert the last normalization by calling the function @see ConfusionMatrix.revertNormalization.
Minimum value of the normalized range values [min, max].
Maximum value of the normalized range values [min, max].
Number of digits after the decimal point. Must be in the range 0 to 20, inclusive.
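A sketch of such a min-max normalization (illustrative; the function and parameter names are assumptions, not the library's signature):

// Scales every matrix value into [min, max], rounded to fractionDigits decimals.
function normalize(matrix: number[][], min: number, max: number,
                   fractionDigits: number = 2): number[][] {
    const values = matrix.flat();
    const lo = Math.min(...values);
    const hi = Math.max(...values);
    const range = hi - lo || 1; // avoid dividing by zero for constant matrices
    return matrix.map(row => row.map(value => {
        const scaled = min + ((value - lo) * (max - min)) / range;
        return Number(scaled.toFixed(fractionDigits));
    }));
}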
Precision gives the fraction of predictions as a positive class that were actually positive.
Formula:
precision = (TP) / (TP + FP)
Set of configurations used on precision calculations.
[[configuration.label]]: The label name used to calculate the precision value. If undefined or null, the value will be calculated for the whole confusion matrix.
[[configuration.average]]: Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).
[[configuration.average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the precision formula.
[[configuration.average.Macro]]: Calculates and sums the precision for each individual label and divides by the number of labels.
[[configuration.average.Weighted]]: Defines if the precision calculations should be weighted. This means the labels with more predictions will weigh more in the final value compared with labels with fewer predictions.
The precision value.
Recall, also known as true positive rate, sensitivity, hit rate and probability of detection, gives the fraction of all positive samples that were correctly predicted as positive.
Formula:
recall = TP / (TP + FN)
Set of configurations used on recall calculations.
[[configuration.label]]: The label name used to calculate the recall value. If undefined or null, the value will be calculated for the whole confusion matrix.
[[configuration.average]]: Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).
[[configuration.average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the recall formula.
[[configuration.average.Macro]]: Calculates and sums the recall for each individual label and divides by the number of labels.
[[configuration.average.Weighted]]: Defines if the recall calculations should be weighted. This means the labels with more predictions will weigh more in the final value compared with labels with fewer predictions.
The recall value.
Reverts all normalizations performed.
Reverts the last normalization performed on the current confusion matrix.
The confusion matrix object before the normalization. If there is no entry in the history, null will be returned.
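One way such a history can work, sketched with a simple stack (illustrative, not the library's internals):

// Saves a deep copy of the matrix before each normalization;
// popping the stack reverts the most recent one.
class NormalizationHistory {
    private history: number[][][] = [];

    save(matrix: number[][]): void {
        this.history.push(matrix.map(row => [...row]));
    }

    revertLast(): number[][] | null {
        return this.history.pop() ?? null;
    }
}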
Sets the confusion matrix value based on another confusion matrix.
The confusion matrix.
Specificity, also known as selectivity or true negative rate, gives the fraction of all negative samples that were correctly classified as negative.
Formula:
specificity = TN / (TN + FP)
Set of configurations used on specificity calculations.
[[configuration.label]]: The label name used to calculate the specificity value. If undefined or null, the value will be calculated for the whole confusion matrix.
[[configuration.average]]: Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).
[[configuration.average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the specificity formula.
[[configuration.average.Macro]]: Calculates and sums the specificity for each individual label and divides by the number of labels.
[[configuration.average.Weighted]]: Defines if the specificity calculations should be weighted. This means the labels with more predictions will weigh more in the final value compared with labels with fewer predictions.
The specificity value.
Swaps the rows and the columns, i.e., transposes the matrix.
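A sketch of the transpose operation (illustrative, not the library's API):

// Transposes the matrix: rows become columns and vice versa.
function transpose(matrix: number[][]): number[][] {
    if (matrix.length === 0) return [];
    return matrix[0].map((_, col) => matrix.map(row => row[col]));
}

// transpose([[7, 1], [2, 8]]) returns [[7, 2], [1, 8]]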
Validates the confusion matrix. If it is not valid, an error describing the issue will be thrown.
Gives the accuracy value for the whole confusion matrix, taking into account the weighted average method.
Accuracy gives the fraction of total predictions which were correctly classified.
The weighted average method gives the labels with more predictions more importance (weight) in the final accuracy value compared with labels with fewer predictions.
The weighted accuracy value.
Gives the F1 Score value for the whole confusion matrix, taking into account the weighted average method.
F1 Score is the harmonic mean of precision and recall.
The weighted average method gives the labels with more predictions more importance (weight) in the final F1 Score value compared with labels with fewer predictions.
The weighted F1 Score value.
Gives the miss classification value for the whole confusion matrix, taking into account the weighted average method.
Misclassification rate, also known as classification error and 1-Accuracy, calculates the fraction of predictions that were incorrect.
The weighted average method gives the labels with more predictions more importance (weight) in the final miss classification value compared with labels with fewer predictions.
The weighted miss classification value.
Gives the precision value for the whole confusion matrix, taking into account the weighted average method.
Precision gives the fraction of predictions as a positive class that were actually positive.
The weighted average method gives the labels with more predictions more importance (weight) in the final precision value compared with labels with fewer predictions.
The weighted precision value.
Gives the recall value for the whole confusion matrix, taking into account the weighted average method.
Recall, also known as true positive rate, sensitivity, hit rate and probability of detection, gives the fraction of all positive samples that were correctly predicted as positive.
The weighted average method gives the labels with more predictions more importance (weight) in the final recall value compared with labels with fewer predictions.
The weighted recall value.
Gives the specificity value for the whole confusion matrix, taking into account the weighted average method.
Specificity, also known as selectivity or true negative rate, gives the fraction of all negative samples that were correctly classified as negative.
The weighted average method gives the labels with more predictions more importance (weight) in the final specificity value compared with labels with fewer predictions.
The weighted specificity value.
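To make the weighted method concrete, a sketch where each label's metric is weighted by its share of the total predictions (illustrative, not the library's API):

// Weighted average: weight each per-label value by that label's support.
function weightedAverage(perLabelValues: number[], supports: number[]): number {
    const total = supports.reduce((acc, s) => acc + s, 0);
    if (total === 0) return 0;
    return perLabelValues.reduce(
        (acc, value, i) => acc + value * (supports[i] / total), 0);
}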
Confusion matrix model which summarizes the prediction results on a classification problem.
The number of correct/incorrect predictions are summarized with count values and grouped by each class.
The matrix rows represent the true classes and the columns the predicted classes.
Consult Wikipedia and Joydwip Mohajon (2020) for more information regarding terminology, formulas, and other theoretical concepts.