
Confusion matrix model that summarizes the prediction results of a classification problem.

The number of correct and incorrect predictions is summarized with count values and grouped by each class.

The matrix columns represent the true classes and the rows the predicted classes.

note

Consult Wikipedia and Joydwip Mohajon, 2020 for more information regarding terminology, formulas and other theoretical concepts.

Hierarchy

  • ConfusionMatrix

Index

Constructors

constructor

  • new ConfusionMatrix(confusionMatrix?: { labels: string[]; matrix: number[][] }): ConfusionMatrix
  • Creates a new instance of the confusion matrix.

    example
    
    const confusionMatrix = new ConfusionMatrix({
      labels: ["Happiness", "Sadness"],
      matrix: [
        [1, 2],
        [3, 4],
      ],
    });
    

    Parameters

    • Optional confusionMatrix: { labels: string[]; matrix: number[][] }
      • labels: string[]
      • matrix: number[][]

    Returns ConfusionMatrix

Properties

labels

labels: string[] = ...

Confusion matrix labels

matrix

matrix: number[][] = ...

Confusion matrix values.

The columns represent the true classes and the rows the predicted classes.

Private normalizations

normalizations: ConfusionMatrix[] = ...

Normalization history values.

Methods

accuracy

  • accuracy(configuration?: { average?: AverageMethod; label?: string }): number
  • Accuracy gives the fraction of total predictions which were correctly classified.

    Formula:

    accuracy = (TP + TN) / (TP + TN + FP + FN)

    Parameters

    • configuration: { average?: AverageMethod; label?: string } = ...

      Set of configurations used for accuracy calculations.

      [[configuration.label]]: The label name used to calculate the accuracy value. If undefined or null, the accuracy value will be calculated for the whole confusion matrix.

      [[configuration.average]]: Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).

      [[configuration.average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the accuracy formula.

      [[configuration.average.Macro]]: Calculates and sums the accuracy for each individual label and divides by the number of labels.

      [[configuration.average.Weighted]]: Defines whether the accuracy calculations should be weighted. This means labels with more predictions will weigh more in the final accuracy value compared with labels with fewer predictions.

    Returns number

    The accuracy value.
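The accuracy formula above can be sketched directly in TypeScript. This is a hypothetical helper written for illustration (the ClassCounts shape is an assumption of this example), not the library's implementation:

```typescript
// Illustrative only: applies accuracy = (TP + TN) / (TP + TN + FP + FN).
// The ClassCounts shape is an assumption of this example.
interface ClassCounts {
  truePositive: number;
  trueNegative: number;
  falsePositive: number;
  falseNegative: number;
}

function accuracyOf(c: ClassCounts): number {
  const total =
    c.truePositive + c.trueNegative + c.falsePositive + c.falseNegative;
  // Guard against an empty matrix to avoid dividing by zero.
  return total === 0 ? 0 : (c.truePositive + c.trueNegative) / total;
}
```

For example, TP = 1, TN = 4, FP = 2, FN = 3 gives (1 + 4) / 10 = 0.5.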

Private applyF1ScoreFormula

  • applyF1ScoreFormula(precision: number, recall: number): number
  • Applies the F1 Score formula.

    Parameters

    • precision: number

      The precision value.

    • recall: number

      The recall value.

    Returns number

clone

Private deepCopy

  • deepCopy(object: any): any
  • Deep copies a given object.

    Parameters

    • object: any

      The object to be deep cloned.

    Returns any

    The deep cloned object.

f1Score

  • f1Score(configuration?: { average?: AverageMethod; label?: string }): number
  • F1 Score is the harmonic mean of precision and recall.

    Formula:

    f1Score = 2 * (precision * recall) / (precision + recall)

    Parameters

    • Optional configuration: { average?: AverageMethod; label?: string }

      Set of configurations used for F1 Score calculations.

      [[configuration.label]]: The label name used to calculate the F1 Score value. If undefined or null, the value will be calculated for the whole confusion matrix.

      [[configuration.average]]: Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).

      [[configuration.average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the F1 Score formula.

      [[configuration.average.Macro]]: Calculates and sums the F1 Score for each individual label and divides by the number of labels.

      [[configuration.average.Weighted]]: Defines whether the F1 Score calculations should be weighted. This means labels with more predictions will weigh more in the final value compared with labels with fewer predictions.

    Returns number

    The F1 Score value.
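Since F1 Score is the harmonic mean of precision and recall, it can be sketched as below. This is an illustrative helper, not the library's implementation:

```typescript
// Illustrative only: f1Score = 2 * (precision * recall) / (precision + recall).
function f1Of(precision: number, recall: number): number {
  // When both terms are zero, the harmonic mean is defined here as 0.
  if (precision + recall === 0) return 0;
  return (2 * precision * recall) / (precision + recall);
}
```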

getAllMatrixClasses

  • Get all matrix classes, containing information about true positives, true negatives, false positives and false negatives, as well as the label associated with it.

    note

    Consult Wikipedia for more information regarding terminology, formulas and other theoretical concepts.

    Returns { confusionMatrixClasses: ConfusionMatrixClasses; label: string }[]

    An array of matrix classes containing information about true positives, true negatives, false positives and false negatives, as well as the label associated with it.

getConfusionMatrixClasses

  • For one given label, returns the matrix classes (true positives, true negatives, false positives and false negatives).

    note

    Consult Wikipedia for more information regarding terminology, formulas and other theoretical concepts.

    Parameters

    • label: string

    Returns ConfusionMatrixClasses

    The matrix classes (true positives, true negatives, false positives and false negatives).
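As a sketch of what such a computation can look like for a raw matrix, the helper below derives the four classes for one label index. It assumes columns hold the true classes and rows the predicted classes, and is illustrative only, not the library's implementation:

```typescript
// Illustrative only: derives TP, TN, FP and FN for one label index,
// assuming columns = true classes and rows = predicted classes.
function classesFor(matrix: number[][], index: number) {
  let truePositive = 0;
  let trueNegative = 0;
  let falsePositive = 0;
  let falseNegative = 0;
  for (let row = 0; row < matrix.length; row++) {
    for (let col = 0; col < matrix[row].length; col++) {
      const value = matrix[row][col];
      if (row === index && col === index) {
        truePositive += value;  // predicted as the label and actually the label
      } else if (row === index) {
        falsePositive += value; // predicted as the label, but true class differs
      } else if (col === index) {
        falseNegative += value; // actually the label, but predicted as another class
      } else {
        trueNegative += value;  // involves neither prediction nor truth of the label
      }
    }
  }
  return { truePositive, trueNegative, falsePositive, falseNegative };
}
```

For the matrix [[1, 2], [3, 4]] and label index 0, this yields TP = 1, FP = 2, FN = 3, TN = 4.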

getLabelsPredictionsSum

  • getLabelsPredictionsSum(): number[]
  • Returns the sum of predictions for all labels.

    Returns number[]

    Array with the sum of predictions for all labels. The position of the array corresponds to the label position on the confusion matrix.

getMinAndMax

  • getMinAndMax(): null | { max: number; min: number }
  • Gets the confusion matrix min and max value.

    Returns null | { max: number; min: number }

    The min and max confusion matrix values, or null if the matrix is empty.

getNumberOfPredictions

  • getNumberOfPredictions(label?: string): number
  • Gets the total of predictions (samples) for a given label or for the whole confusion matrix.

    Parameters

    • Optional label: string

      The label for which to count predictions. If null, undefined or empty, returns the number of predictions for the whole confusion matrix.

    Returns number

getSumConfusionMatrixClasses

labelAccuracy

  • labelAccuracy(label: string): number
  • Gives the accuracy value for a given matrix label.

    Accuracy gives the fraction of total predictions which were correctly classified.

    Formula:

    accuracy = (TP + TN) / (TP + TN + FP + FN)

    Parameters

    • label: string

      The label used to get the accuracy value.

    Returns number

    Accuracy value for a given label.

labelF1Score

  • labelF1Score(label: string): number
  • Gives the F1 Score value for a given matrix label.

    F1 Score is the harmonic mean of precision and recall.

    Formula:

    f1Score = 2 * (precision * recall) / (precision + recall)

    Parameters

    • label: string

      The label used to get the F1 Score value.

    Returns number

    F1 Score value for a given label.

labelMissClassificationRate

  • labelMissClassificationRate(label: string): number
  • Gives the miss classification rate for a given matrix label.

    Misclassification rate, also known as classification error and 1-Accuracy, calculates the fraction of predictions that were incorrect.

    Formula:

    missClassification = (FP + FN) / (TP + TN + FP + FN)

    Parameters

    • label: string

      The label used to get the miss classification rate value.

    Returns number

    Miss classification rate for a given label.

labelPrecision

  • labelPrecision(label: string): number
  • Gives the precision value for a given matrix label.

    Precision gives the fraction of predictions of a positive class that were actually positive.

    Formula:

    precision = (TP) / (TP + FP)

    Parameters

    • label: string

      The label used to get the precision value.

    Returns number

    Precision value for a given label.

labelRecall

  • labelRecall(label: string): number
  • Gives the recall value for a given matrix label.

    Recall, also known as true positive rate, sensitivity, hit rate and probability of detection, gives the fraction of all positive samples that were correctly predicted as positive.

    Formula:

    recall = TP / (TP + FN)

    Parameters

    • label: string

      The label used to get the recall value.

    Returns number

    Recall value for a given label.

labelSpecificity

  • labelSpecificity(label: string): number
  • Gives the specificity value for a given matrix label.

    Specificity, also known as selectivity or true negative rate, gives the fraction of all negative samples that were correctly classified as negative.

    Formula:

    specificity = TN / (TN + FP)

    Parameters

    • label: string

      The label used to get the specificity value.

    Returns number

    Specificity value for a given label.
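The per-label formulas above can be summarized with small illustrative helpers (assuming the standard definitions; not the library's implementation):

```typescript
// Illustrative only; each guard avoids division by zero.
function precisionOf(tp: number, fp: number): number {
  return tp + fp === 0 ? 0 : tp / (tp + fp); // precision = TP / (TP + FP)
}
function recallOf(tp: number, fn: number): number {
  return tp + fn === 0 ? 0 : tp / (tp + fn); // recall = TP / (TP + FN)
}
function specificityOf(tn: number, fp: number): number {
  return tn + fp === 0 ? 0 : tn / (tn + fp); // specificity = TN / (TN + FP)
}
```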

macroAccuracy

  • macroAccuracy(): number
  • Gives the accuracy value for the whole confusion matrix, taking into account the macro average method.

    Accuracy gives the fraction of total predictions that were correctly classified.

    The macro average method calculates and sums the accuracy for each individual label and divides by the number of labels.

    Returns number

    The macro accuracy value.
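The macro average step itself is a plain arithmetic mean over the per-label values, as in this illustrative sketch (not the library's implementation):

```typescript
// Illustrative only: sum the per-label values and divide by the label count.
function macroAverage(perLabelValues: number[]): number {
  if (perLabelValues.length === 0) return 0;
  const sum = perLabelValues.reduce((acc, v) => acc + v, 0);
  return sum / perLabelValues.length;
}
```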

macroF1Score

  • macroF1Score(): number
  • Gives the F1 Score value for the whole confusion matrix, taking into account the macro average method.

    F1 Score is the harmonic mean of precision and recall.

    The macro average method calculates and sums the F1 Score for each individual label and divides by the number of labels.

    Returns number

    The macro F1 Score value.

macroMissClassificationRate

  • macroMissClassificationRate(): number
  • Gives the miss classification value for the whole confusion matrix, taking into account the macro average method.

    Misclassification rate, also known as classification error and 1-Accuracy, calculates the fraction of predictions that were incorrect.

    The macro average method calculates and sums the miss classification rate for each individual label and divides by the number of labels.

    Returns number

    The macro miss classification value.

macroPrecision

  • macroPrecision(): number
  • Gives the precision value for the whole confusion matrix, taking into account the macro average method.

    Precision gives the fraction of predictions of a positive class that were actually positive.

    The macro average method calculates and sums the precision for each individual label and divides by the number of labels.

    Returns number

    The macro precision value.

macroRecall

  • macroRecall(): number
  • Gives the recall value for the whole confusion matrix, taking into account the macro average method.

    Recall, also known as true positive rate, sensitivity, hit rate and probability of detection, gives the fraction of all positive samples that were correctly predicted as positive.

    The macro average method calculates and sums the recall for each individual label and divides by the number of labels.

    Returns number

    The macro recall value.

macroSpecificity

  • macroSpecificity(): number
  • Gives the specificity value for the whole confusion matrix, taking into account the macro average method.

    Specificity, also known as selectivity or true negative rate, gives the fraction of all negative samples that were correctly classified as negative.

    The macro average method calculates and sums the specificity for each individual label and divides by the number of labels.

    Returns number

    The macro specificity value.

matrixAccuracy

  • Gives the accuracy value for the whole confusion matrix, taking into account a given average method of calculation.

    Accuracy gives the fraction of total predictions which were correctly classified.

    Parameters

    • average: AverageMethod = ...

      Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).

      [[average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the accuracy formula.

      [[average.Macro]]: Calculates and sums the accuracy for each individual label and divides by the number of labels.

      [[average.Weighted]]: Defines whether the accuracy calculations should be weighted. This means labels with more predictions will weigh more in the final accuracy value compared with labels with fewer predictions.

    Returns number

    The accuracy value.

matrixF1Score

  • Gives the F1 Score value for the whole confusion matrix, taking into account a given average method of calculation.

    F1 Score is the harmonic mean of precision and recall.

    Parameters

    • average: AverageMethod = ...

      Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).

      [[average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the F1 Score formula.

      [[average.Macro]]: Calculates and sums the F1 Score for each individual label and divides by the number of labels.

      [[average.Weighted]]: Defines whether the F1 Score calculations should be weighted. This means labels with more predictions will weigh more in the final F1 Score value compared with labels with fewer predictions.

    Returns number

    The F1 Score value.

matrixMissClassificationRate

  • Gives the miss classification rate value for the whole confusion matrix, taking into account a given average method of calculation.

    Misclassification rate, also known as classification error and 1-Accuracy, calculates the fraction of predictions that were incorrect.

    Parameters

    • average: AverageMethod = ...

      Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).

      [[average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the miss classification formula.

      [[average.Macro]]: Calculates and sums the miss classification rate for each individual label and divides by the number of labels.

      [[average.Weighted]]: Defines whether the miss classification calculations should be weighted. This means labels with more predictions will weigh more in the final miss classification value compared with labels with fewer predictions.

    Returns number

    The miss classification value.

matrixPrecision

  • Gives the precision value for the whole confusion matrix, taking into account a given average method of calculation.

    Precision gives the fraction of predictions of a positive class that were actually positive.

    Parameters

    • average: AverageMethod = ...

      Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).

      [[average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the precision formula.

      [[average.Macro]]: Calculates and sums the precision for each individual label and divides by the number of labels.

      [[average.Weighted]]: Defines whether the precision calculations should be weighted. This means labels with more predictions will weigh more in the final precision value compared with labels with fewer predictions.

    Returns number

    The precision value.

matrixRecall

  • Gives the recall value for the whole confusion matrix, taking into account a given average method of calculation.

    Recall, also known as true positive rate, sensitivity, hit rate and probability of detection, gives the fraction of all positive samples that were correctly predicted as positive.

    Parameters

    • average: AverageMethod = ...

      Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).

      [[average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the recall formula.

      [[average.Macro]]: Calculates and sums the recall for each individual label and divides by the number of labels.

      [[average.Weighted]]: Defines whether the recall calculations should be weighted. This means labels with more predictions will weigh more in the final recall value compared with labels with fewer predictions.

    Returns number

    The recall value.

matrixSpecificity

  • Gives the specificity value for the whole confusion matrix, taking into account a given average method of calculation.

    Specificity, also known as selectivity or true negative rate, gives the fraction of all negative samples that were correctly classified as negative.

    Parameters

    • average: AverageMethod = ...

      Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).

      [[average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the specificity formula.

      [[average.Macro]]: Calculates and sums the specificity for each individual label and divides by the number of labels.

      [[average.Weighted]]: Defines whether the specificity calculations should be weighted. This means labels with more predictions will weigh more in the final specificity value compared with labels with fewer predictions.

    Returns number

    The specificity value.

microAccuracy

  • microAccuracy(): number
  • Gives the accuracy value for the whole confusion matrix, taking into account the micro average method.

    Accuracy gives the fraction of total predictions that were correctly classified.

    The micro average method calculates the TP, TN, FP and FN for the matrix globally and then applies the accuracy formula.

    Returns number

    The micro accuracy value.
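Micro averaging can be sketched as follows: sum the TP, TN, FP and FN over all labels first, then apply the formula once. This is an illustrative helper with an assumed Counts shape, not the library's implementation:

```typescript
// Illustrative only: sum class counts globally, then apply the accuracy formula.
interface Counts { tp: number; tn: number; fp: number; fn: number; }

function microAccuracy(perLabel: Counts[]): number {
  const total = perLabel.reduce(
    (acc, c) => ({ tp: acc.tp + c.tp, tn: acc.tn + c.tn, fp: acc.fp + c.fp, fn: acc.fn + c.fn }),
    { tp: 0, tn: 0, fp: 0, fn: 0 }
  );
  const denominator = total.tp + total.tn + total.fp + total.fn;
  return denominator === 0 ? 0 : (total.tp + total.tn) / denominator;
}
```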

microF1Score

  • microF1Score(): number
  • Gives the F1 Score value for the whole confusion matrix, taking into account the micro average method.

    F1 Score is the harmonic mean of precision and recall.

    The micro average method calculates the TP, TN, FP and FN for the matrix globally and then applies the F1 Score formula.

    Returns number

    The micro F1 Score value.

microMissClassificationRate

  • microMissClassificationRate(): number
  • Gives the miss classification value for the whole confusion matrix, taking into account the micro average method.

    Misclassification rate, also known as classification error and 1-Accuracy, calculates the fraction of predictions that were incorrect.

    The micro average method calculates the TP, TN, FP and FN for the matrix globally and then applies the miss classification formula.

    Returns number

    The micro miss classification value.

microPrecision

  • microPrecision(): number
  • Gives the precision value for the whole confusion matrix, taking into account the micro average method.

    Precision gives the fraction of predictions of a positive class that were actually positive.

    The micro average method calculates the TP, TN, FP and FN for the matrix globally and then applies the precision formula.

    Returns number

    The micro precision value.

microRecall

  • microRecall(): number
  • Gives the recall value for the whole confusion matrix, taking into account the micro average method.

    Recall, also known as true positive rate, sensitivity, hit rate and probability of detection, gives the fraction of all positive samples that were correctly predicted as positive.

    The micro average method calculates the TP, TN, FP and FN for the matrix globally and then applies the recall formula.

    Returns number

    The micro recall value.

microSpecificity

  • microSpecificity(): number
  • Gives the specificity value for the whole confusion matrix, taking into account the micro average method.

    Specificity, also known as selectivity or true negative rate, gives the fraction of all negative samples that were correctly classified as negative.

    The micro average method calculates the TP, TN, FP and FN for the matrix globally and then applies the specificity formula.

    Returns number

    The micro specificity value.

missClassificationRate

  • missClassificationRate(configuration?: { average?: AverageMethod; label?: string }): number
  • Misclassification rate, also known as classification error and 1-Accuracy, calculates the fraction of predictions that were incorrect.

    Formula:

    missClassificationRate = (FP + FN) / (TP + TN + FP + FN)

    Parameters

    • configuration: { average?: AverageMethod; label?: string } = ...

      Set of configurations used for miss classification rate calculations.

      [[configuration.label]]: The label name used to calculate the miss classification rate. If undefined or null, the value will be calculated for the whole confusion matrix.

      [[configuration.average]]: Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).

      [[configuration.average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the miss classification formula.

      [[configuration.average.Macro]]: Calculates and sums the miss classification rate for each individual label and divides by the number of labels.

      [[configuration.average.Weighted]]: Defines whether the miss classification calculations should be weighted. This means labels with more predictions will weigh more in the final rate value compared with labels with fewer predictions.

    Returns number

    The miss classification rate value.

normalize

  • normalize(min?: number, max?: number, fractionDigits?: FractionDigits): void
  • Normalizes all values of the matrix between two given values.

    All normalizations are saved in history, and the last normalization can be reverted by calling @see ConfusionMatrix.revertNormalization.

    note

    Can be especially useful if you want to convert the values to percentages or to the range [0, 1].

    Parameters

    • min: number = 0

      Minimum value of the normalized range values [min, max].

    • max: number = 1

      Maximum value of the normalized range values [min, max].

    • Optional fractionDigits: FractionDigits

      Number of digits after the decimal point. Must be in the range 0 - 20, inclusive.

    Returns void
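A linear min-max rescaling of the kind described above can be sketched like this. It is illustrative only (the library's normalize returns void and records the previous state in history, while this sketch returns a new matrix and omits the fractionDigits rounding):

```typescript
// Illustrative only: rescale every matrix value linearly into [min, max].
function normalizeMatrix(matrix: number[][], min = 0, max = 1): number[][] {
  const flat = ([] as number[]).concat(...matrix);
  const lowest = Math.min(...flat);
  const highest = Math.max(...flat);
  const span = highest - lowest;
  return matrix.map(row =>
    row.map(value =>
      // If all values are equal, map everything to min to avoid 0 / 0.
      span === 0 ? min : min + ((value - lowest) / span) * (max - min)
    )
  );
}
```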

precision

  • precision(configuration?: { average?: AverageMethod; label?: string }): number
  • Precision gives the fraction of predictions of a positive class that were actually positive.

    Formula:

    precision = (TP) / (TP + FP)

    Parameters

    • configuration: { average?: AverageMethod; label?: string } = ...

      Set of configurations used for precision calculations.

      [[configuration.label]]: The label name used to calculate the precision value. If undefined or null, the value will be calculated for the whole confusion matrix.

      [[configuration.average]]: Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).

      [[configuration.average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the precision formula.

      [[configuration.average.Macro]]: Calculates and sums the precision for each individual label and divides by the number of labels.

      [[configuration.average.Weighted]]: Defines whether the precision calculations should be weighted. This means labels with more predictions will weigh more in the final value compared with labels with fewer predictions.

    Returns number

    The precision value.

recall

  • recall(configuration?: { average?: AverageMethod; label?: string }): number
  • Recall, also known as true positive rate, sensitivity, hit rate and probability of detection, gives the fraction of all positive samples that were correctly predicted as positive.

    Formula:

    recall = TP / (TP + FN)

    Parameters

    • configuration: { average?: AverageMethod; label?: string } = ...

      Set of configurations used on recall calculations.

      [[configuration.label]]: The label name used to calculate the recall value. If undefined or null, the value will be calculated for the whole confusion matrix.

      [[configuration.average]]: Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).

      [[configuration.average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the recall formula.

      [[configuration.average.Macro]]: Calculates and sums the recall for each individual label and divides by the number of labels.

      [[configuration.average.Weighted]]: Defines whether the recall calculations should be weighted. This means labels with more predictions will weigh more in the final value compared with labels with fewer predictions.

    Returns number

    The recall value.

revertAllNormalizations

  • revertAllNormalizations(): void

revertNormalization

  • Reverts the last normalization applied to the confusion matrix.

    Returns null | ConfusionMatrix

    The confusion matrix object before the normalization. If there is no entry in the history, null will be returned.

setConfusionMatrix

specificity

  • specificity(configuration?: { average?: AverageMethod; label?: string }): number
  • Specificity, also known as selectivity or true negative rate, gives the fraction of all negative samples that were correctly classified as negative.

    Formula:

    specificity = TN / (TN + FP)

    Parameters

    • configuration: { average?: AverageMethod; label?: string } = ...

      Set of configurations used for specificity calculations.

      [[configuration.label]]: The label name used to calculate the specificity value. If undefined or null, the value will be calculated for the whole confusion matrix.

      [[configuration.average]]: Defines which type of average should be used. This average will only be taken into account on matrix calculations (when label is null or undefined).

      [[configuration.average.Micro]]: Calculates the TP, TN, FP and FN for the matrix globally and then applies the specificity formula.

      [[configuration.average.Macro]]: Calculates and sums the specificity for each individual label and divides by the number of labels.

      [[configuration.average.Weighted]]: Defines whether the specificity calculations should be weighted. This means labels with more predictions will weigh more in the final value compared with labels with fewer predictions.

    Returns number

    The specificity value.

transpose

  • transpose(): void

validate

  • validate(): void

weightedAccuracy

  • weightedAccuracy(): number
  • Gives the accuracy value for the whole confusion matrix, taking into account the weighted average method.

    Accuracy gives the fraction of total predictions that were correctly classified.

    The weighted average method gives labels with more predictions more importance (weight) in the final accuracy value compared with labels with fewer predictions.

    Returns number

    The weighted accuracy value.
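The weighted average step can be sketched as follows, with each label's metric weighted by its share of total predictions (illustrative only, not the library's implementation):

```typescript
// Illustrative only: weight each per-label value by that label's share
// of the total number of predictions.
function weightedAverage(values: number[], predictionCounts: number[]): number {
  const totalPredictions = predictionCounts.reduce((a, b) => a + b, 0);
  if (totalPredictions === 0) return 0;
  return values.reduce(
    (acc, value, i) => acc + value * (predictionCounts[i] / totalPredictions),
    0
  );
}
```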

weightedF1Score

  • weightedF1Score(): number
  • Gives the F1 Score value for the whole confusion matrix, taking into account the weighted average method.

    F1 Score is the harmonic mean of precision and recall.

    The weighted average method gives labels with more predictions more importance (weight) in the final F1 Score value compared with labels with fewer predictions.

    Returns number

    The weighted F1 Score value.

weightedMissClassificationRate

  • weightedMissClassificationRate(): number
  • Gives the miss classification value for the whole confusion matrix, taking into account the weighted average method.

    Misclassification rate, also known as classification error and 1-Accuracy, calculates the fraction of predictions that were incorrect.

    The weighted average method gives labels with more predictions more importance (weight) in the final miss classification value compared with labels with fewer predictions.

    Returns number

    The weighted miss classification value.

weightedPrecision

  • weightedPrecision(): number
  • Gives the precision value for the whole confusion matrix, taking into account the weighted average method.

    Precision gives the fraction of predictions of a positive class that were actually positive.

    The weighted average method gives labels with more predictions more importance (weight) in the final precision value compared with labels with fewer predictions.

    Returns number

    The weighted precision value.

weightedRecall

  • weightedRecall(): number
  • Gives the recall value for the whole confusion matrix, taking into account the weighted average method.

    Recall, also known as true positive rate, sensitivity, hit rate and probability of detection, gives the fraction of all positive samples that were correctly predicted as positive.

    The weighted average method gives labels with more predictions more importance (weight) in the final recall value compared with labels with fewer predictions.

    Returns number

    The weighted recall value.

weightedSpecificity

  • weightedSpecificity(): number
  • Gives the specificity value for the whole confusion matrix, taking into account the weighted average method.

    Specificity, also known as selectivity or true negative rate, gives the fraction of all negative samples that were correctly classified as negative.

    The weighted average method gives labels with more predictions more importance (weight) in the final specificity value compared with labels with fewer predictions.

    Returns number

    The weighted specificity value.

Generated using TypeDoc