For imbalanced class problems. Higher precision leads to fewer false [...].
Answer
positives
Parent (intermediate) annotation
Precision: For imbalanced class problems. Higher precision leads to fewer false positives.
Original toplevel document
TfC_02_classification-PART_2 Classification evaluation methods
Accuracy: tf.keras.metrics.Accuracy(), sklearn.metrics.accuracy_score(). Not the best for imbalanced classes.
Precision: For imbalanced class problems. Higher precision leads to fewer false positives.
Recall: Higher recall leads to fewer false negatives. Tradeoff between recall and precision.
F1-score: Combination of precision and recall, usually a good overall metric for classification.
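The metrics above can be sketched with scikit-learn on a small, made-up imbalanced label set (the toy values below are illustrative only):

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Hypothetical imbalanced labels: 8 negatives, 2 positives
y_true = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1]
y_pred = [0, 0, 0, 0, 0, 0, 1, 0, 1, 0]  # one false positive, one false negative

precision = precision_score(y_true, y_pred)  # TP / (TP + FP) = 1 / 2 = 0.5
recall = recall_score(y_true, y_pred)        # TP / (TP + FN) = 1 / 2 = 0.5
f1 = f1_score(y_true, y_pred)                # harmonic mean of precision and recall = 0.5
print(precision, recall, f1)

# Why accuracy misleads on imbalanced classes: a model that predicts
# "negative" for everything still scores 80% accuracy but 0% recall.
y_all_negative = [0] * 10
print(accuracy_score(y_true, y_all_negative))  # 0.8
print(recall_score(y_true, y_all_negative))    # 0.0
```

Higher precision means fewer false positives; higher recall means fewer false negatives; F1 balances the two, which is why it is often a better single summary than accuracy for imbalanced data.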