Now you'll see why 99% accuracy can be completely worthless. This is the most important lesson in metrics.
🚨 Critical: Try "Imbalanced (99:1)" mode
You MUST switch to imbalanced data. The demonstration won't make sense until you see a model achieve 99% accuracy while being completely useless.
Confusion Matrix
What Optimizing for ACCURACY Means:
You care about overall correctness. That works well on balanced data but fails catastrophically on imbalanced data (like fraud detection).
Understanding the Confusion Matrix
✅ True Positive (TP)
Said "yes" and was correct. The good hits.
🚨 False Positive (FP)
Said "yes" but was wrong. False alarms.
😰 False Negative (FN)
Said "no" but was wrong. Missed cases.
✅ True Negative (TN)
Said "no" and was correct. Correctly ignored.
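The four quadrants above are just counts over (label, prediction) pairs. A minimal sketch, using made-up labels and predictions to show how each quadrant is tallied:

```python
# Hypothetical labels and predictions (1 = "yes", 0 = "no") for illustration.
y_true = [1, 0, 1, 0, 0, 1, 0, 0]
y_pred = [1, 0, 0, 1, 0, 1, 0, 0]

pairs = list(zip(y_true, y_pred))
tp = sum(1 for t, p in pairs if t == 1 and p == 1)  # said "yes", was right
fp = sum(1 for t, p in pairs if t == 0 and p == 1)  # said "yes", was wrong (false alarm)
fn = sum(1 for t, p in pairs if t == 1 and p == 0)  # said "no", was wrong (missed case)
tn = sum(1 for t, p in pairs if t == 0 and p == 0)  # said "no", was right

print(tp, fp, fn, tn)  # → 2 1 1 4
```

Every prediction lands in exactly one quadrant, so TP + FP + FN + TN always equals the number of examples.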
💡 Why Accuracy Lies
On imbalanced data, a model can predict "always no" and get 99% accuracy — while catching zero of the cases you actually care about. That's why precision and recall exist: they measure what accuracy hides.
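You can verify this in a few lines. A sketch assuming a 99:1 split of 1,000 examples (10 positives, 990 negatives) and an "always no" model:

```python
# Assumed 99:1 imbalance: 10 positives, 990 negatives.
y_true = [1] * 10 + [0] * 990
y_pred = [0] * 1000  # the "always no" model never predicts a positive

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)

accuracy = correct / len(y_true)  # 990 correct "no"s out of 1000
recall = tp / (tp + fn)           # fraction of actual positives caught

print(f"accuracy={accuracy:.0%}, recall={recall:.0%}")  # → accuracy=99%, recall=0%
```

Accuracy rewards the model for the 990 easy negatives; recall exposes that it caught 0 of the 10 cases that matter.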