AI Glossary

Precision and Recall

Two complementary classification metrics. Precision measures correctness of positive predictions; recall measures completeness of detecting actual positives.

Definitions

Precision: Of all items the model predicted as positive, what fraction actually were positive? Precision = TP / (TP + FP).

Recall: Of all actual positive items, what fraction did the model correctly identify? Recall = TP / (TP + FN).
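These definitions can be sketched directly in code. A minimal example (the function name and the toy labels below are illustrative, not from the source):

```python
def precision_recall(y_true, y_pred):
    """Compute precision and recall for binary labels (1 = positive)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

# 4 actual positives; the model predicts 3 positives, 2 of them correct:
# TP = 2, FP = 1, FN = 2, so precision = 2/3 and recall = 2/4 = 0.5.
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 0, 0, 1, 0, 0, 0]
print(precision_recall(y_true, y_pred))
```

Note the guards for the degenerate cases: precision is undefined when the model predicts no positives, and recall is undefined when there are no actual positives; returning 0.0 is one common convention.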

The Tradeoff

Increasing recall (catching more positives) typically decreases precision (more false alarms), and vice versa. The right balance depends on the application: spam filtering favors precision (don't lose real emails), cancer screening favors recall (don't miss any cases).
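The tradeoff is easiest to see by sweeping the decision threshold on a scored classifier. A toy sketch (the scores and labels are invented for illustration): lowering the threshold catches more true positives (recall up) but also admits more false alarms (precision down).

```python
# Toy model outputs: higher score = more confident positive prediction.
scores = [0.95, 0.85, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
labels = [1,    1,    0,   1,   1,   0,   0,   0]

def metrics_at(threshold):
    """Precision and recall when predicting positive for score >= threshold."""
    preds = [1 if s >= threshold else 0 for s in scores]
    tp = sum(p and t for p, t in zip(preds, labels))
    fp = sum(p and not t for p, t in zip(preds, labels))
    fn = sum((not p) and t for p, t in zip(preds, labels))
    return tp / (tp + fp), tp / (tp + fn)

for thr in (0.5, 0.25):
    p, r = metrics_at(thr)
    print(f"threshold={thr}: precision={p:.2f}, recall={r:.2f}")
```

At threshold 0.5 this toy model scores precision 0.75 / recall 0.75; dropping the threshold to 0.25 raises recall to 1.0 while precision falls to 0.67.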

F1 Score

The harmonic mean of precision and recall: F1 = 2 * (P * R) / (P + R). It provides a single number that balances both metrics. Useful when you need a single metric but care about both precision and recall.
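The formula translates directly; a small sketch (the function name is illustrative) that also shows why the harmonic mean is preferred over the arithmetic mean here: it punishes imbalance, so a model with precision 1.0 but recall 0.1 scores poorly rather than averaging out to 0.55.

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall; 0 when both are 0."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

print(f1_score(0.75, 0.75))  # equal P and R: F1 matches them, 0.75
print(f1_score(1.0, 0.1))    # imbalanced: F1 is pulled toward the weaker metric
```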


Last updated: March 5, 2026