AI Glossary

Bias-Variance Tradeoff

The fundamental tension in machine learning between a model's ability to capture the true patterns in the training data (low bias) and its ability to produce stable predictions that generalize to new data (low variance).

Understanding the Tradeoff

Bias is the error from overly simplistic assumptions. High-bias models underfit -- they miss real patterns in the data. Variance is the error from sensitivity to fluctuations in the training set. High-variance models overfit -- they memorize noise instead of signal.
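Both failure modes can be seen in a small experiment. The sketch below (a toy illustration, assuming noisy samples from a sine curve and polynomial fits of varying degree) shows that a degree-1 fit underfits with high error everywhere, while a degree-15 fit drives training error near zero but does much worse on held-out data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: noisy samples from a smooth underlying function.
x_train = np.sort(rng.uniform(0, 1, 20))
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, 20)
x_test = np.sort(rng.uniform(0, 1, 200))
y_test = np.sin(2 * np.pi * x_test) + rng.normal(0, 0.2, 200)

def fit_errors(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

for degree in (1, 4, 15):  # underfit, reasonable, overfit
    train_err, test_err = fit_errors(degree)
    print(f"degree {degree:2d}: train={train_err:.3f}  test={test_err:.3f}")
```

Training error always falls as the degree grows; the gap between training and test error is the signature of overfitting.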

The Sweet Spot

The goal is to find the model complexity that minimizes total expected error (bias squared + variance + irreducible noise). Too simple and you underfit; too complex and you overfit. Techniques like cross-validation, regularization, and ensemble methods help navigate this tradeoff.
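Cross-validation operationalizes this search: score each complexity level by its error on held-out folds and pick the one that generalizes best. A minimal numpy-only sketch, assuming the same toy polynomial setup (the data, degree range, and fold count are illustrative choices, not prescribed values):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: noisy samples from a sine curve.
x = np.sort(rng.uniform(0, 1, 40))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 40)

def cv_mse(degree, k=5):
    """Mean held-out MSE of a degree-`degree` polynomial over k folds."""
    folds = np.array_split(rng.permutation(len(x)), k)
    errs = []
    for fold in folds:
        train = np.ones(len(x), dtype=bool)
        train[fold] = False  # hold this fold out
        coeffs = np.polyfit(x[train], y[train], degree)
        pred = np.polyval(coeffs, x[fold])
        errs.append(np.mean((pred - y[fold]) ** 2))
    return float(np.mean(errs))

# Score a range of complexities and keep the best.
scores = {d: cv_mse(d) for d in range(1, 10)}
best = min(scores, key=scores.get)
print("cross-validated best degree:", best)
```

The held-out score penalizes both underfitting (high bias inflates error on every fold) and overfitting (high variance inflates error on the unseen fold), so its minimum approximates the sweet spot.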

Modern Perspective

Deep learning has challenged the classical tradeoff. Very large models can achieve both low bias and low variance through 'double descent' -- a phenomenon where increasing model size past the interpolation threshold (the point where the model can fit the training data exactly) can improve generalization rather than degrade it.

Last updated: March 5, 2026