AI Glossary

Underfitting

When a model is too simple to capture the underlying patterns in the data, resulting in poor performance on both training and test sets.

Signs of Underfitting

High training error AND high test error. The model fails to fit even the training data well, and the learning curve shows training loss plateauing at a high value. (Contrast with overfitting, where training error is low but test error is high.)
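A minimal sketch of this symptom, using a hypothetical toy dataset: a straight line fit to quadratic data cannot capture the curvature, so mean squared error is large on both the training and test sets. The data values and helper names here are illustrative, not from any particular library.

```python
# Toy demonstration of underfitting: a linear model fit to quadratic
# data has high error on BOTH the training and test sets.

def linear_fit(xs, ys):
    """Closed-form least squares for y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    a = num / den
    b = my - a * mx
    return a, b

def mse(xs, ys, a, b):
    """Mean squared error of the line y = a*x + b on (xs, ys)."""
    return sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Ground truth is quadratic: y = x**2 (noise-free to keep the point clear).
train_x = [-3, -2, -1, 0, 1, 2, 3]
train_y = [x ** 2 for x in train_x]
test_x = [-2.5, -1.5, 0.5, 1.5, 2.5]
test_y = [x ** 2 for x in test_x]

a, b = linear_fit(train_x, train_y)
train_err = mse(train_x, train_y, a, b)
test_err = mse(test_x, test_y, a, b)

# Both errors stay large: a straight line cannot bend to follow x**2.
print(f"train MSE = {train_err:.2f}, test MSE = {test_err:.2f}")
# → train MSE = 12.00, test MSE = 6.06
```

On this symmetric dataset the best line is flat (slope 0, intercept 4), so the model is effectively predicting the mean everywhere, which is exactly the "too simple to capture the pattern" failure mode described above.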

Causes

Model too small (not enough parameters or layers), insufficient training time (too few epochs), excessive regularization, poor feature selection, or an inappropriate model architecture for the task.

Solutions

Increase model complexity, train longer, reduce regularization, engineer better features, or switch to a more expressive architecture. Underfitting is generally easier to fix than overfitting.
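One of the fixes above, engineering a better feature, can be sketched on the same kind of toy quadratic data: replace the raw input x with the feature x**2, and the identical least-squares machinery now fits perfectly. The dataset and helper names are illustrative assumptions.

```python
# Sketch: fixing underfitting with a more expressive feature.
# Fit y = c * x**2 (least squares through the origin on the x**2 feature)
# instead of a straight line in x.

def fit_one_feature(feats, ys):
    """Least squares through the origin: y = c * feature."""
    num = sum(f * y for f, y in zip(feats, ys))
    den = sum(f * f for f in feats)
    return num / den

train_x = [-3, -2, -1, 0, 1, 2, 3]
train_y = [x ** 2 for x in train_x]      # true relationship is quadratic
test_x = [-2.5, -1.5, 0.5, 1.5, 2.5]
test_y = [x ** 2 for x in test_x]

# Engineered feature: x**2 rather than x.
c = fit_one_feature([x ** 2 for x in train_x], train_y)

def mse(xs, ys):
    """Mean squared error of the model y = c * x**2 on (xs, ys)."""
    return sum((y - c * x ** 2) ** 2 for x, y in zip(xs, ys)) / len(xs)

train_err = mse(train_x, train_y)
test_err = mse(test_x, test_y)
print(f"c = {c:.2f}, train MSE = {train_err:.4f}, test MSE = {test_err:.4f}")
# → c = 1.00, train MSE = 0.0000, test MSE = 0.0000
```

With the right feature, both training and test error drop to zero here (the toy data is noise-free), illustrating why underfitting is generally the easier problem to diagnose and fix: adding capacity or better features directly attacks the cause.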

Last updated: March 5, 2026