AI Glossary

Training Loss

A numerical measure of how wrong the model's predictions are on training data, which the optimization algorithm works to minimize during training.

Common Loss Functions

- Cross-entropy: classification and language modeling.
- MSE (mean squared error): regression.
- Contrastive: embedding learning.
- Focal loss: imbalanced classification.
- Huber: robust regression.
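The two most common losses above can be sketched in a few lines of plain Python; the function names and toy inputs here are illustrative, not from any particular library:

```python
import math

def cross_entropy(probs, target_index):
    """Negative log-probability assigned to the correct class.

    `probs` is a predicted probability distribution over classes;
    the loss is low when the model is confident and correct.
    """
    return -math.log(probs[target_index])

def mse(preds, targets):
    """Mean squared error between predictions and regression targets."""
    return sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds)

# A confident, correct classification gives a small cross-entropy:
print(cross_entropy([0.05, 0.90, 0.05], target_index=1))  # ~0.105

# MSE averages the squared prediction errors:
print(mse([2.5, 0.0], [3.0, -0.5]))  # 0.25
```

In practice these come from a framework (e.g. `torch.nn.CrossEntropyLoss`, `torch.nn.MSELoss`), which also handles batching and numerical stability.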

Monitoring

Training loss should decrease over time. If it plateaus, the learning rate may be too low or the model may need architectural changes. If training loss decreases but validation loss increases, the model is overfitting.
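The overfitting check above can be expressed as a simple heuristic over per-epoch loss histories. This is a minimal sketch, assuming you record one training and one validation loss per epoch; the function name and window size are illustrative:

```python
def is_overfitting(train_losses, val_losses, window=3):
    """Heuristic overfitting check: training loss still falling while
    validation loss has risen over the last `window` epochs."""
    if len(train_losses) < window + 1 or len(val_losses) < window + 1:
        return False  # not enough history to judge a trend
    train_falling = train_losses[-1] < train_losses[-window - 1]
    val_rising = val_losses[-1] > val_losses[-window - 1]
    return train_falling and val_rising

# Training loss keeps dropping, but validation loss turns upward at epoch 4:
train = [2.0, 1.5, 1.2, 1.0, 0.8, 0.6]
val   = [2.1, 1.6, 1.4, 1.5, 1.7, 1.9]
print(is_overfitting(train, val))  # True
```

Real training loops usually act on this signal with early stopping (keep the checkpoint with the lowest validation loss) rather than just flagging it.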


Last updated: March 5, 2026