AI Glossary

Model Monitoring

The continuous tracking of deployed ML model performance to detect degradation and data drift.

Overview

Model monitoring is the practice of continuously observing deployed machine learning models to ensure they perform as expected in production. It tracks metrics like prediction accuracy, latency, throughput, and resource utilization, and detects issues like data drift, concept drift, and feature distribution changes.
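One common way to quantify data drift is the Population Stability Index (PSI), which compares a feature's distribution in production against a reference (training-time) distribution. The sketch below is a minimal, self-contained illustration, not a production implementation; the function name and binning strategy are illustrative choices.

```python
import math

def population_stability_index(reference, current, bins=10):
    """Compare two samples of a numeric feature with PSI.

    Bins are derived from the reference sample; out-of-range
    production values are clamped into the edge bins.
    """
    lo, hi = min(reference), max(reference)

    def proportions(values):
        counts = [0] * bins
        for v in values:
            if hi > lo:
                idx = int((v - lo) / (hi - lo) * bins)
            else:
                idx = 0
            counts[min(max(idx, 0), bins - 1)] += 1
        total = len(values)
        # Floor proportions to avoid log(0) on empty bins.
        return [max(c / total, 1e-4) for c in counts]

    ref_p = proportions(reference)
    cur_p = proportions(current)
    return sum((c - r) * math.log(c / r) for r, c in zip(ref_p, cur_p))

reference = list(range(100))          # training-time sample
stable = list(range(100))             # production looks the same
shifted = [x + 50 for x in range(100)]  # production has drifted

print(population_stability_index(reference, stable))   # ~0: no drift
print(population_stability_index(reference, shifted))  # large: drift detected
```

A widely used rule of thumb is that PSI below 0.1 indicates a stable distribution, 0.1 to 0.25 moderate shift, and above 0.25 significant drift worth investigating, though appropriate thresholds depend on the feature and the application.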

Key Details

When model performance degrades, monitoring systems trigger alerts for retraining or rollback. Tools include Evidently AI, WhyLabs, Arize, and cloud-native monitoring solutions. Effective monitoring also tracks business metrics (conversion rates, user satisfaction) alongside technical metrics, since ML model quality ultimately impacts business outcomes.
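The degradation check that drives such alerts can be as simple as comparing a live metric against a baseline with a tolerance band. The sketch below is a hypothetical illustration; the function name, the `tolerance` parameter, and the `alert` callback are assumptions, not the API of any particular monitoring tool.

```python
def check_and_alert(metric_name, current, baseline, tolerance=0.05, alert=print):
    """Flag a metric that has fallen more than `tolerance` below baseline.

    Returns True when degraded, so callers can chain this into a
    retraining or rollback workflow.
    """
    degraded = current < baseline * (1 - tolerance)
    if degraded:
        alert(f"ALERT: {metric_name} dropped to {current:.3f} "
              f"(baseline {baseline:.3f}, tolerance {tolerance:.0%})")
    return degraded

check_and_alert("accuracy", current=0.80, baseline=0.90)  # fires an alert
check_and_alert("accuracy", current=0.89, baseline=0.90)  # within tolerance
```

Real systems typically smooth metrics over a window before comparing, so that a single noisy batch does not trigger a retrain or rollback.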

Related Concepts

data drift, concept drift, model serving

Last updated: March 5, 2026