AI Glossary

Bagging (Bootstrap Aggregating)

An ensemble method that trains multiple models on random subsets of the training data, then combines their predictions (by averaging or voting) to reduce variance and curb overfitting.

How It Works

Create multiple bootstrap samples (random sampling with replacement) from the training data. Train an independent model on each sample. For prediction, average the outputs (regression) or take a majority vote (classification).
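The procedure above can be sketched in plain Python. This is a toy illustration, not a real learner: each "model" simply predicts the majority class of its bootstrap sample, and the ensemble takes a majority vote. The function names (`bootstrap_sample`, `train_toy_model`, `bagging_classify`) are made up for this sketch.

```python
import random
from collections import Counter

def bootstrap_sample(data, rng):
    """Random sampling with replacement, same size as the original set."""
    return [rng.choice(data) for _ in data]

def train_toy_model(labels):
    """Toy 'model': predicts the majority class of its training sample."""
    return Counter(labels).most_common(1)[0][0]

def bagging_classify(labels, n_models=11, seed=42):
    """Train one toy model per bootstrap sample, then majority-vote."""
    rng = random.Random(seed)
    votes = [train_toy_model(bootstrap_sample(labels, rng))
             for _ in range(n_models)]
    return Counter(votes).most_common(1)[0][0]

labels = ["cat"] * 8 + ["dog"] * 2
print(bagging_classify(labels))  # the majority class, "cat"
```

In a real implementation each bootstrap sample would train a full model such as a decision tree; for regression, the final step averages the numeric outputs instead of voting.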

Random Forest

The best-known bagging algorithm. It combines bagging with random feature selection at each split, producing a diverse ensemble of decision trees that is robust, accurate, and requires minimal tuning.
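Random Forest's extra ingredient beyond bagging is that each split considers only a random subset of features, commonly around the square root of the total. A minimal sketch of that subsampling step (the function name `candidate_features` and the sqrt heuristic are assumptions for illustration):

```python
import math
import random

def candidate_features(n_features, rng):
    """At each tree split, consider only a random subset of features.
    A common heuristic is sqrt(n_features), with at least one feature."""
    k = max(1, int(math.sqrt(n_features)))
    return rng.sample(range(n_features), k)

rng = random.Random(0)
print(candidate_features(16, rng))  # 4 distinct feature indices out of 16
```

Restricting each split this way decorrelates the trees, so averaging their predictions cuts variance more than bagging alone would.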


Last updated: March 5, 2026