Boosting
An ensemble technique that trains models sequentially, with each new model focusing on correcting the errors of its predecessors, so that many weak learners combine into a single strong one.
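The "fit the errors of what came before" idea can be shown in a minimal from-scratch sketch: each round fits a one-split decision stump to the current residuals and adds a shrunken copy of it to the ensemble. All names here (`fit_stump`, `boost`, the learning rate `lr`) are illustrative, not from any library.

```python
import numpy as np

def fit_stump(x, r):
    """Find the single threshold on x that best fits residuals r
    (least squares), returning (threshold, left_value, right_value)."""
    best = None
    for t in np.unique(x):
        left, right = r[x <= t], r[x > t]
        lv = left.mean() if left.size else 0.0
        rv = right.mean() if right.size else 0.0
        err = ((left - lv) ** 2).sum() + ((right - rv) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, lv, rv)
    return best[1:]

def boost(x, y, n_rounds=50, lr=0.1):
    """Sequentially fit stumps to the residuals of the ensemble so far."""
    pred = np.full_like(y, y.mean(), dtype=float)
    stumps = []
    for _ in range(n_rounds):
        t, lv, rv = fit_stump(x, y - pred)     # fit the current errors
        pred += lr * np.where(x <= t, lv, rv)  # shrink the correction, add it
        stumps.append((t, lv, rv))
    return pred, stumps

# toy data: a step function a single stump can represent,
# recovered gradually by many shrunken stumps
x = np.linspace(0.0, 1.0, 40)
y = np.where(x < 0.5, 0.0, 1.0)
pred, _ = boost(x, y)
```

After 50 rounds the residuals have shrunk geometrically, so `pred` is close to `y`; a single stump fit once would have the same shape but no refinement of its remaining error.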
Key Algorithms
AdaBoost: reweights misclassified examples so that later learners concentrate on them.
Gradient Boosting: fits each new model to the residual errors of the current ensemble.
XGBoost: regularized, parallelized gradient boosting.
LightGBM: faster training via histogram-based splits and leaf-wise tree growth.
CatBoost: handles categorical features natively.
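AdaBoost's reweighting loop can be sketched in a few lines of NumPy with decision stumps as the weak learners; the helper names (`weighted_stump`, `adaboost`, `predict`) are illustrative, not a library API. Labels are assumed to be in {-1, +1}.

```python
import numpy as np

def weighted_stump(x, y, w):
    """Best threshold/polarity stump under sample weights w (y in {-1,+1})."""
    best = (np.inf, None, None)
    for t in np.unique(x):
        for pol in (1, -1):
            h = pol * np.where(x <= t, -1, 1)
            err = w[h != y].sum()          # weighted misclassification rate
            if err < best[0]:
                best = (err, t, pol)
    return best

def adaboost(x, y, n_rounds=10):
    w = np.full(len(x), 1.0 / len(x))      # start with uniform weights
    ensemble = []
    for _ in range(n_rounds):
        err, t, pol = weighted_stump(x, y, w)
        err = max(err, 1e-10)              # guard the log/division below
        alpha = 0.5 * np.log((1 - err) / err)   # vote strength of this stump
        h = pol * np.where(x <= t, -1, 1)
        w *= np.exp(-alpha * y * h)        # upweight mistakes, downweight hits
        w /= w.sum()
        ensemble.append((alpha, t, pol))
    return ensemble

def predict(ensemble, x):
    score = sum(a * p * np.where(x <= t, -1, 1) for a, t, p in ensemble)
    return np.sign(score)

# toy data: a threshold rule the stumps can learn exactly
x = np.arange(10.0)
y = np.where(x < 5, -1, 1)
ens = adaboost(x, y)
```

The final prediction is a weighted vote of all rounds, with each stump's weight `alpha` growing as its weighted error shrinks.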
Strengths
Boosting methods dominate tabular-data competitions and are widely used in production. For structured data, XGBoost and LightGBM are often the strongest choice, frequently outperforming deep learning when training data is limited.
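A typical structured-data workflow looks like the sketch below. It uses scikit-learn's GradientBoostingClassifier as a dependency-light stand-in for XGBoost/LightGBM (same algorithmic family, similar hyperparameters); the dataset is synthetic, and the specific parameter values are illustrative defaults, not tuned recommendations.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# synthetic stand-in for a structured/tabular dataset
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# the usual knobs: number of trees, shrinkage, and tree depth
clf = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1,
                                 max_depth=3, random_state=0)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

Swapping in `xgboost.XGBClassifier` or `lightgbm.LGBMClassifier` is largely a drop-in change, since both expose a scikit-learn-compatible fit/predict interface.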