AI Glossary

Model Merging

Combining the weights of multiple fine-tuned models into a single model that inherits capabilities from all parent models, without additional training.

Methods

Linear interpolation: average the weights of two or more models with configurable mixing ratios.
SLERP: spherical linear interpolation, which interpolates along the arc between weight vectors for smoother merging.
TIES (Trim, Elect Sign, and Merge): keeps only the largest weight deltas, resolves sign conflicts by majority, then merges, reducing interference between models.
DARE (Drop And Rescale): randomly drops a fraction of each model's weight deltas and rescales the rest, enabling many models to be merged.
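A minimal sketch of the simplest method, linear interpolation. Weights are shown here as plain dicts of parameter name to list of floats; in a real setting these would be framework tensors (e.g. PyTorch state dicts), but the arithmetic is the same. The function name and data layout are illustrative, not from any particular library.

```python
def linear_merge(weights_a, weights_b, alpha=0.5):
    """Merge two models' weights as alpha * A + (1 - alpha) * B.

    Both models must be fine-tuned from the same base so that
    parameter names and shapes line up one-to-one.
    """
    assert weights_a.keys() == weights_b.keys(), "models must share an architecture"
    return {
        name: [alpha * a + (1 - alpha) * b
               for a, b in zip(weights_a[name], weights_b[name])]
        for name in weights_a
    }

# Two tiny "models" sharing the same parameter layout.
model_a = {"layer.weight": [1.0, 2.0], "layer.bias": [0.0, 0.0]}
model_b = {"layer.weight": [3.0, 4.0], "layer.bias": [1.0, 1.0]}

merged = linear_merge(model_a, model_b, alpha=0.5)
# merged["layer.weight"] == [2.0, 3.0]
```

With alpha = 0.5 this is a plain average; skewing alpha toward one parent biases the merged model toward that parent's specialization.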

Why It Works

Fine-tuned models starting from the same base share most of their representation space. Merging creates a model that combines specialized knowledge (e.g., coding ability from one model + instruction following from another) without retraining.
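Because merged models live in this shared space, the interpolation path matters. SLERP, mentioned under Methods, interpolates along the arc between two weight vectors rather than the straight line, preserving the vectors' magnitude better. A sketch over flattened weight vectors (the name `slerp` and the pure-Python representation are illustrative):

```python
import math

def slerp(v0, v1, t):
    """Spherical linear interpolation between two weight vectors.

    t = 0 returns v0, t = 1 returns v1; intermediate values follow
    the arc between the two vectors instead of the chord.
    """
    dot = sum(a * b for a, b in zip(v0, v1))
    norm0 = math.sqrt(sum(a * a for a in v0))
    norm1 = math.sqrt(sum(b * b for b in v1))
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos_theta = max(-1.0, min(1.0, dot / (norm0 * norm1)))
    theta = math.acos(cos_theta)
    if theta < 1e-6:
        # Nearly parallel vectors: fall back to linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Orthogonal unit vectors, halfway point: both components ≈ 0.707,
# so the result keeps unit length, unlike the linear midpoint [0.5, 0.5].
mid = slerp([1.0, 0.0], [0.0, 1.0], 0.5)
```

The design point this illustrates: linear averaging of near-orthogonal weight directions shrinks their magnitude, while SLERP stays on the sphere, which is why it can produce smoother merges.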


Last updated: March 5, 2026