Continual Learning
The ability of a model to learn from a continuous stream of data over time, accumulating knowledge without forgetting previously learned information.
The Challenge
Neural networks suffer from catastrophic forgetting: training on a new task overwrites the weights that encode previously learned tasks, and performance on the old tasks collapses. Continual learning aims to prevent this, enabling models that accumulate capabilities over time.
Approaches
Replay: store a subset of old examples and rehearse them alongside new data.
Regularization: penalize changes to weights that were important for previous tasks (e.g., Elastic Weight Consolidation, EWC).
Architecture: add new capacity (layers or modules) for each new task.
Parameter isolation: dedicate disjoint subsets of parameters to different tasks.
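The regularization idea can be sketched concretely. A minimal illustration of an EWC-style penalty, assuming we already have a diagonal Fisher information estimate of each weight's importance for the old task (the parameter values and Fisher estimates below are hypothetical):

```python
import numpy as np

def ewc_penalty(params, old_params, fisher, lam=100.0):
    """EWC-style quadratic penalty: movement of a weight is penalized
    in proportion to its estimated importance (Fisher information)
    for the previously learned task."""
    return 0.5 * lam * np.sum(fisher * (params - old_params) ** 2)

# Hypothetical example: three weights after training on task A.
old_params = np.array([1.0, -0.5, 2.0])
# Diagonal Fisher estimate: weights 0 and 2 matter for task A.
fisher = np.array([0.9, 0.01, 0.5])

# Current weights while training on task B.
params = np.array([1.1, 0.5, 2.0])

# Moving the unimportant weight 1 is cheap; moving weight 0 is costly.
penalty = ewc_penalty(params, old_params, fisher)
```

In training, this penalty is added to the new task's loss, so gradient descent is free to adjust unimportant weights but is pulled back toward the old values for important ones.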