AI Glossary

Catastrophic Interference

The tendency of neural networks to abruptly forget previously learned information when trained on new data.

Overview

Catastrophic interference (also called catastrophic forgetting) occurs when a neural network trained sequentially on new tasks or data loses its ability to perform previously learned tasks. Because the same shared weights encode knowledge of every task, gradient updates for the new task overwrite the weights that stored the earlier knowledge.
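The effect can be seen even in a toy model. The sketch below (a minimal illustration, not a realistic network) trains a one-weight linear model on a first regression task, then on a second task with a conflicting target; the error on the first task climbs back up because the single shared weight gets overwritten. The helper names `sgd` and `mse` are hypothetical, not from any library.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two regression "tasks": task A wants y = 2x, task B wants y = -x.
x_a = rng.uniform(-1, 1, 200); y_a = 2.0 * x_a
x_b = rng.uniform(-1, 1, 200); y_b = -1.0 * x_b

def sgd(w, x, y, lr=0.1, epochs=50):
    # Plain SGD on squared error for a one-weight model y_hat = w * x.
    for _ in range(epochs):
        for xi, yi in zip(x, y):
            w -= lr * (w * xi - yi) * xi
    return w

def mse(w, x, y):
    return float(np.mean((w * x - y) ** 2))

w = sgd(0.0, x_a, y_a)           # learn task A: w converges toward 2
err_a_before = mse(w, x_a, y_a)  # near zero
w = sgd(w, x_b, y_b)             # learn task B: w is pulled toward -1
err_a_after = mse(w, x_a, y_a)   # task A performance is destroyed
print(err_a_before, err_a_after)
```

With a single parameter the two tasks directly compete for the same weight; larger networks show the same behavior whenever the tasks share capacity.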

Solutions

Approaches to mitigating catastrophic interference include elastic weight consolidation (EWC), which penalizes changes to weights deemed important for earlier tasks; progressive neural networks, which freeze previously trained columns and add new capacity for each task; experience replay, which interleaves stored samples from old tasks with new training data; and modular architectures, in which different modules handle different tasks. This challenge is central to continual learning research.
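Experience replay can be sketched in a toy setting: while training on a new task, stored samples from the old task are shuffled into the data stream, so gradient updates keep serving both tasks. This is a minimal illustration under a one-weight linear-model assumption, not a production continual-learning setup; `sgd` and `mse` are hypothetical helpers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Task A wants y = 2x, task B wants y = -x.
x_a = rng.uniform(-1, 1, 200); y_a = 2.0 * x_a
x_b = rng.uniform(-1, 1, 200); y_b = -1.0 * x_b

def sgd(w, x, y, lr=0.1, epochs=50):
    # Plain SGD on squared error for a one-weight model y_hat = w * x.
    for _ in range(epochs):
        for xi, yi in zip(x, y):
            w -= lr * (w * xi - yi) * xi
    return w

def mse(w, x, y):
    return float(np.mean((w * x - y) ** 2))

# Naive sequential training: task A, then task B, with no mitigation.
w_naive = sgd(sgd(0.0, x_a, y_a), x_b, y_b)

# Experience replay: keep a small buffer of task-A samples and shuffle
# them into the task-B training stream.
buf_x, buf_y = x_a[:50], y_a[:50]
x_mix = np.concatenate([x_b, buf_x])
y_mix = np.concatenate([y_b, buf_y])
order = rng.permutation(len(x_mix))
w_replay = sgd(sgd(0.0, x_a, y_a), x_mix[order], y_mix[order])

print(mse(w_naive, x_a, y_a), mse(w_replay, x_a, y_a))
```

The replayed run retains noticeably lower error on task A than the naive run, at the cost of a compromise fit: revisiting even a small buffer of old data keeps the old task represented in the gradient signal.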


Last updated: March 5, 2026