AI Glossary

Abstractive Summarization

An NLP technique that generates new, concise text capturing key information from a source document, rather than simply extracting existing sentences.

How It Works

The model reads the full document and generates a novel summary using its own words. Unlike extractive summarization (selecting existing sentences), abstractive methods can paraphrase, combine ideas, and produce more natural, readable summaries.
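The contrast is easy to see in code. The toy extractive summarizer below scores existing sentences by word frequency and returns the best one verbatim; an abstractive model would instead generate a new sentence token by token. This is a minimal sketch for illustration only (real extractive systems use far stronger sentence scoring):

```python
import re
from collections import Counter

def extractive_summary(text: str, n_sentences: int = 1) -> str:
    """Toy extractive summarizer: score each sentence by the average
    document-wide frequency of its words, then return the top-scoring
    sentences copied verbatim, in their original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"\w+", text.lower()))

    def score(sentence: str) -> float:
        tokens = re.findall(r"\w+", sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    ranked = set(sorted(sentences, key=score, reverse=True)[:n_sentences])
    return " ".join(s for s in sentences if s in ranked)

doc = ("The new battery lasts twelve hours. The battery charges quickly. "
       "Reviewers praised the battery life overall.")
print(extractive_summary(doc))  # prints one source sentence, unchanged
```

An abstractive summary of the same document might read "Reviewers liked the fast-charging, long-lasting battery" — a sentence that appears nowhere in the source.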

Models

BART: Facebook's sequence-to-sequence model.
T5: Google's text-to-text transformer.
Pegasus: Google's model, pre-trained specifically for summarization.
Modern LLMs (GPT-4, Claude) excel at abstractive summarization as a general capability.

Challenges

Factual consistency: generated summaries may introduce claims that are not supported by the source (hallucinations).
Level of detail: keeping the summary at the right granularity, neither too terse nor too verbose.
Long documents: handling inputs that exceed the model's context window, typically by chunking or hierarchical summarization.
Evaluation: ROUGE scores measure n-gram overlap with a reference and do not capture factual accuracy.
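The evaluation problem is concrete in a toy ROUGE-1 computation (coded here from the standard unigram-overlap definition; real evaluations use a package such as rouge-score): a summary that inverts a key fact can still share almost every word with the reference and outscore a faithful paraphrase.

```python
from collections import Counter

def rouge1_f(reference: str, candidate: str) -> float:
    """ROUGE-1 F1: unigram overlap between candidate and reference,
    with counts clipped to the reference as in the standard definition."""
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    overlap = sum(min(ref[w], cand[w]) for w in cand)
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

reference = "profits rose sharply in the third quarter"
faithful  = "profits increased a lot in the third quarter"
wrong     = "profits fell sharply in the third quarter"  # inverts the fact

# The factually wrong summary scores higher than the faithful paraphrase,
# because ROUGE only counts word overlap, not meaning.
print(rouge1_f(reference, faithful))  # ~0.67
print(rouge1_f(reference, wrong))     # ~0.86
```

This is why factual-consistency evaluation (entailment-based metrics, QA-based checks, human review) is an active research area alongside ROUGE.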


Last updated: March 5, 2026