AI Glossary

T5

A transformer model that frames all NLP tasks as text-to-text problems with a unified architecture.

Overview

T5 (Text-to-Text Transfer Transformer), introduced by Google Research in 2019, is a transformer model that converts every NLP task into a text-to-text format. Translation, summarization, classification, and question answering are all handled by generating text output from text input, prefixed with a task description.
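The text-to-text framing can be sketched in a few lines. The task prefixes below ("translate English to German", "summarize") are ones used in the T5 paper; the helper function and example strings are illustrative, not part of any library:

```python
# Sketch of T5's text-to-text framing: every task becomes
# "task prefix: input text" -> generated output text,
# so one seq2seq model handles all tasks with the same interface.
# make_t5_input is a hypothetical helper for illustration.

def make_t5_input(task_prefix: str, text: str) -> str:
    """Prepend a task prefix so a single model can route the task."""
    return f"{task_prefix}: {text}"

translation = make_t5_input("translate English to German",
                            "The house is wonderful.")
summary = make_t5_input("summarize",
                        "state authorities dispatched emergency crews ...")

print(translation)
# -> translate English to German: The house is wonderful.
```

The model itself sees no task-specific heads or loss functions; classification labels, translations, and summaries are all just target strings to generate.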

Key Details

This unified approach simplifies multi-task learning and transfer learning: one model, one objective, and one decoding procedure cover every task. T5 was pre-trained on the C4 (Colossal Clean Crawled Corpus) dataset with a span-corruption denoising objective and demonstrated strong performance across diverse benchmarks. Its architecture is a standard encoder-decoder transformer, and variants such as Flan-T5 (instruction-tuned) remain popular in research and production.

Related Concepts

Transformer, Encoder-Decoder, BERT
Last updated: March 5, 2026