AI Glossary

Text Embedding

A dense numerical vector representing a piece of text (a word, sentence, or document), learned so that semantically similar texts map to nearby vectors. This makes textual meaning directly usable as input to machine learning systems.

Evolution

Word2Vec/GloVe (static word embeddings) -> BERT (contextual embeddings) -> Sentence-BERT (sentence embeddings) -> Modern embedding models (text-embedding-3, BGE, E5). Each generation broadened scope: static models assign one vector per word regardless of context, contextual models vary the vector with surrounding text, and modern models produce general-purpose sentence and document vectors tuned for retrieval.

Use Cases

Semantic search, RAG retrieval, clustering, classification, recommendation, deduplication, and as input features for downstream ML models.
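Most of these use cases reduce to comparing vectors, typically with cosine similarity. A minimal sketch of semantic search, using tiny hand-made 4-dimensional vectors as stand-in embeddings (real models output hundreds or thousands of dimensions, and the values here are invented for illustration):

```python
import math

# Toy "embeddings": hypothetical 4-d vectors standing in for model output.
docs = {
    "how to reset a password": [0.9, 0.1, 0.0, 0.2],
    "chocolate cake recipe":   [0.0, 0.8, 0.6, 0.1],
    "account login help":      [0.8, 0.2, 0.1, 0.3],
}

def cosine_similarity(a, b):
    # cos(a, b) = (a . b) / (|a| * |b|); 1.0 = same direction, 0.0 = orthogonal.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def semantic_search(query_vec, corpus):
    # Rank documents by similarity of their embedding to the query embedding.
    return sorted(corpus, key=lambda d: cosine_similarity(query_vec, corpus[d]),
                  reverse=True)

# Pretend embedding of the query "forgot my password".
query = [0.85, 0.15, 0.05, 0.25]
ranking = semantic_search(query, docs)
print(ranking)  # password/login docs rank above the cake recipe
```

In production the vectors come from an embedding model and the ranking is done by a vector index (e.g. approximate nearest-neighbor search) rather than a full sort, but the comparison itself is the same.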


Last updated: March 5, 2026