AI Glossary

Sentence Transformers

A framework that produces semantically meaningful sentence embeddings for comparison and search.

Overview

Sentence Transformers (SBERT) are transformer models fine-tuned to produce dense vector representations of entire sentences or paragraphs, enabling efficient semantic similarity comparison. Unlike using BERT directly as a cross-encoder (which requires feeding both sentences through the model together for every comparison), SBERT generates each embedding independently, so embeddings can be precomputed once and compared cheaply with cosine similarity.
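The comparison step can be sketched in plain Python. The short vectors below are toy stand-ins for real SBERT output (which is typically a few hundred dimensions, e.g. 384 for all-MiniLM-L6-v2); in practice the embeddings would come from a model call such as `SentenceTransformer("all-MiniLM-L6-v2").encode(...)`:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional "embeddings" standing in for model output.
emb_cat = [0.9, 0.1, 0.0, 0.2]
emb_kitten = [0.8, 0.2, 0.1, 0.3]
emb_stock = [0.0, 0.1, 0.9, 0.1]

print(cosine_similarity(emb_cat, emb_kitten))  # high: related meanings
print(cosine_similarity(emb_cat, emb_stock))   # low: unrelated meanings
```

Because each embedding is computed independently, comparing a new sentence against a stored corpus costs one model call plus cheap vector math, rather than one model call per pair.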

Key Details

This architecture underpins semantic search, clustering, and retrieval-augmented generation (RAG). Models such as all-MiniLM-L6-v2 and the BGE family offer strong quality-speed tradeoffs, and because each text is embedded once and reused, Sentence Transformers make it practical to compare millions of text pairs efficiently.
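A minimal semantic-search loop over precomputed embeddings might look like the sketch below. The vectors are illustrative placeholders; a real system would encode the corpus with a Sentence Transformers model and, at scale, replace the linear scan with an approximate-nearest-neighbor index:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

def top_k(query_emb, corpus_embs, k=2):
    """Return the k (index, score) pairs most similar to the query."""
    scores = [(i, cosine(query_emb, e)) for i, e in enumerate(corpus_embs)]
    scores.sort(key=lambda pair: pair[1], reverse=True)
    return scores[:k]

# Toy corpus embeddings, precomputed once and stored.
corpus_embs = [
    [1.0, 0.0, 0.0],   # doc 0
    [0.9, 0.2, 0.0],   # doc 1 (similar topic to doc 0)
    [0.0, 0.0, 1.0],   # doc 2 (unrelated topic)
]
query_emb = [1.0, 0.1, 0.0]

print(top_k(query_emb, corpus_embs))  # docs 0 and 1 rank above doc 2
```

The key design point is that the corpus side of the comparison is computed once; only the query needs a fresh model call at search time.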

Related Concepts

BERT, Embeddings, Semantic Search


Last updated: March 5, 2026