Embedding Space
A continuous vector space in which items (words, images, users) are represented as points, with geometric relationships (distance and direction) encoding semantic similarity.
Properties
Similar items cluster together, and distances between points reflect semantic relatedness. Vector arithmetic can capture analogies: king - man + woman ≈ queen (the classic Word2Vec example). Directions in the space, more often than individual axes, encode interpretable features such as sentiment, topic, or style.
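The analogy arithmetic above can be sketched with toy vectors and cosine similarity. The 4-d vectors here are hand-picked for illustration; real embeddings are learned and typically have hundreds of dimensions.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy vectors (illustrative only, not from a trained model).
emb = {
    "king":  np.array([0.9, 0.8, 0.1, 0.0]),
    "queen": np.array([0.9, 0.1, 0.8, 0.0]),
    "man":   np.array([0.1, 0.9, 0.1, 0.0]),
    "woman": np.array([0.1, 0.1, 0.9, 0.0]),
}

# Analogy via vector arithmetic: king - man + woman should land near queen.
target = emb["king"] - emb["man"] + emb["woman"]
best = max(emb, key=lambda w: cosine_similarity(emb[w], target))
print(best)  # queen
```

Cosine similarity is preferred over raw Euclidean distance here because it ignores vector magnitude and compares only direction.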
Types
Word embeddings: Word2Vec, GloVe, FastText. Sentence embeddings: Sentence-BERT, OpenAI's text-embedding-ada-002. Image embeddings: CLIP image encoders, ResNet features. Multi-modal embeddings: CLIP maps images and text into a shared space, so a caption and its matching image land near each other.
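A simple baseline that bridges word and sentence embeddings is averaging word vectors. This sketch uses hypothetical hand-set vectors; models like Sentence-BERT replace the average with a learned encoder that accounts for word order and context.

```python
import numpy as np

# Hypothetical pretrained word vectors; real ones would be loaded
# from Word2Vec/GloVe files.
word_vecs = {
    "the": np.array([0.1, 0.0, 0.1]),
    "cat": np.array([0.8, 0.2, 0.1]),
    "dog": np.array([0.7, 0.3, 0.1]),
    "sat": np.array([0.1, 0.8, 0.2]),
}

def sentence_embedding(tokens):
    """Baseline sentence embedding: mean of word vectors (ignores word order)."""
    vecs = [word_vecs[t] for t in tokens if t in word_vecs]
    return np.mean(vecs, axis=0)

a = sentence_embedding(["the", "cat", "sat"])
b = sentence_embedding(["the", "dog", "sat"])
sim = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
print(sim)  # high: the sentences differ only in near-synonymous words
```

Averaging is crude (it cannot distinguish "dog bites man" from "man bites dog") but is a common baseline before reaching for a dedicated sentence encoder.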
Applications
Semantic search and retrieval. Recommendation engines. Clustering and classification. RAG systems (finding relevant documents). Anomaly detection (outliers in embedding space).
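Semantic search, the retrieval step in RAG systems, reduces to ranking document vectors by similarity to a query vector. A minimal sketch with made-up embeddings (in practice both query and documents are encoded by the same embedding model, and large corpora use an approximate nearest-neighbor index instead of a full scan):

```python
import numpy as np

# Hypothetical document embeddings (one row per document) and a query
# embedding; in a real system these come from an embedding model.
docs = np.array([
    [0.9, 0.1, 0.0],   # doc 0: about cats
    [0.1, 0.9, 0.0],   # doc 1: about finance
    [0.8, 0.2, 0.1],   # doc 2: about pets
])
query = np.array([0.85, 0.15, 0.05])  # query: "cute cats"

# Rank documents by cosine similarity: normalize rows, then dot products.
docs_n = docs / np.linalg.norm(docs, axis=1, keepdims=True)
query_n = query / np.linalg.norm(query)
scores = docs_n @ query_n
ranking = np.argsort(-scores)  # best match first
print(ranking)  # cat/pet docs rank above the finance doc
```

The same normalized-dot-product scoring underlies clustering, recommendation, and anomaly detection: outliers are simply points with low similarity to all of their neighbors.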