AI Glossary

Word2Vec

A neural network model that learns dense vector representations of words from large text corpora.

Overview

Word2Vec, introduced by Mikolov et al. at Google in 2013, is a technique for learning word embeddings — dense vector representations where semantically similar words are mapped to nearby points. It uses two architectures: Skip-gram (predicting context words from a target word) and CBOW (predicting a target word from context words).
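The difference between the two architectures comes down to how training pairs are built from a sliding context window. As a minimal sketch (pure Python, illustrative function names, not any library's API), this shows the pairs each architecture would train on:

```python
def skipgram_pairs(tokens, window=2):
    """Skip-gram: each target word predicts every word in its context
    window, yielding one (target, context_word) pair per neighbour."""
    pairs = []
    for i, target in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((target, tokens[j]))
    return pairs

def cbow_pairs(tokens, window=2):
    """CBOW: the whole context window jointly predicts the centre word,
    yielding one (context_words, target) pair per position."""
    pairs = []
    for i, target in enumerate(tokens):
        context = [tokens[j]
                   for j in range(max(0, i - window),
                                  min(len(tokens), i + window + 1))
                   if j != i]
        if context:
            pairs.append((context, target))
    return pairs

sentence = "the quick brown fox jumps".split()
print(skipgram_pairs(sentence, window=1))
print(cbow_pairs(sentence, window=1))
```

With a window of 1, Skip-gram produces two pairs for each interior word (one per neighbour), while CBOW produces a single multi-word context per position; the real models then train a shallow network over these pairs.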

Key Details

Word2Vec famously demonstrated that learned embeddings capture semantic relationships through vector arithmetic, such as king - man + woman ≈ queen. While largely superseded by contextual embedding models like BERT, which produce different vectors for a word depending on its surrounding text, Word2Vec remains foundational to understanding how neural networks represent language.
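The analogy trick works by adding and subtracting embedding vectors, then finding the nearest remaining word by cosine similarity. A toy sketch with hand-made 3-dimensional vectors (illustrative only, not real Word2Vec output):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Hand-crafted toy vectors chosen so the analogy holds; a trained
# Word2Vec model would learn hundreds of dimensions from a corpus.
vecs = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.1, 0.8],
    "man":   [0.3, 0.9, 0.0],
    "woman": [0.3, 0.2, 0.7],
    "apple": [0.1, 0.1, 0.1],
}

# king - man + woman, component-wise
query = [k - m + w for k, m, w in
         zip(vecs["king"], vecs["man"], vecs["woman"])]

# Nearest word to the query vector, excluding the three input words
best = max((w for w in vecs if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(query, vecs[w]))
print(best)  # queen
```

In practice the same search runs over the full vocabulary's learned embedding matrix, and the nearest neighbour to king - man + woman lands on queen often enough to have become the canonical demonstration.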

Related Concepts

embeddings, GloVe embeddings, word embedding


Last updated: March 5, 2026