Vector Arithmetic
Algebraic Operations on Meaning
A notable property of word embedding spaces is that arithmetic on word vectors produces semantically interpretable results: adding and subtracting vectors combines and removes components of meaning.
king − man + woman ≈ queen
This result obtains because the vector difference "king" − "man" isolates the "royalty" component while removing the "male" component. Adding "woman" reintroduces a gender component, yielding "female royalty": queen.
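The analogy can be reproduced with a nearest-neighbor search over word vectors. The sketch below uses made-up 2-D toy embeddings (dimension 0 for "royalty", dimension 1 for "gender") so the arithmetic is easy to follow; real embeddings are hundreds of dimensions, and the input words are conventionally excluded from the candidate set.

```python
import numpy as np

# Toy 2-D embeddings (hypothetical values, for illustration only):
# dimension 0 = "royalty", dimension 1 = "gender" (+1 male, -1 female)
vocab = {
    "man":    np.array([0.0,  1.0]),
    "woman":  np.array([0.0, -1.0]),
    "king":   np.array([1.0,  1.0]),
    "queen":  np.array([1.0, -1.0]),
    "prince": np.array([0.8,  1.0]),
    "apple":  np.array([0.1,  0.0]),
}

def nearest(vec, exclude):
    """Return the vocabulary word whose embedding is most cosine-similar to vec."""
    best, best_sim = None, -2.0
    for word, emb in vocab.items():
        if word in exclude:
            continue
        sim = vec @ emb / (np.linalg.norm(vec) * np.linalg.norm(emb))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

result = vocab["king"] - vocab["man"] + vocab["woman"]    # [1., -1.]
print(nearest(result, exclude={"king", "man", "woman"}))  # prints "queen"
```

With real pretrained vectors the same query is typically answered by a library's analogy utility rather than a hand-rolled loop, but the computation is identical: vector arithmetic followed by cosine nearest-neighbor lookup.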
Relational Structure in Vector Space
Vector arithmetic demonstrates that word embeddings encode relational structure, not merely pairwise similarity. The displacement vector from "man" to "woman" is approximately parallel to the displacement from "king" to "queen."
man → woman: gender transformation
king → queen: the same transformation, plus royalty
From Static to Contextual Representations
Static word vectors (Word2Vec, GloVe) represented a major advance in the 2010s. Modern language models such as Claude extend this approach with contextual embeddings, where each token's vector is conditioned on its surrounding context.
The foundational insight persists: semantic content can be represented as geometric structure in high-dimensional space. This principle underlies all contemporary language model architectures.
Key Takeaways
- Vector arithmetic reveals that embeddings encode relational structure, not just similarity
- "king − man + woman ≈ queen" demonstrates systematic concept decomposition
- Modern language models extend static vectors with context-dependent representations