Vector Embeddings Explained: Semantic Search with Numerical Representations
AI Impact Summary
Vector embeddings are numerical representations of data, such as text or images, that capture semantic meaning. They are generated by machine learning models, often transformer models such as BERT, and represent each object as an array of numbers. The quality of these embeddings is crucial for semantic search, where the goal is to find items with similar meaning rather than exact keyword matches: for example, retrieving wine descriptions relevant to ‘fish’ or ‘seafood’ even when those words never appear in the text.
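To make the idea concrete, here is a minimal sketch of semantic search over embeddings. The vectors and wine descriptions below are toy values invented for illustration (a real system would get high-dimensional vectors from a model such as BERT); the point is only that ranking by cosine similarity surfaces meaning-related matches rather than keyword matches.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional "embeddings" of wine descriptions (illustrative values,
# not produced by a real model).
embeddings = {
    "fresh citrus with briny oyster-shell notes": [0.9, 0.1, 0.3, 0.0],
    "ripe blackberry and toasted oak": [0.1, 0.8, 0.0, 0.4],
    "crisp minerality, pairs well with seafood": [0.8, 0.2, 0.4, 0.1],
}

# Hypothetical embedding of the query "fish".
query = [0.85, 0.15, 0.35, 0.05]

# Rank descriptions by semantic similarity to the query; the seafood-adjacent
# descriptions score highest even though none contains the word "fish".
ranked = sorted(
    embeddings,
    key=lambda desc: cosine_similarity(query, embeddings[desc]),
    reverse=True,
)
for desc in ranked:
    print(f"{cosine_similarity(query, embeddings[desc]):.3f}  {desc}")
```

Note that a plain keyword search for "fish" would match none of these descriptions; similarity in embedding space is what connects the query to the briny and seafood-flavored entries.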
Affected Systems
- Date: not specified
- Change type: capability
- Severity: info