
Community Series: Master Vector Embeddings with Weaviate

Part 1: Introduction to
Vector Embeddings

(c) Copyrights Reserved https://datasciencedojo.com



What are Vector Embeddings?
Embeddings are numerical representations of words or phrases in a high-dimensional vector space.

These representations map discrete objects (such as words, sentences, or images) into a continuous latent space, capturing their semantic and contextual relationships.
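As a toy sketch of this idea (the 3-dimensional vectors below are made up for illustration; real embeddings typically have hundreds of dimensions), semantically related words end up near each other in the vector space:

```python
import numpy as np

# Made-up 3-d embeddings; production models use hundreds of dimensions.
embeddings = {
    "cat": np.array([0.9, 0.8, 0.1]),
    "kitten": np.array([0.85, 0.75, 0.15]),
    "car": np.array([0.1, 0.2, 0.9]),
}

# Euclidean distance: semantically similar words lie closer together.
def distance(a, b):
    return np.linalg.norm(embeddings[a] - embeddings[b])

print(distance("cat", "kitten") < distance("cat", "car"))  # True
```

Here "cat" sits much closer to "kitten" than to "car", which is exactly the property a trained embedding model learns from data.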


How do Embeddings Work?
They translate textual data into vectors within a continuous latent space.

This enables models to perform mathematical operations on text data.

It helps models interpret and generate human language with greater accuracy and context-awareness.
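One classic illustration of "mathematical operations on text data" is vector arithmetic over embeddings. The 2-d vectors below are hand-picked so the well-known king/queen analogy works; a trained model learns such regularities from data:

```python
import numpy as np

# Hand-picked 2-d toy vectors; a trained model would learn these.
vec = {
    "king":  np.array([0.9, 0.9]),
    "man":   np.array([0.9, 0.1]),
    "woman": np.array([0.1, 0.1]),
    "queen": np.array([0.1, 0.9]),
}

# Vector arithmetic on embeddings: king - man + woman lands near queen.
result = vec["king"] - vec["man"] + vec["woman"]
closest = min(vec, key=lambda w: np.linalg.norm(vec[w] - result))
print(closest)  # queen
```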


Role of Vector
Embeddings in LLMs


Types of
Embeddings


Cosine Similarity for Vector Similarity
Cosine similarity is a measure used in LLMs for evaluating the semantic similarity between embeddings.

It is used in semantic search because it provides a robust and efficient way to compare textual data based on its meaning, driving significant advancements in NLP and AI-driven applications.
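A minimal implementation: cosine similarity is the dot product of two vectors divided by the product of their norms, i.e. the cosine of the angle between them. The query and document vectors below are made-up examples:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: 1 = same direction, 0 = orthogonal."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

query = np.array([1.0, 2.0, 0.0])
doc_a = np.array([2.0, 4.0, 0.0])   # same direction as the query
doc_b = np.array([0.0, 0.0, 3.0])   # orthogonal to the query

print(round(cosine_similarity(query, doc_a), 3))  # 1.0
print(round(cosine_similarity(query, doc_b), 3))  # 0.0
```

Because it depends only on direction, not magnitude, cosine similarity is insensitive to how long a text is, which is one reason it is a common default distance metric in vector databases.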


Semantic Encoding Techniques
Semantic encoding techniques are the most recent approach to embedding words.

These techniques use neural networks to create vector representations of words that capture their meaning.
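As a rough sketch of the encoding pipeline (not a real neural encoder), the example below mean-pools per-word vectors into a single sentence vector; the word vectors here are made-up stand-ins for what a trained network would produce:

```python
import numpy as np

# Hypothetical word vectors; a trained neural encoder would learn these.
word_vectors = {
    "cats": np.array([0.9, 0.1, 0.0]),
    "dogs": np.array([0.8, 0.2, 0.1]),
    "purr": np.array([0.7, 0.0, 0.2]),
}

def encode_sentence(sentence: str) -> np.ndarray:
    """Mean-pool word vectors into one sentence vector (skipping unknown words)."""
    vecs = [word_vectors[w] for w in sentence.lower().split() if w in word_vectors]
    return np.mean(vecs, axis=0)

sent_vec = encode_sentence("cats purr")
print(sent_vec.shape)  # (3,)
```

Real semantic encoders replace both the lookup table and the pooling step with a neural network, so the resulting vector reflects word order and context, not just which words appear.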



To learn more, join us on Thursday, 06 February at 9AM PDT for Part 1: What are Vector Embeddings?

Victoria Slocum
Machine Learning Engineer
Weaviate
