Lesson 13

Embeddings
Semantic Space
Word embeddings
• How do we handle discrete data?
• Solution 1: one-hot vectors
• Solution 2: dense vectors (see the sketch below)
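A minimal NumPy sketch of the two options; the toy vocabulary, sizes, and values are illustrative assumptions:

```python
import numpy as np

vocab = ["cat", "dog", "tree", "car"]   # toy vocabulary (assumed)
V, d = len(vocab), 3                    # vocab size, embedding dimension

# Solution 1: one-hot vectors -- sparse, V-dimensional, all pairs equidistant
one_hot = np.eye(V)
print(one_hot[vocab.index("cat")])      # [1. 0. 0. 0.]

# Solution 2: dense vectors -- a trainable V x d lookup table
E = 0.01 * np.random.default_rng(0).standard_normal((V, d))
print(E[vocab.index("cat")])            # a 3-dimensional dense vector
```

One-hot vectors carry no notion of similarity (every pair is equally far apart); dense vectors are learned so that related words end up close together.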
Semantic spaces

[Figure: a 2D semantic space in which related concepts cluster together; labeled regions include Plants, Food, Buildings, Animals, Tools, Robots, and Automobiles]
Supervised learning of embedding
It is often easier to obtain relative similarity judgments than absolute ones: annotators report that $x$ is more similar to $x^+$ than to $x^-$.

We wish to find an embedding $f$ such that $d(f(x), f(x^+)) < d(f(x), f(x^-))$ for every such triplet.

• Define a hinge loss: $L = \max\bigl(0,\; m + d(f(x), f(x^+)) - d(f(x), f(x^-))\bigr)$ with margin $m$ (a sketch follows below)
• Map every word $x$ to a vector $f(x)$
• Learn the mapping $f$ with a deep network
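A minimal PyTorch sketch of this hinge loss over embedding triplets; the function name, Euclidean distance, and margin value are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def hinge_triplet_loss(f_x, f_pos, f_neg, margin=1.0):
    """Zero loss once the negative is farther than the positive
    by at least `margin`; otherwise penalize the violation."""
    d_pos = F.pairwise_distance(f_x, f_pos)  # d(f(x), f(x+))
    d_neg = F.pairwise_distance(f_x, f_neg)  # d(f(x), f(x-))
    return torch.clamp(margin + d_pos - d_neg, min=0).mean()
```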
Siamese network
The importance of selecting hard negatives
• Contrastive loss is a ranking loss: only the order matters
• Selecting hard negatives is key to faster convergence (see the sketch below)
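A sketch of one common way to pick hard negatives within a batch; the shapes and function name are illustrative assumptions:

```python
import torch

def hardest_negatives(f_x, f_negs):
    """Pick, for each anchor, the candidate negative that is closest
    in embedding space: it violates the ranking most, so it carries
    the largest gradient and speeds up convergence."""
    # f_x: (B, d) anchors; f_negs: (B, K, d) candidate negatives
    d = torch.cdist(f_x.unsqueeze(1), f_negs).squeeze(1)  # (B, K)
    idx = d.argmin(dim=1)                                 # nearest = hardest
    return f_negs[torch.arange(f_x.size(0)), idx]         # (B, d)
```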
Unsupervised learning of embedding
• Use the context:
• The dog chased the little ______, who climbed up the tree
Unsupervised learning of embedding
Word2Vec: continuous bag of words (CBOW) version
Map the output to probabilities using softmax:

$$p(w_O \mid w_I) = \frac{\exp\!\left({v'_{w_O}}^{\top} v_{w_I}\right)}{\sum_{w=1}^{V} \exp\!\left({v'_w}^{\top} v_{w_I}\right)}$$

Train using maximum likelihood, i.e. minimize the cross-entropy of the true center word.
[Figure: a simple CBOW model with only one word in the context]
Unsupervised learning of embedding

[Figure: a simple CBOW model with multiple words in the context]
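A minimal PyTorch sketch of such a CBOW model, assuming context words arrive as integer ids; class and variable names are illustrative:

```python
import torch
import torch.nn as nn

class CBOW(nn.Module):
    """Average the context word embeddings, then score every
    vocabulary word as the candidate center word."""
    def __init__(self, vocab_size, dim):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim)  # input vectors v_w
        self.out = nn.Linear(dim, vocab_size)     # output vectors v'_w

    def forward(self, context_ids):               # (B, C) integer ids
        h = self.emb(context_ids).mean(dim=1)     # (B, dim) averaged context
        return self.out(h)                        # (B, V) logits

# Maximum-likelihood training = cross-entropy on the true center word:
# loss = nn.CrossEntropyLoss()(model(context_ids), center_ids)
```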
Unsupervised learning of embedding
Skip-gram: the inverse of CBOW; predict each context word from the center word (a pair-generation sketch follows below).
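A small sketch of how skip-gram training pairs can be generated from a token stream; the helper name and window size are illustrative assumptions:

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) pairs; the model is trained to
    predict the context word from the center word."""
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                yield center, tokens[j]

# e.g. list(skipgram_pairs("the dog chased the cat".split()))
```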
SimCLR: simple contrastive learning
Learn representations that are robust against transformations:
1. Data augmentation
2. Base encoder: a CNN (e.g. ResNet)
3. Projection head (e.g. a 2-layer MLP)
4. Contrastive loss function (sketched after this slide)

After training, the projection head can be removed and the encoder's representation used for other tasks.

Chen et al. 2020
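A minimal sketch of the NT-Xent contrastive loss used by SimCLR, assuming two batches of projections from two augmented views of the same images; the temperature value is illustrative:

```python
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent loss: each sample's positive is its other augmented view;
    the remaining 2N - 2 samples in the batch act as negatives."""
    N = z1.size(0)
    z = F.normalize(torch.cat([z1, z2]), dim=1)   # (2N, d), unit length
    sim = z @ z.t() / tau                         # temperature-scaled cosine sims
    sim.fill_diagonal_(-1e9)                      # mask self-similarity
    # the positive for row i is row i + N (and vice versa)
    target = torch.cat([torch.arange(N, 2 * N), torch.arange(N)])
    return F.cross_entropy(sim, target)
```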


Word algebra
Vector arithmetic on embeddings captures analogies: for example, vec("king") − vec("man") + vec("woman") lands near vec("queen").
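A small NumPy sketch of this kind of analogy query by nearest cosine neighbor; all names here are illustrative assumptions:

```python
import numpy as np

def analogy(E, word2id, id2word, a, b, c, topk=1):
    """Solve a : b :: c : ? by the nearest cosine neighbor
    of E[b] - E[a] + E[c], excluding the three query words."""
    q = E[word2id[b]] - E[word2id[a]] + E[word2id[c]]
    sims = (E @ q) / (np.linalg.norm(E, axis=1) * np.linalg.norm(q) + 1e-9)
    for w in (a, b, c):
        sims[word2id[w]] = -np.inf
    return [id2word[i] for i in np.argsort(-sims)[:topk]]

# e.g. analogy(E, word2id, id2word, "man", "king", "woman") -> ["queen"]
```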
Embeddings and semantic spaces in neuroscience
Hebart et al. 2020
Representational Dissimilarity Matrix

Kriegeskorte et al. 2008

Khaligh-Razavi & Kriegeskorte 2014
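A minimal sketch of computing an RDM as 1 minus the Pearson correlation between condition response patterns (one common choice of dissimilarity; the toy shapes are assumptions):

```python
import numpy as np

def rdm(patterns):
    """patterns: (n_conditions, n_units) response matrix, e.g. one row
    of voxel (or DNN unit) activations per image. Returns the
    (n_conditions, n_conditions) dissimilarity matrix."""
    return 1.0 - np.corrcoef(patterns)

acts = np.random.default_rng(0).standard_normal((92, 500))  # toy data
D = rdm(acts)  # symmetric, zero on the diagonal
```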
Representational Similarity Analysis

Sartzetaki et al. 2024
Comparing brains to DNNs
• Black-box approach: compare outputs and behavior
• White-box approach:
– Encoding models: a linear combination of a layer's activations predicts neural activity
– RSA: comparing representational dissimilarities across many images (either the dissimilarities themselves or their structure; see the sketch after this list)
– Pattern component modeling
• Topographical DNNs
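A minimal sketch of the RSA comparison step, rank-correlating the upper triangles of a brain RDM and a model RDM (Spearman correlation is one common choice; names are illustrative):

```python
import numpy as np
from scipy.stats import spearmanr

def rsa_score(rdm_brain, rdm_model):
    """Rank-correlate the upper triangles of two RDMs; only the
    structure of the dissimilarities matters, not their scale."""
    iu = np.triu_indices_from(rdm_brain, k=1)  # each pair counted once
    return spearmanr(rdm_brain[iu], rdm_model[iu]).correlation
```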
