# self-attentive-rnn

Here are 13 public repositories matching this topic...

Neat (Neural Attention) Vision is a framework-agnostic visualization tool for the attention mechanisms of deep-learning models on Natural Language Processing (NLP) tasks.

  • Updated May 4, 2018
  • Vue
AREnets

A TensorFlow-based framework that provides attentive implementations of conventional neural network models (CNN- and RNN-based) for Relation Extraction classification tasks, along with an API for implementing custom models.

  • Updated May 11, 2025
  • Python

This repository provides a basic implementation of self-attention, demonstrating how an attention mechanism can be used to predict the next word in a sequence. It captures the core concept of attention but lacks the complexity of more advanced models such as Transformers (a minimal sketch of the idea follows this entry).

  • Updated Sep 23, 2024
  • Python
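
To make the concept concrete, here is a minimal sketch of single-head scaled dot-product self-attention in NumPy. It is an illustrative assumption of how such a basic implementation might look; the function and variable names are hypothetical and are not taken from the repository.

```python
# Minimal sketch of single-head self-attention (illustrative, hypothetical names).
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    """Scaled dot-product self-attention over a sequence X of shape (T, d)."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v      # project inputs into query/key/value spaces
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # pairwise compatibility, scaled by sqrt(d_k)
    weights = softmax(scores, axis=-1)       # each position attends over all positions
    return weights @ V, weights              # weighted sum of values + attention map

rng = np.random.default_rng(0)
T, d = 5, 8                                  # toy sequence length and model width
X = rng.normal(size=(T, d))                  # stand-in for word embeddings
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
out, attn = self_attention(X, W_q, W_k, W_v)
print(out.shape, attn.shape)                 # (5, 8) (5, 5)
```

Each output row is a weighted average of the value vectors, with weights given by the softmax-normalized query-key similarities; this attention map is exactly the kind of quantity that tools like Neat Vision visualize.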

This sentiment analysis model uses a Transformer architecture to classify text as positive, negative, or neutral. It preprocesses the text data, trains on the IMDB dataset, and predicts sentiment for user-supplied input (a minimal sketch of such a classifier follows this entry).

  • Updated Apr 5, 2024
  • Jupyter Notebook
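
For orientation, here is a minimal sketch of a Transformer-encoder text classifier in PyTorch along the lines described above. The hyperparameters, the toy vocabulary size, and the three-class head are illustrative assumptions, not the repository's actual configuration or training code.

```python
# Hypothetical sketch of a Transformer-encoder sentiment classifier (assumed setup).
import torch
import torch.nn as nn

class TransformerSentiment(nn.Module):
    def __init__(self, vocab_size=10000, d_model=128, nhead=4,
                 num_layers=2, num_classes=3, max_len=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Embedding(max_len, d_model)    # learned positional encoding
        layer = nn.TransformerEncoderLayer(d_model, nhead,
                                           dim_feedforward=256,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, num_classes)  # positive / negative / neutral

    def forward(self, tokens):                       # tokens: (batch, seq_len) int ids
        positions = torch.arange(tokens.size(1), device=tokens.device)
        x = self.embed(tokens) + self.pos(positions) # token + position embeddings
        x = self.encoder(x)                          # contextualize with self-attention
        return self.head(x.mean(dim=1))              # mean-pool over time, then classify

model = TransformerSentiment()
dummy = torch.randint(0, 10000, (2, 32))             # two toy token-id sequences
logits = model(dummy)
print(logits.shape)                                  # torch.Size([2, 3])
```

A real pipeline would tokenize the IMDB reviews into integer ids, pad or truncate them to a fixed length, and train with cross-entropy loss; the sketch only verifies the forward pass on dummy input.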

