An open API service providing repository metadata for many open source software ecosystems.

GitHub topics: self-attentive-rnn

nicolay-r/AREnets

TensorFlow-based framework providing attentive implementations of conventional neural network models (CNN- and RNN-based) for relation extraction classification tasks, as well as an API for implementing custom models

Language: Python - Size: 1.34 MB - Last synced at: 1 day ago - Pushed at: 1 day ago - Stars: 7 - Forks: 0

cbaziotis/neat-vision

Neat (Neural Attention) Vision is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks (framework-agnostic).

Language: Vue - Size: 25.4 MB - Last synced at: about 1 month ago - Pushed at: about 7 years ago - Stars: 250 - Forks: 24

kaushalshetty/Structured-Self-Attention

A Structured Self-attentive Sentence Embedding

Language: Python - Size: 492 KB - Last synced at: 6 months ago - Pushed at: over 5 years ago - Stars: 495 - Forks: 106
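
The paper this and several repositories below implement computes a matrix of attention weights over the hidden states of a bidirectional LSTM. A minimal sketch of that step, with hypothetical dimensions and PyTorch as an assumed backend (not code from any of these repositories):

```python
import torch
import torch.nn.functional as F

# Hypothetical sizes: n tokens, 2u-dim BiLSTM states, d_a attention units, r attention hops.
n, two_u, d_a, r = 40, 600, 350, 30

H = torch.randn(n, two_u)        # BiLSTM hidden states, one row per token
W_s1 = torch.randn(d_a, two_u)   # first projection
W_s2 = torch.randn(r, d_a)       # second projection, one row per attention hop

# A = softmax(W_s2 * tanh(W_s1 * H^T)); each of the r rows sums to 1 over the tokens.
A = F.softmax(W_s2 @ torch.tanh(W_s1 @ H.T), dim=1)   # shape (r, n)
M = A @ H                                              # sentence embedding matrix, shape (r, 2u)
```

The paper additionally penalizes redundancy between attention hops with a Frobenius-norm term on A·Aᵀ − I; the implementations listed here differ mainly in how they wire this into the classifier.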

ranfysvalle02/ai-self-attention

This repository provides a basic implementation of self-attention. The code demonstrates how attention mechanisms work when predicting the next word in a sequence; it illustrates the core concept of attention but lacks the complexity of more advanced models such as Transformers.

Language: Python - Size: 263 KB - Last synced at: 21 days ago - Pushed at: 8 months ago - Stars: 3 - Forks: 0
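
As a rough illustration of the mechanism that description refers to, here is a generic scaled dot-product self-attention sketch in NumPy (hypothetical weights, not code taken from the repository):

```python
import numpy as np

def self_attention(X, W_q, W_k, W_v):
    """Scaled dot-product self-attention over a sequence of token embeddings X, shape (n, d)."""
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # pairwise token similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the sequence
    return weights @ V                                # each output row mixes all tokens

# Toy example: 4 tokens with 8-dim embeddings and random projection matrices.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, W_q, W_k, W_v).shape)        # (4, 8)
```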

timbmg/Structured-Self-Attentive-Sentence-Embedding

Re-Implementation of "A Structured Self-Attentive Sentence Embedding" by Lin et al., 2017

Language: Python - Size: 690 KB - Last synced at: about 1 month ago - Pushed at: about 2 months ago - Stars: 24 - Forks: 1

shikhararyan/Text-Classification-Transformer-Model-

This sentiment analysis model uses a Transformer architecture to classify text as positive, negative, or neutral. It preprocesses the text data, trains on the IMDB dataset, and predicts sentiment for user-supplied input.

Language: Jupyter Notebook - Size: 3.39 MB - Last synced at: about 1 year ago - Pushed at: about 1 year ago - Stars: 1 - Forks: 0
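
For orientation, a tiny Transformer-encoder text classifier of the kind that description suggests might look like the following Keras sketch (hypothetical hyperparameters, positional embeddings omitted for brevity, binary IMDB labels; not the notebook's actual architecture):

```python
from tensorflow import keras
from tensorflow.keras import layers

vocab_size, maxlen, embed_dim, num_heads, ff_dim = 20000, 200, 32, 2, 64

inputs = layers.Input(shape=(maxlen,), dtype="int32")
x = layers.Embedding(vocab_size, embed_dim)(inputs)                          # token embeddings
attn = layers.MultiHeadAttention(num_heads=num_heads, key_dim=embed_dim)(x, x)
x = layers.LayerNormalization()(x + attn)                                     # residual + norm
ff = layers.Dense(ff_dim, activation="relu")(x)
ff = layers.Dense(embed_dim)(ff)
x = layers.LayerNormalization()(x + ff)
x = layers.GlobalAveragePooling1D()(x)
outputs = layers.Dense(1, activation="sigmoid")(x)                            # binary sentiment

model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# IMDB ships as pre-tokenized integer sequences in Keras.
(x_train, y_train), _ = keras.datasets.imdb.load_data(num_words=vocab_size)
x_train = keras.preprocessing.sequence.pad_sequences(x_train, maxlen=maxlen)
# model.fit(x_train, y_train, epochs=2, batch_size=64)
```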

flrngel/Self-Attentive-tensorflow 📦

TensorFlow implementation of "A Structured Self-Attentive Sentence Embedding"

Language: Python - Size: 1.4 MB - Last synced at: about 1 year ago - Pushed at: over 3 years ago - Stars: 192 - Forks: 39

arunarn2/StructuredSelfAttentionTensorflow

Structured self-attention implementation in TensorFlow

Language: Python - Size: 1.21 MB - Last synced at: over 1 year ago - Pushed at: over 6 years ago - Stars: 5 - Forks: 1

yaya-1302/SAN-HeadlineClickBait

This application was built to help users determine whether a news article they want to read is clickbait or not.

Language: CSS - Size: 29.5 MB - Last synced at: 10 months ago - Pushed at: over 2 years ago - Stars: 2 - Forks: 0

keya-desai/Natural-Language-Processing

Python implementations of N-gram models, log-linear and neural linear models, back-propagation and self-attention, HMM, PCFG, CRF, EM, and VAE

Language: Python - Size: 15 MB - Last synced at: about 2 years ago - Pushed at: over 4 years ago - Stars: 2 - Forks: 2

arianhosseini/MemArchs-in-RNNLM

An attempt at implementing "Memory Architectures in Recurrent Neural Network Language Models" as part of the ICLR 2018 Reproducibility Challenge

Language: Python - Size: 43.9 KB - Last synced at: about 1 month ago - Pushed at: over 7 years ago - Stars: 5 - Forks: 2

PrashantRanjan09/Structured-Self-Attentive-Sentence-Embedding

Implementation of the paper "A Structured Self-Attentive Sentence Embedding" (ICLR 2017)

Language: Python - Size: 279 KB - Last synced at: 4 months ago - Pushed at: almost 7 years ago - Stars: 1 - Forks: 2