GitHub topics: additive-attention
sooftware/attentions
PyTorch implementations of several attention mechanisms for deep learning researchers.
Language: Python - Size: 80.1 KB - Last synced at: 19 days ago - Pushed at: about 3 years ago - Stars: 529 - Forks: 70
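The repo above collects full implementations; as a quick orientation to this page's topic, here is a minimal additive (Bahdanau-style) attention sketch in PyTorch, where the score is v^T tanh(W_q q + W_k k). The class name and dimensions are illustrative assumptions, not code from sooftware/attentions.

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Minimal additive attention sketch: score(q, k) = v^T tanh(W_q q + W_k k)."""
    def __init__(self, dim: int):
        super().__init__()
        self.w_q = nn.Linear(dim, dim, bias=False)
        self.w_k = nn.Linear(dim, dim, bias=False)
        self.v = nn.Linear(dim, 1, bias=False)

    def forward(self, query, keys, values):
        # query: (batch, dim); keys/values: (batch, seq_len, dim)
        scores = self.v(torch.tanh(self.w_q(query).unsqueeze(1) + self.w_k(keys)))  # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)    # attention distribution over the sequence
        context = (weights * values).sum(dim=1)   # weighted sum of values -> (batch, dim)
        return context, weights.squeeze(-1)

# Usage with hypothetical shapes
attn = AdditiveAttention(dim=64)
q = torch.randn(2, 64)
kv = torch.randn(2, 10, 64)
context, weights = attn(q, kv, kv)
print(context.shape, weights.shape)  # torch.Size([2, 64]) torch.Size([2, 10])
```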

mtanghu/LEAP
LEAP: Linear Explainable Attention in Parallel for causal language modeling, with O(1) path length and O(1) inference.
Language: Jupyter Notebook - Size: 3 MB - Last synced at: 18 days ago - Pushed at: almost 2 years ago - Stars: 4 - Forks: 0
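LEAP's exact formulation lives in the repo; the sketch below is a generic causal linear-attention example (in the spirit of Katharopoulos et al.), not LEAP's code, showing why a fixed-size running state gives O(1) per-token inference: the state S_t = sum over s <= t of phi(k_s) v_s^T has constant size, so each new token is a constant-cost update. The function name and the elu+1 feature map are assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def causal_linear_attention(q, k, v, eps=1e-6):
    """Generic causal linear attention sketch (parallel training form).

    q, k, v: (batch, seq_len, dim). Uses phi(x) = elu(x) + 1 to keep scores positive.
    The cumulative sums below materialize the running state for every position,
    costing O(n * dim^2) memory in training; at inference only the latest
    (dim, dim) state is kept and updated per token, hence O(1) per step.
    """
    phi_q = F.elu(q) + 1
    phi_k = F.elu(k) + 1
    # Running state S_t = sum_{s<=t} phi(k_s) v_s^T  -> (batch, seq, dim, dim)
    kv = torch.cumsum(phi_k.unsqueeze(-1) * v.unsqueeze(-2), dim=1)
    # Normalizer z_t = sum_{s<=t} phi(k_s)           -> (batch, seq, dim)
    z = torch.cumsum(phi_k, dim=1)
    num = torch.einsum('bsd,bsde->bse', phi_q, kv)
    den = (phi_q * z).sum(-1, keepdim=True) + eps
    return num / den

out = causal_linear_attention(torch.randn(2, 10, 64), torch.randn(2, 10, 64), torch.randn(2, 10, 64))
print(out.shape)  # torch.Size([2, 10, 64])
```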

shawnhan108/Attention-LSTMs
A set of notebooks exploring the power of recurrent neural networks (RNNs), with a focus on LSTM, BiLSTM, seq2seq, and attention.
Language: Jupyter Notebook - Size: 158 MB - Last synced at: about 2 years ago - Pushed at: over 4 years ago - Stars: 6 - Forks: 1
