GitHub topics: local-attention
agentdr1/LA_MIL
PyTorch implementation of LA_MIL, a local-attention graph-based Transformer for whole-slide images (WSIs)
Language: Python - Size: 686 KB - Last synced at: about 1 month ago - Pushed at: over 1 year ago - Stars: 25 - Forks: 0

Komil-parmar/ViT_with_Local_Attention
🚀 Self-implemented Vision Transformer (ViT) with local attention. Unlike the standard ViT, this version integrates local attention for improved efficiency. Fully customizable, with configurable patch embeddings, attention mechanisms, and transformer layers, as well as the ability to mix global and local attention.
Language: Python - Size: 9.77 KB - Last synced at: about 2 months ago - Pushed at: about 2 months ago - Stars: 0 - Forks: 0
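The core idea behind entries like this one is windowed local attention: each position attends only to a fixed-radius neighborhood instead of the full sequence. A minimal single-head NumPy sketch (no learned projections; function name and `radius` parameter are illustrative, not taken from the repo above):

```python
import numpy as np

def local_self_attention(x, radius=2):
    """Simplified local self-attention: position i attends only to
    positions within `radius` of i (scaled dot-product, single head)."""
    n, d = x.shape
    out = np.zeros_like(x)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        scores = x[lo:hi] @ x[i] / np.sqrt(d)      # scores over the local window
        weights = np.exp(scores - scores.max())    # numerically stable softmax
        weights /= weights.sum()
        out[i] = weights @ x[lo:hi]                # convex combination of neighbors
    return out
```

With `radius=0` each position attends only to itself, so the output equals the input; larger radii interpolate toward full attention at O(n·radius) cost instead of O(n²).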

THU-LYJ-Lab/AR-Seg
[CVPR 2023] Efficient Semantic Segmentation by Altering Resolutions for Compressed Videos
Language: Python - Size: 12.8 MB - Last synced at: 20 days ago - Pushed at: about 1 year ago - Stars: 50 - Forks: 9

JRC1995/Abstractive-Summarization
Implementation of abstractive summarization using an LSTM encoder-decoder architecture with local attention.
Language: Jupyter Notebook - Size: 330 KB - Last synced at: over 1 year ago - Pushed at: over 5 years ago - Stars: 163 - Forks: 59

mtanghu/LEAP
LEAP: Linear Explainable Attention in Parallel for causal language modeling with O(1) path length and O(1) inference
Language: Jupyter Notebook - Size: 3 MB - Last synced at: 19 days ago - Pushed at: almost 2 years ago - Stars: 4 - Forks: 0
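LEAP's own mechanism differs in detail, but the O(1)-per-token inference it advertises is the hallmark of the linear-attention family: rewrite attention with a positive feature map so the running sums over past keys/values can be carried as a fixed-size state. A generic kernelized sketch (the feature map and names here are illustrative assumptions, not LEAP's actual formulation):

```python
import numpy as np

def linear_attention_step(state, z, k, v, q):
    """One recurrent step of kernelized linear attention.
    Carries S = sum_s phi(k_s) v_s^T and z = sum_s phi(k_s), so each
    new token costs O(d^2), independent of sequence length."""
    phi = lambda x: np.maximum(x, 0) + 1e-6    # simple positive feature map (assumption)
    state = state + np.outer(phi(k), v)        # accumulate key-value outer products
    z = z + phi(k)                             # accumulate key features for normalization
    out = (phi(q) @ state) / (phi(q) @ z)      # normalized attention output
    return state, z, out
```

Run over a sequence, this reproduces causal softmax-free attention exactly: `phi(q) @ state` equals the sum of `(phi(q)·phi(k_s)) v_s` over all past positions.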

Siddhant-Ray/Inductive-Biases-in-CNNs-vs-Transformers
Investigating inductive biases in CNNs vs Transformers. Code and report for the Deep Learning Course Project, ETH Zurich, HS 2021.
Language: Jupyter Notebook - Size: 60.7 MB - Last synced at: 12 months ago - Pushed at: about 2 years ago - Stars: 0 - Forks: 1

gugundi/NeuralMachineTranslation
Neural Machine Translation using local attention
Language: Jupyter Notebook - Size: 254 MB - Last synced at: almost 2 years ago - Pushed at: over 6 years ago - Stars: 6 - Forks: 3
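Local attention in NMT usually refers to the Luong et al. (2015) local-p variant: the decoder attends only to a window of encoder states around a predicted source position, with weights damped by a Gaussian centered there. A simplified NumPy sketch (single decoder step, dot-product scoring; the function name and fixed `p_t` are illustrative assumptions — the original learns `p_t` from the decoder state):

```python
import numpy as np

def luong_local_attention(enc_states, dec_state, p_t, D=2):
    """Luong-style local attention, simplified: score a window of
    width 2D+1 centered at position p_t, modulate by a Gaussian
    favoring positions near p_t, and return the context vector."""
    S, d = enc_states.shape
    lo, hi = max(0, int(p_t) - D), min(S, int(p_t) + D + 1)
    window = enc_states[lo:hi]
    scores = window @ dec_state / np.sqrt(d)
    weights = np.exp(scores - scores.max())        # softmax over the window
    weights /= weights.sum()
    positions = np.arange(lo, hi)
    weights *= np.exp(-((positions - p_t) ** 2) / (2 * (D / 2) ** 2))
    weights /= weights.sum()                        # renormalize after damping
    return weights @ window                         # context vector
```

Because only 2D+1 encoder states are scored per decoder step, the cost is independent of source-sentence length, which is the main draw over global attention for long inputs.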
