GitHub topics: conditional-computation
Adlith/MoE-Jetpack
[NeurIPS 24] MoE Jetpack: From Dense Checkpoints to Adaptive Mixture of Experts for Vision Tasks
Language: Python - Size: 32.3 MB - Last synced at: 3 months ago - Pushed at: 12 months ago - Stars: 130 - Forks: 1
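The entry above is about turning a pretrained dense checkpoint into a mixture of experts. As a rough illustration of the general "dense FFN weights → expert initialization" idea (not MoE Jetpack's actual checkpoint-recycling procedure), a minimal sketch might look like this; all names are hypothetical:

```python
# Hypothetical sketch: seed a mixture of experts from a dense FFN checkpoint.
# This only illustrates the generic "dense checkpoint -> experts" idea; it is
# NOT the MoE Jetpack method, which has its own recycling scheme.
import copy
import torch.nn as nn

def experts_from_dense_ffn(dense_ffn: nn.Sequential, num_experts: int) -> nn.ModuleList:
    # Each expert starts as a copy of the pretrained dense FFN, so routing can
    # specialize the copies during fine-tuning instead of training from scratch.
    return nn.ModuleList(copy.deepcopy(dense_ffn) for _ in range(num_experts))

dense_ffn = nn.Sequential(nn.Linear(768, 3072), nn.GELU(), nn.Linear(3072, 768))
experts = experts_from_dense_ffn(dense_ffn, num_experts=8)
```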
lucidrains/st-moe-pytorch
Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in PyTorch
Language: Python - Size: 178 KB - Last synced at: 6 months ago - Pushed at: over 1 year ago - Stars: 332 - Forks: 29
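ST-MoE pairs top-2 expert routing with an auxiliary router z-loss that penalizes large router logits for training stability. The sketch below shows that routing/z-loss idea in plain PyTorch; it is not the st-moe-pytorch API, and the function and variable names are illustrative:

```python
# Minimal sketch of ST-MoE-style top-2 routing with a router z-loss.
# Not the st-moe-pytorch API; names here are made up for illustration.
import torch

def top2_route(x, router_weight):
    logits = x @ router_weight                        # (tokens, num_experts)
    probs = logits.softmax(dim=-1)
    gate_vals, expert_idx = probs.topk(2, dim=-1)     # top-2 experts per token
    # Router z-loss: mean squared logsumexp of the logits, as in the ST-MoE paper.
    z_loss = torch.logsumexp(logits, dim=-1).pow(2).mean()
    return gate_vals, expert_idx, z_loss

x = torch.randn(16, 512)                              # 16 tokens, model dim 512
router_weight = torch.randn(512, 8)                   # 8 experts
gates, idx, z_loss = top2_route(x, router_weight)
```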
antonio-f/mixture-of-experts-from-scratch
Mixture of Experts from scratch
Language: Jupyter Notebook - Size: 234 KB - Last synced at: 7 months ago - Pushed at: over 1 year ago - Stars: 6 - Forks: 1
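For readers new to the topic, a from-scratch MoE layer reduces to a gating network plus a set of expert MLPs whose outputs are combined per token. The sketch below evaluates every expert densely for clarity (real sparse MoE layers dispatch each token only to its selected experts); it is illustrative and not taken from the notebook above:

```python
# Minimal from-scratch MoE layer with softmax gating, evaluated densely
# (every expert sees every token) for clarity. Illustrative only.
import torch
import torch.nn as nn

class TinyMoE(nn.Module):
    def __init__(self, dim, hidden, num_experts):
        super().__init__()
        self.gate = nn.Linear(dim, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, dim))
            for _ in range(num_experts)
        )

    def forward(self, x):                        # x: (tokens, dim)
        weights = self.gate(x).softmax(dim=-1)   # (tokens, num_experts)
        outs = torch.stack([e(x) for e in self.experts], dim=-1)  # (tokens, dim, E)
        return torch.einsum("tde,te->td", outs, weights)

moe = TinyMoE(dim=64, hidden=128, num_experts=4)
y = moe(torch.randn(10, 64))
```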
Mixture-AI/Mixture-of-Depths
Unofficial implementation of Google DeepMind's Mixture-of-Depths.
Language: Python - Size: 15.6 KB - Last synced at: over 1 year ago - Pushed at: over 1 year ago - Stars: 1 - Forks: 1
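Mixture-of-Depths applies conditional computation along the depth axis: at each layer, a router selects a fixed-capacity subset of tokens to receive the block's compute, and the remaining tokens pass through unchanged via the residual path. A minimal sketch of that routing pattern, not tied to the repository's implementation, might look like this:

```python
# Sketch of Mixture-of-Depths-style routing: each layer processes only the
# top-k tokens (by router score); the rest skip the block via the residual.
# Illustrative only.
import torch
import torch.nn as nn

def mod_layer(x, block: nn.Module, router: nn.Linear, capacity: int):
    # x: (batch, seq, dim)
    scores = router(x).squeeze(-1)                  # (batch, seq)
    topk = scores.topk(capacity, dim=-1).indices    # tokens that get compute
    out = x.clone()
    for b in range(x.size(0)):
        sel = topk[b]
        # Scale by the router score so the selection stays differentiable.
        out[b, sel] = x[b, sel] + torch.sigmoid(scores[b, sel]).unsqueeze(-1) * block(x[b, sel])
    return out

dim = 32
block = nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
router = nn.Linear(dim, 1)
y = mod_layer(torch.randn(2, 16, dim), block, router, capacity=8)
```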
thomasverelst/dynconv
Code for Dynamic Convolutions: Exploiting Spatial Sparsity for Faster Inference (CVPR 2020)
Language: Cuda - Size: 105 MB - Last synced at: about 2 years ago - Pushed at: almost 4 years ago - Stars: 120 - Forks: 14
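The core idea behind dynamic convolutions is that a cheap gating branch predicts a spatial mask, and the expensive convolution only needs to be computed where the mask is active. The sketch below emulates this by masking a dense convolution output; the actual speedup in dynconv comes from custom CUDA kernels that skip the masked positions, and the module here is purely illustrative:

```python
# Sketch of the dynamic-convolution idea: a cheap gating branch predicts a
# spatial mask, and the residual conv only contributes where the mask is 1.
# Sparsity is emulated by masking a dense conv output; illustrative only.
import torch
import torch.nn as nn

class MaskedConvBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.gate = nn.Conv2d(channels, 1, kernel_size=1)   # cheap mask predictor
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x):
        mask = (torch.sigmoid(self.gate(x)) > 0.5).float()  # (B, 1, H, W) hard mask
        return x + mask * self.conv(x)

block = MaskedConvBlock(channels=16)
y = block(torch.randn(1, 16, 32, 32))
```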