Topic: "sparse-moe"
SuperBruceJia/Awesome-Mixture-of-Experts
Awesome Mixture of Experts (MoE): A Curated List of Mixture of Experts (MoE) and Mixture of Multimodal Experts (MoME)
Size: 438 KB - Last synced: about 6 hours ago - Pushed: 4 months ago - Stars: 27 - Forks: 3

Related Topics
artificial-intelligence (1)
expert-network (1)
foundation-models (1)
gating-network (1)
large-language-model (1)
large-language-models (1)
large-vision-language-models (1)
llms (1)
llms-benchmarking (1)
llms-reasoning (1)
load-balancing (1)
mixture-of-multimodal-experts (1)
mixture-of-experts (1)
moe (1)
mome (1)
multimodal-learning (1)
sparse (1)
sparse-mixture-of-experts (1)
sparse-mixture-of-multimodal-experts (1)
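
Several of the tags above (gating-network, sparse-mixture-of-experts, load-balancing) name parts of a single routing pattern. As orientation only, here is a minimal PyTorch sketch of a sparse top-k MoE layer; it is illustrative, not code from the curated repositories, and every name in it (SparseMoE, d_model, n_experts, k) is hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    """Sparse mixture-of-experts layer: a gating network scores all experts,
    each token is routed to its top-k experts, and the expert outputs are
    combined with the renormalized gate weights."""

    def __init__(self, d_model: int, n_experts: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(d_model, n_experts)  # the "gating network"
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, 4 * d_model),
                nn.GELU(),
                nn.Linear(4 * d_model, d_model),
            )
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model)
        scores = self.gate(x)                       # (tokens, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)  # keep only the top-k experts
        weights = F.softmax(weights, dim=-1)        # renormalize over the top-k
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            # Gather the tokens (and their top-k slot) that chose expert e.
            token_ids, slot = (idx == e).nonzero(as_tuple=True)
            if token_ids.numel() == 0:
                continue                            # this expert received no tokens
            # topk returns distinct indices per token, so token_ids has no
            # duplicates here and the indexed add is safe.
            out[token_ids] += weights[token_ids, slot].unsqueeze(-1) * expert(x[token_ids])
        return out

x = torch.randn(16, 64)                 # 16 tokens with model width 64
print(SparseMoE(d_model=64)(x).shape)   # torch.Size([16, 64])
```

Production systems typically add an auxiliary load-balancing loss and per-expert capacity limits so tokens spread evenly across experts (the "load-balancing" tag); the sketch omits both for brevity.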