GitHub topics: mixture-of-models
kyegomez/SwitchTransformers
Implementation of Switch Transformers from the paper: "Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity"
Language: Python - Size: 2.42 MB - Last synced at: 9 days ago - Pushed at: 2 months ago - Stars: 104 - Forks: 13
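
As context for this entry, the core idea of the Switch Transformer is top-1 ("switch") routing: a learned router sends each token to exactly one expert feed-forward network and scales the expert's output by the routing probability. Below is a minimal sketch of that mechanism in plain PyTorch; the class and parameter names (SwitchFFN, n_experts, etc.) are illustrative and are not the kyegomez/SwitchTransformers API.

```python
# Minimal sketch of top-1 ("switch") expert routing, assuming PyTorch.
# Illustrative only -- not the kyegomez/SwitchTransformers API.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SwitchFFN(nn.Module):
    def __init__(self, dim: int, hidden_dim: int, n_experts: int):
        super().__init__()
        self.router = nn.Linear(dim, n_experts)  # routing logits per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, dim))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, dim) -> flatten tokens so each one is routed independently
        tokens = x.reshape(-1, x.shape[-1])
        probs = F.softmax(self.router(tokens), dim=-1)  # (num_tokens, n_experts)
        gate, expert_idx = probs.max(dim=-1)            # top-1 expert per token
        out = torch.zeros_like(tokens)
        for i, expert in enumerate(self.experts):
            mask = expert_idx == i
            if mask.any():
                # scale each expert output by its gate probability
                out[mask] = gate[mask].unsqueeze(-1) * expert(tokens[mask])
        return out.reshape_as(x)


if __name__ == "__main__":
    layer = SwitchFFN(dim=64, hidden_dim=256, n_experts=4)
    print(layer(torch.randn(2, 10, 64)).shape)  # torch.Size([2, 10, 64])
```

Only one expert runs per token, which is what lets the parameter count grow with the number of experts while keeping per-token compute roughly constant.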

VinsmokeSomya/Mixture-of-Idiots
🤪🧠💥 Mixture of Idiots (MoI): A Python project exploring 'Mixture of Models' (MOM) to solve complex problems by combining outputs from multiple LLMs (OpenAI, MistralAI, Gemini) using King, Duopoly, and Democracy architectures. Sometimes, a team of 'idiots' is surprisingly brilliant!
Language: Python - Size: 16.6 KB - Last synced at: about 1 month ago - Pushed at: about 1 month ago - Stars: 0 - Forks: 0
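
To make the listed architectures concrete, here is a provider-agnostic sketch of the "Democracy" (majority vote) and "King" (one model synthesizes the others' drafts) patterns the description mentions. The `LLM` callables are hypothetical stand-ins for real clients (OpenAI, MistralAI, Gemini); this is not the Mixture-of-Idiots API.

```python
# Sketch of "Democracy" and "King" mixture-of-models patterns, assuming each
# model is just a prompt -> completion callable. Not the Mixture-of-Idiots API.
from collections import Counter
from typing import Callable, List

LLM = Callable[[str], str]  # any function mapping a prompt to a completion


def democracy(models: List[LLM], prompt: str) -> str:
    """Ask every model independently and return the most common answer."""
    answers = [m(prompt) for m in models]
    return Counter(answers).most_common(1)[0][0]


def king(workers: List[LLM], king_model: LLM, prompt: str) -> str:
    """Workers draft answers; the 'king' model synthesizes a final answer."""
    drafts = "\n\n".join(f"Answer {i + 1}: {m(prompt)}" for i, m in enumerate(workers))
    return king_model(
        f"Question: {prompt}\n\n{drafts}\n\nSynthesize the best final answer."
    )


if __name__ == "__main__":
    # Trivial stand-in "models" for demonstration only.
    fake_models = [lambda p: "42", lambda p: "42", lambda p: "41"]
    print(democracy(fake_models, "What is 6 * 7?"))  # -> "42"
```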

andriygav/MixtureLib
Implementations of mixture models for different tasks.
Language: Python - Size: 9.18 MB - Last synced at: about 1 month ago - Pushed at: 8 months ago - Stars: 2 - Forks: 0
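
For readers new to the topic, a "mixture" in this sense is a model whose predictions come from several components weighted by learned responsibilities. As a generic illustration (using scikit-learn, not MixtureLib's own API), here is a two-component Gaussian mixture fit to synthetic data:

```python
# Generic mixture-model example with scikit-learn; does not use MixtureLib.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic data drawn from two well-separated Gaussian clusters.
data = np.vstack([
    rng.normal(loc=-2.0, scale=0.5, size=(200, 2)),
    rng.normal(loc=+2.0, scale=0.5, size=(200, 2)),
])

gmm = GaussianMixture(n_components=2, random_state=0).fit(data)
labels = gmm.predict(data)                  # hard cluster assignments
responsibilities = gmm.predict_proba(data)  # soft (posterior) assignments
print(gmm.means_.round(2))                  # recovered component means
```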
