An open API service providing repository metadata for many open source software ecosystems.

GitHub topics: moe

bcgov/reserve-rec-api

For the Parks and Recreation Digital Transformation project.

Language: JavaScript - Size: 1000 KB - Last synced at: about 23 hours ago - Pushed at: 1 day ago - Stars: 1 - Forks: 5

bcgov/reserve-rec-public

For the Parks and Recreation Digital Transformation project.

Language: TypeScript - Size: 2.51 MB - Last synced at: 2 days ago - Pushed at: 2 days ago - Stars: 1 - Forks: 4

MoonshotAI/MoBA

MoBA: Mixture of Block Attention for Long-Context LLMs

Language: Python - Size: 2.4 MB - Last synced at: 2 days ago - Pushed at: about 2 months ago - Stars: 1,775 - Forks: 105

PKU-YuanGroup/MoE-LLaVA

Mixture-of-Experts for Large Vision-Language Models

Language: Python - Size: 16.5 MB - Last synced at: 2 days ago - Pushed at: 6 months ago - Stars: 2,162 - Forks: 133

microsoft/Tutel

Tutel MoE: an optimized Mixture-of-Experts library; supports DeepSeek FP8/FP4

Language: C - Size: 1.17 MB - Last synced at: 1 day ago - Pushed at: 3 days ago - Stars: 823 - Forks: 97

czy0729/Bangumi

An unofficial, UI-first https://bgm.tv app client for Android and iOS, built with React Native. An ad-free, hobby-driven, non-commercial, Douban-style anime-tracking third-party client for bgm.tv, dedicated to ACG. Redesigned for mobile, with many built-in enhancements that are hard to implement on the web version, plus extensive customization options. Currently supports iOS / Android / WSA, mobile / basic tablet layouts, light / dark themes, and the mobile web.

Language: TypeScript - Size: 90.5 MB - Last synced at: 3 days ago - Pushed at: 3 days ago - Stars: 4,340 - Forks: 145

Fsoft-AIC/CompeteSMoE

CompeteSMoE - Statistically Guaranteed Mixture of Experts Training via Competition

Language: Python - Size: 0 Bytes - Last synced at: 3 days ago - Pushed at: 3 days ago - Stars: 1 - Forks: 0

sgl-project/sglang

SGLang is a fast serving framework for large language models and vision language models.

Language: Python - Size: 19.3 MB - Last synced at: 4 days ago - Pushed at: 4 days ago - Stars: 14,463 - Forks: 1,783

hiyouga/LLaMA-Factory

Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)

Language: Python - Size: 47.1 MB - Last synced at: 4 days ago - Pushed at: 4 days ago - Stars: 49,191 - Forks: 5,987

OpenSparseLLMs/LLaMA-MoE-v2

🚀 LLaMA-MoE v2: Exploring Sparsity of LLaMA from Perspective of Mixture-of-Experts with Post-Training

Language: Python - Size: 2.21 MB - Last synced at: 4 days ago - Pushed at: 6 months ago - Stars: 84 - Forks: 12

mindspore-courses/step_into_llm

MindSpore online courses: Step into LLM

Language: Jupyter Notebook - Size: 246 MB - Last synced at: 1 day ago - Pushed at: 5 months ago - Stars: 467 - Forks: 117

kyegomez/LIMoE

Implementation of the "the first large-scale multimodal mixture of experts models." from the paper: "Multimodal Contrastive Learning with LIMoE: the Language-Image Mixture of Experts"

Language: Python - Size: 2.17 MB - Last synced at: 5 days ago - Pushed at: about 2 months ago - Stars: 28 - Forks: 2

kyegomez/SwitchTransformers

Implementation of Switch Transformers from the paper: "Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity"

Language: Python - Size: 2.42 MB - Last synced at: 1 day ago - Pushed at: about 2 months ago - Stars: 101 - Forks: 12
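
As context for the entry above: switch routing sends each token to a single expert and adds an auxiliary loss that pushes tokens to spread evenly across experts. A minimal PyTorch sketch of that idea, assuming the standard formulation from the paper (hypothetical code, not taken from this repo):

```python
# Minimal top-1 ("switch") routing with the load-balancing auxiliary loss,
# as described in the Switch Transformers paper. Illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SwitchRouter(nn.Module):
    def __init__(self, d_model: int, n_experts: int):
        super().__init__()
        self.gate = nn.Linear(d_model, n_experts)
        self.n_experts = n_experts

    def forward(self, x: torch.Tensor):
        # x: (tokens, d_model)
        probs = F.softmax(self.gate(x), dim=-1)       # router probabilities
        expert = probs.argmax(dim=-1)                 # one expert per token
        # f_i: fraction of tokens routed to expert i; p_i: mean router prob.
        f = F.one_hot(expert, self.n_experts).float().mean(dim=0)
        p = probs.mean(dim=0)
        aux_loss = self.n_experts * torch.sum(f * p)  # minimized when uniform
        return expert, probs, aux_loss
```

The auxiliary loss bottoms out at 1 when routing is uniform, which is why it can simply be added to the task loss with a small coefficient.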

kyegomez/MoE-Mamba

Implementation of MoE-Mamba from the paper "MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts," in PyTorch and Zeta

Language: Python - Size: 2.17 MB - Last synced at: 6 days ago - Pushed at: about 2 months ago - Stars: 103 - Forks: 5

cvyl/short.moe

Short.moe is a free URL shortener that turns long URLs into short, manageable links.

Language: TypeScript - Size: 554 KB - Last synced at: 4 days ago - Pushed at: 4 days ago - Stars: 4 - Forks: 1

bcgov/nr-epd-digital-services

EPD Suite of Applications

Language: TypeScript - Size: 21.4 MB - Last synced at: 3 days ago - Pushed at: 3 days ago - Stars: 4 - Forks: 2

ymcui/Chinese-Mixtral

Chinese Mixtral mixture-of-experts large language models (Chinese Mixtral MoE LLMs)

Language: Python - Size: 519 KB - Last synced at: 6 days ago - Pushed at: about 1 year ago - Stars: 604 - Forks: 44

0-5788719150923125/praxis

as above, so below

Language: Python - Size: 7.16 MB - Last synced at: 9 days ago - Pushed at: 9 days ago - Stars: 13 - Forks: 1

inferflow/inferflow

Inferflow is an efficient and highly configurable inference engine for large language models (LLMs).

Language: C++ - Size: 1.89 MB - Last synced at: 4 days ago - Pushed at: about 1 year ago - Stars: 242 - Forks: 25

bcgov/reserve-rec-admin

For the Parks and Recreation Digital Transformation project.

Language: TypeScript - Size: 2.49 MB - Last synced at: 9 days ago - Pushed at: 9 days ago - Stars: 1 - Forks: 4

open-compass/MixtralKit

A toolkit for inference and evaluation of 'mixtral-8x7b-32kseqlen' from Mistral AI

Language: Python - Size: 79.1 KB - Last synced at: 7 days ago - Pushed at: over 1 year ago - Stars: 767 - Forks: 79

pjlab-sys4nlp/llama-moe

⛷️ LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training (EMNLP 2024)

Language: Python - Size: 1.69 MB - Last synced at: 1 day ago - Pushed at: 6 months ago - Stars: 961 - Forks: 58

inclusionAI/Ling

Ling is an MoE LLM provided and open-sourced by InclusionAI.

Language: Python - Size: 3.36 MB - Last synced at: 9 days ago - Pushed at: 9 days ago - Stars: 152 - Forks: 15

Moebits/Moepictures

Moepictures is an anime image board organized by tags.

Language: TypeScript - Size: 283 MB - Last synced at: 9 days ago - Pushed at: 9 days ago - Stars: 26 - Forks: 0

bcgov/nr-site-registry

NR EPD Site Registry

Language: TypeScript - Size: 98.7 MB - Last synced at: 1 day ago - Pushed at: 1 day ago - Stars: 1 - Forks: 0

Fuwn/mayu

⭐ Moe-Counter Compatible Website Hit Counter Written in Gleam

Language: HTML - Size: 23.4 MB - Last synced at: 7 days ago - Pushed at: 11 days ago - Stars: 12 - Forks: 0

LINs-lab/DynMoE

[ICLR 2025] Dynamic Mixture of Experts: An Auto-Tuning Approach for Efficient Transformer Models

Language: Python - Size: 57.3 MB - Last synced at: 12 days ago - Pushed at: 4 months ago - Stars: 89 - Forks: 11

Nekosia-API/nekosia.js

A simple wrapper for the Nekosia API that provides easy access to random anime images. Enhance your projects with the magic of anime and a touch of feline charm meow~~! Discover why switching to Nekosia is the purrfect choice!

Language: JavaScript - Size: 140 KB - Last synced at: 16 days ago - Pushed at: 16 days ago - Stars: 6 - Forks: 0

SkyworkAI/MoH

MoH: Multi-Head Attention as Mixture-of-Head Attention

Language: Python - Size: 5.26 MB - Last synced at: 6 days ago - Pushed at: 7 months ago - Stars: 240 - Forks: 10

LISTEN-moe/android-app

Official LISTEN.moe Android app

Language: Kotlin - Size: 33.4 MB - Last synced at: 3 days ago - Pushed at: 3 days ago - Stars: 263 - Forks: 25

Fsoft-AIC/LibMoE

LibMoE: A Library for Comprehensive Benchmarking of Mixture of Experts in Large Language Models

Language: Python - Size: 7.24 MB - Last synced at: 21 days ago - Pushed at: 21 days ago - Stars: 37 - Forks: 0

SuperBruceJia/Awesome-Mixture-of-Experts

Awesome Mixture of Experts (MoE): A Curated List of Mixture of Experts (MoE) and Mixture of Multimodal Experts (MoME)

Size: 438 KB - Last synced at: 22 days ago - Pushed at: 4 months ago - Stars: 27 - Forks: 3

bcgov/nr-epd-organics-info

Source Code and Artifacts Related to Organics Info

Language: TypeScript - Size: 16 MB - Last synced at: 13 days ago - Pushed at: 13 days ago - Stars: 4 - Forks: 1

ai-bot-pro/baby-llm

Language: Python - Size: 11.6 MB - Last synced at: 1 day ago - Pushed at: 4 months ago - Stars: 4 - Forks: 0

ZhenbangDu/DSD

[IEEE TAI] Mixture-of-Experts for Open Set Domain Adaptation: A Dual-Space Detection Approach

Language: Python - Size: 643 KB - Last synced at: 26 days ago - Pushed at: 26 days ago - Stars: 5 - Forks: 0

haxpor/blockbunny

libGDX-based game for Android, iOS, and PC, following the tutorial by ForeignGuyMike on his YouTube channel. Read more in README.md.

Language: Kotlin - Size: 3.22 MB - Last synced at: 9 days ago - Pushed at: almost 8 years ago - Stars: 28 - Forks: 14

IBM/ModuleFormer

ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based Language Models (MoLM) ranging in scale from 4 billion to 8 billion parameters.

Language: Python - Size: 71.3 KB - Last synced at: 16 days ago - Pushed at: about 1 year ago - Stars: 220 - Forks: 11
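
For the stick-breaking idea mentioned above: weights are built by repeatedly breaking off a sigmoid-sized fraction of whatever stick remains, so earlier positions can claim mass before later ones. A generic sketch of that construction (illustrative only; ModuleFormer's actual stick-breaking attention is more involved than this):

```python
# Generic stick-breaking weights: w_i = sigmoid(z_i) * prod_{j<i}(1 - sigmoid(z_j)).
# Illustrative only; not ModuleFormer's attention kernel.
import torch

def stick_breaking(logits: torch.Tensor) -> torch.Tensor:
    betas = torch.sigmoid(logits)                    # fraction taken at step i
    remaining = torch.cumprod(1.0 - betas, dim=-1)   # stick left after step i
    # Shift right so position i sees the stick left *before* its break.
    remaining = torch.cat(
        [torch.ones_like(remaining[..., :1]), remaining[..., :-1]], dim=-1)
    return betas * remaining                         # weights sum to <= 1
```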

xrsrke/pipegoose

Large-scale 4D parallelism pre-training for 🤗 transformers in Mixture of Experts *(still a work in progress)*

Language: Python - Size: 1.26 MB - Last synced at: 27 days ago - Pushed at: over 1 year ago - Stars: 82 - Forks: 18

mifjpn/Moebuntu-kantan-Setup2

Moebuntu-SetupHelperScript2 Japanese version

Language: Shell - Size: 318 KB - Last synced at: about 1 month ago - Pushed at: about 1 month ago - Stars: 15 - Forks: 1

mifjpn/Moebuntu-SetupHelperScript2

Theme changer that makes Ubuntu 24.04 kawaii (moe)

Language: Shell - Size: 270 KB - Last synced at: about 1 month ago - Pushed at: about 1 month ago - Stars: 45 - Forks: 0

FareedKhan-dev/train-llama4

Building LLaMA 4 MoE from Scratch

Language: Jupyter Notebook - Size: 7.8 MB - Last synced at: about 1 month ago - Pushed at: about 1 month ago - Stars: 2 - Forks: 0

VITA-Group/Random-MoE-as-Dropout

[ICLR 2023] "Sparse MoE as the New Dropout: Scaling Dense and Self-Slimmable Transformers" by Tianlong Chen*, Zhenyu Zhang*, Ajay Jaiswal, Shiwei Liu, Zhangyang Wang

Language: Python - Size: 686 KB - Last synced at: about 1 month ago - Pushed at: about 2 years ago - Stars: 50 - Forks: 2

ye-lili/hainan-acgn-culture

The rise and development of ACGN culture in Hainan: From underground emergence to a new chapter in the Free Trade Port

Size: 5.86 KB - Last synced at: about 1 month ago - Pushed at: about 1 month ago - Stars: 0 - Forks: 0

LemonAttn/mini_transformer

A minimal Transformer architecture for quickly building today's various Transformer-based models

Language: Python - Size: 389 KB - Last synced at: 11 days ago - Pushed at: about 1 month ago - Stars: 0 - Forks: 0

james-oldfield/muMoE

[NeurIPS'24] Multilinear Mixture of Experts: Scalable Expert Specialization through Factorization

Language: Python - Size: 2.95 MB - Last synced at: about 1 month ago - Pushed at: 8 months ago - Stars: 30 - Forks: 1

CheapNightbot/moe

The only Discord bot you need (I guess...) ~ moe moe kyun ♡ (⸝⸝> ᴗ•⸝⸝)

Language: Python - Size: 14.3 MB - Last synced at: about 1 month ago - Pushed at: about 1 month ago - Stars: 0 - Forks: 0

bcgov/nr-soils-relocation

Language: Python - Size: 702 KB - Last synced at: about 2 months ago - Pushed at: about 2 months ago - Stars: 0 - Forks: 2

Agora-Lab-AI/HydraNet

HydraNet is a state-of-the-art transformer architecture that combines Multi-Query Attention (MQA), Mixture of Experts (MoE), and continuous learning capabilities.

Language: Shell - Size: 2.16 MB - Last synced at: about 1 month ago - Pushed at: about 1 month ago - Stars: 5 - Forks: 0
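
Of the three ingredients named above, multi-query attention is the most mechanical: every query head attends against a single shared key/value head, shrinking the KV cache by a factor of the head count. A compact sketch under that standard definition (hypothetical code, not HydraNet's):

```python
# Multi-query attention (MQA): n_heads query heads, one shared K/V head.
# Illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiQueryAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        self.q = nn.Linear(d_model, d_model)
        self.kv = nn.Linear(d_model, 2 * self.d_head)  # single K/V head
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, _ = x.shape
        q = self.q(x).view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        k, v = self.kv(x).chunk(2, dim=-1)             # (b, t, d_head) each
        k, v = k.unsqueeze(1), v.unsqueeze(1)          # broadcast over heads
        att = F.softmax(q @ k.transpose(-2, -1) / self.d_head ** 0.5, dim=-1)
        y = att @ v                                    # (b, n_heads, t, d_head)
        return self.out(y.transpose(1, 2).reshape(b, t, -1))
```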

louisbrulenaudet/mergeKit

Tools for merging pretrained large language models and creating Mixture of Experts (MoE) models from open-source models.

Language: Jupyter Notebook - Size: 13.7 KB - Last synced at: about 1 month ago - Pushed at: over 1 year ago - Stars: 8 - Forks: 0

kyegomez/MHMoE

Community Implementation of the paper: "Multi-Head Mixture-of-Experts" In PyTorch

Language: Python - Size: 2.16 MB - Last synced at: 16 days ago - Pushed at: about 2 months ago - Stars: 24 - Forks: 4

kokororin/pixiv.moe

😘 A Pinterest-style layout site that shows illustrations from pixiv.net, ordered by popularity.

Language: TypeScript - Size: 16.6 MB - Last synced at: 5 days ago - Pushed at: about 2 years ago - Stars: 364 - Forks: 43

sefinek/moecounter.js

Effective and efficient moe counters for your projects, designed to display a wide range of statistics for your website and more!

Language: JavaScript - Size: 775 KB - Last synced at: 3 days ago - Pushed at: 7 months ago - Stars: 8 - Forks: 1

facebookresearch/AdaTT

Open-source PyTorch library for the paper "AdaTT: Adaptive Task-to-Task Fusion Network for Multitask Learning in Recommendations"

Language: Python - Size: 18.6 KB - Last synced at: about 2 months ago - Pushed at: 10 months ago - Stars: 48 - Forks: 8

simplifine-llm/Simplifine

🚀 Easy, open-source LLM finetuning with one-line commands, seamless cloud integration, and popular optimization frameworks. ✨

Language: Python - Size: 844 KB - Last synced at: 16 days ago - Pushed at: 9 months ago - Stars: 90 - Forks: 4

whucs21Mzy/Model-Hemorrhage

Model Hemorrhage and the Robustness Limits of Large Language Models: A Perspective

Size: 775 KB - Last synced at: about 2 months ago - Pushed at: about 2 months ago - Stars: 1 - Forks: 0

rabiloo/llm-finetuning

Sample for Fine-Tuning LLMs & VLMs

Language: Python - Size: 274 KB - Last synced at: about 2 months ago - Pushed at: about 2 months ago - Stars: 1 - Forks: 2

dannyxiaocn/awesome-moe

A repo aggregating MoE papers and systems

Size: 474 KB - Last synced at: 15 days ago - Pushed at: over 3 years ago - Stars: 7 - Forks: 2

kravetsone/enkaNetwork

Node.js enka.network API wrapper, written in TypeScript, that provides localization, caching, and convenience.

Language: TypeScript - Size: 1.47 MB - Last synced at: 10 days ago - Pushed at: about 2 months ago - Stars: 18 - Forks: 5

sail-sg/Adan

Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models

Language: Python - Size: 1.3 MB - Last synced at: about 2 months ago - Pushed at: 11 months ago - Stars: 784 - Forks: 67

mifjpn/Moebuntu-kawaiiUbuntu-ToysOriginal

Moebuntu-kawaiiUbuntu-ToysOriginal

Size: 103 MB - Last synced at: about 2 months ago - Pushed at: about 2 months ago - Stars: 5 - Forks: 0

JunweiZheng93/MATERobot

Official repository for the paper "MATERobot: Material Recognition in Wearable Robotics for People with Visual Impairments" at ICRA 2024; Best Paper on Human-Robot Interaction finalist

Language: Python - Size: 302 KB - Last synced at: about 2 months ago - Pushed at: about 2 months ago - Stars: 13 - Forks: 0

OpenSparseLLMs/CLIP-MoE

CLIP-MoE: Mixture of Experts for CLIP

Language: Python - Size: 2.35 MB - Last synced at: about 2 months ago - Pushed at: 8 months ago - Stars: 29 - Forks: 0

shalldie/chuncai

A lovely page wizard whose job is acting cute (selling moe).

Language: TypeScript - Size: 236 KB - Last synced at: 8 days ago - Pushed at: almost 7 years ago - Stars: 112 - Forks: 34

AdamG012/moe-paper-models

A summary of MoE experimental setups across a number of different papers.

Size: 10.7 KB - Last synced at: 17 days ago - Pushed at: over 2 years ago - Stars: 16 - Forks: 1

naidezhujimo/YingHub-v2-A-Sparse-MoE-Language-Model

YingHub-v2 is an advanced language model built on a sparse Mixture of Experts (MoE) architecture. It leverages dynamic routing and expert load balancing, and incorporates state-of-the-art training and optimization strategies.

Language: Python - Size: 768 KB - Last synced at: about 2 months ago - Pushed at: 2 months ago - Stars: 1 - Forks: 0

sergree/awesome-visual-novels

🍙 A curated list of my favorite visual novels for Android

Size: 72.3 KB - Last synced at: 16 days ago - Pushed at: 5 months ago - Stars: 5 - Forks: 0

naidezhujimo/Sparse-MoE-Language-Model-v1

This repository contains an implementation of a Sparse Mixture of Experts (MoE) Language Model using PyTorch. The model is designed to handle large-scale text generation tasks efficiently by leveraging multiple expert networks and a routing mechanism to dynamically select the most relevant experts for each input.

Language: Python - Size: 869 KB - Last synced at: about 2 months ago - Pushed at: 2 months ago - Stars: 1 - Forks: 0
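
The combine step implied by the description above: once the router assigns gate weights, each token's output is the gate-weighted sum of its experts' outputs. A tiny dense sketch of just that step (illustrative only; real sparse implementations dispatch tokens to experts instead of running every expert on every token):

```python
# Gate-weighted combination of expert outputs, written densely for clarity.
# Illustrative only; not this repo's code.
import torch
import torch.nn as nn

n_tokens, d_model, n_experts = 8, 16, 4
experts = nn.ModuleList(nn.Linear(d_model, d_model) for _ in range(n_experts))
x = torch.randn(n_tokens, d_model)
gates = torch.softmax(torch.randn(n_tokens, n_experts), dim=-1)  # stand-in router

expert_out = torch.stack([e(x) for e in experts], dim=1)  # (tokens, experts, d)
y = torch.einsum("te,ted->td", gates, expert_out)         # mixture per token
```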

libgdx/gdx-pay

A libGDX cross-platform API for in-app purchasing.

Language: Java - Size: 15.8 MB - Last synced at: about 2 months ago - Pushed at: 5 months ago - Stars: 225 - Forks: 86

przemub/anime_quiz

Anime Themes Quiz for people with taste.

Language: Python - Size: 745 KB - Last synced at: 10 days ago - Pushed at: 3 months ago - Stars: 2 - Forks: 0

jezreelbarbosa/NekosiaAPI

A simple wrapper for the Nekosia API, written in Swift

Language: Swift - Size: 35.2 KB - Last synced at: 3 months ago - Pushed at: 3 months ago - Stars: 2 - Forks: 0

kyegomez/Mixture-of-MQA

An implementation of a Switch Transformer-style multi-query attention model

Language: Python - Size: 0 Bytes - Last synced at: 3 months ago - Pushed at: 3 months ago - Stars: 1 - Forks: 0

phanirithvij/twist.moe 📦

Batch download high-quality videos from https://twist.moe

Language: Python - Size: 23.1 MB - Last synced at: 7 days ago - Pushed at: over 1 year ago - Stars: 74 - Forks: 15

MoeFE/MoeUI

UI component library for Vue.js (Moe is Justice!!!)

Language: CSS - Size: 354 KB - Last synced at: 21 days ago - Pushed at: almost 6 years ago - Stars: 30 - Forks: 3

1834423612/Moe-counter-PHP

A cute web visitor counter, PHP + MySQL version

Language: PHP - Size: 637 KB - Last synced at: about 2 months ago - Pushed at: over 2 years ago - Stars: 17 - Forks: 3

tobychui/Weather-Pet-Display

A simple weather display with a cute interactive desktop pet (❛◡❛✿)

Language: C++ - Size: 55.1 MB - Last synced at: 29 days ago - Pushed at: almost 3 years ago - Stars: 14 - Forks: 1

lethanhvinh0604/Guess-Moe-Number

A simple game built with JavaScript

Language: JavaScript - Size: 23.7 MB - Last synced at: about 2 months ago - Pushed at: 6 months ago - Stars: 0 - Forks: 0

s-chh/PyTorch-Scratch-LLM

Simple, easy-to-understand PyTorch implementation of the large language models (LLMs) GPT and LLaMA from scratch, with detailed steps. Implemented: byte-pair tokenizer, Rotary Positional Embedding (RoPE), SwiGLU, RMSNorm, Mixture of Experts (MoE). Tested on a Taylor Swift song lyrics dataset.

Language: Python - Size: 58.6 KB - Last synced at: 6 months ago - Pushed at: 6 months ago - Stars: 2 - Forks: 0
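
Of the components the entry lists, RMSNorm is the smallest: it rescales each token by the root-mean-square of its features, dropping LayerNorm's mean-centering and bias. A minimal sketch of the standard formulation (not the repo's exact code):

```python
# RMSNorm: y = x / rms(x) * g, with rms(x) = sqrt(mean(x^2) + eps).
# Illustrative only.
import torch
import torch.nn as nn

class RMSNorm(nn.Module):
    def __init__(self, dim: int, eps: float = 1e-6):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(dim))
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        inv_rms = x.pow(2).mean(dim=-1, keepdim=True).add(self.eps).rsqrt()
        return x * inv_rms * self.weight   # normalize, then learned rescale
```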

Popcorn-moe/Web 📦

Popcorn.moe Web

Language: Vue - Size: 4.11 MB - Last synced at: 5 months ago - Pushed at: over 6 years ago - Stars: 13 - Forks: 4

ednialzavlare/MixKABRN

The repo for the MixKABRN neural network (Mixture of Kolmogorov-Arnold Bit Retentive Networks): an attempt at first adapting it for training on text, and later adjusting it for other modalities.

Language: Python - Size: 85.9 KB - Last synced at: 4 months ago - Pushed at: about 1 year ago - Stars: 4 - Forks: 0

calpa/atom-kancolle

Notifications using fleet girls' voices.

Language: JavaScript - Size: 17.6 KB - Last synced at: about 2 months ago - Pushed at: almost 3 years ago - Stars: 1 - Forks: 0

ZhenbangDu/Seizure_MoE

The official code for the paper 'Mixture of Experts for EEG-Based Seizure Subtype Classification'.

Language: Python - Size: 150 KB - Last synced at: 9 months ago - Pushed at: 9 months ago - Stars: 5 - Forks: 0

thc1006/youngfly

Youth Development Administration, Ministry of Education, Young Fly Global Action Plan team, 2022 (ROC year 111) - 語您童行 Tai-Gi | open-source data

Size: 43.1 MB - Last synced at: 10 months ago - Pushed at: 10 months ago - Stars: 1 - Forks: 0

AllenHW/JAX-MoE

A reference implementation of an MoE LLM in JAX and Haiku

Language: Python - Size: 29.3 KB - Last synced at: 11 months ago - Pushed at: 11 months ago - Stars: 1 - Forks: 0

yvonwin/qwen2.cpp

C++ implementations of Qwen2 and Llama 3

Language: C++ - Size: 2.04 MB - Last synced at: 12 months ago - Pushed at: 12 months ago - Stars: 18 - Forks: 0

luxizhizhong/mm

"💇" 🚡power by @Electron🍎

Language: CSS - Size: 5.04 MB - Last synced at: 12 months ago - Pushed at: about 6 years ago - Stars: 6 - Forks: 2

marisukukise/japReader

japReader is an app for breaking down Japanese sentences and tracking vocabulary progress

Language: JavaScript - Size: 47.3 MB - Last synced at: about 1 year ago - Pushed at: about 1 year ago - Stars: 53 - Forks: 3

yo-ru/moe-bot

Meet Moe, a Discord bot written in modern Python!

Language: Python - Size: 93.8 KB - Last synced at: about 1 year ago - Pushed at: about 2 years ago - Stars: 1 - Forks: 0

Nekos-moe/website

Language: Vue - Size: 1.21 MB - Last synced at: about 1 year ago - Pushed at: over 4 years ago - Stars: 36 - Forks: 14

Harry-Chen/InfMoE

Inference framework for MoE layers based on TensorRT with Python binding

Language: C++ - Size: 96.7 KB - Last synced at: 7 days ago - Pushed at: almost 4 years ago - Stars: 41 - Forks: 5

davidmrau/mixture-of-experts

PyTorch Re-Implementation of "The Sparsely-Gated Mixture-of-Experts Layer" by Noam Shazeer et al. https://arxiv.org/abs/1701.06538

Language: Python - Size: 73.2 KB - Last synced at: about 1 year ago - Pushed at: about 1 year ago - Stars: 818 - Forks: 88
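
The gating scheme from that paper in brief: add input-dependent Gaussian noise to the router logits, keep the top k experts per token, and softmax only over those k. A condensed sketch of noisy top-k gating under that formulation (the repo above is the fuller re-implementation):

```python
# Noisy top-k gating in the spirit of Shazeer et al. (2017). Illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoisyTopKGate(nn.Module):
    def __init__(self, d_model: int, n_experts: int, k: int = 2):
        super().__init__()
        self.w_gate = nn.Linear(d_model, n_experts, bias=False)
        self.w_noise = nn.Linear(d_model, n_experts, bias=False)
        self.k = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        clean = self.w_gate(x)
        noise_std = F.softplus(self.w_noise(x))          # input-dependent noise
        logits = clean + torch.randn_like(clean) * noise_std
        topk_val, topk_idx = logits.topk(self.k, dim=-1)
        # Softmax over the k kept experts; all other gates stay exactly zero.
        return torch.zeros_like(logits).scatter(
            -1, topk_idx, F.softmax(topk_val, dim=-1))
```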

whyakari/android_kernel_xiaomi_ginkgo

MoeKernel source for the Xiaomi Redmi Note 8/8T; source moved to https://github.com/MoeKernel/android_kernel_xiaomi_ginkgo

Language: C - Size: 1.16 GB - Last synced at: about 1 year ago - Pushed at: about 1 year ago - Stars: 7 - Forks: 2

mrzjy/expert_choice_visualization_for_mixtral

A simple project that helps visualize expert router choices during text generation

Language: Python - Size: 2.08 MB - Last synced at: about 1 year ago - Pushed at: about 1 year ago - Stars: 0 - Forks: 0

Nekos-moe/api

Language: JavaScript - Size: 602 KB - Last synced at: about 1 year ago - Pushed at: almost 3 years ago - Stars: 35 - Forks: 7

ralic/moe-calculator

Language: Java - Size: 91.5 MB - Last synced at: about 1 year ago - Pushed at: about 8 years ago - Stars: 0 - Forks: 0

louisbrulenaudet/mergekit-assistant

Mergekit Assistant is a cutting-edge toolkit designed for the seamless merging of pre-trained language models. It supports an array of models, offers various merging methods, and optimizes for low-resource environments with both CPU and GPU compatibility.

Size: 13.7 KB - Last synced at: 2 months ago - Pushed at: about 1 year ago - Stars: 1 - Forks: 0

sijinkim/MEPSNet_dev

Restoring Spatially-Heterogeneous Distortions using Mixture of Experts Network (ACCV 2020)

Language: Python - Size: 8.34 MB - Last synced at: about 1 year ago - Pushed at: about 1 year ago - Stars: 1 - Forks: 0

FernandoSilvaVera/Trackime

Web application for anime fans

Language: PHP - Size: 85.7 MB - Last synced at: about 1 year ago - Pushed at: about 1 year ago - Stars: 7 - Forks: 1

LISTEN-moe/v3-documentation 📦

Language: JavaScript - Size: 5.86 KB - Last synced at: about 1 year ago - Pushed at: over 7 years ago - Stars: 0 - Forks: 0

LISTEN-moe/browser-extension

Official LISTEN.moe browser extension

Language: JavaScript - Size: 2.52 MB - Last synced at: about 1 year ago - Pushed at: almost 2 years ago - Stars: 26 - Forks: 3

terru3/moe-kit

Language: Jupyter Notebook - Size: 16.2 MB - Last synced at: about 1 year ago - Pushed at: about 1 year ago - Stars: 2 - Forks: 0