An open API service providing repository metadata for many open source software ecosystems.

GitHub / CASE-Lab-UMD / Unified-MoE-Compression

The official implementation of the TMLR paper "Towards Efficient Mixture of Experts: A Holistic Study of Compression Techniques".

JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/CASE-Lab-UMD%2FUnified-MoE-Compression
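The endpoint above returns the repository's metadata as JSON. As a minimal sketch, the snippet below fetches it with Python's requests library; the field names queried ("full_name", "stargazers_count", "language") are assumptions about the response schema, so inspect the returned keys to confirm.

    import requests

    # Endpoint from the page above; %2F is the URL-encoded "/" in the
    # repository's full name.
    URL = (
        "http://repos.ecosyste.ms/api/v1/hosts/GitHub/"
        "repositories/CASE-Lab-UMD%2FUnified-MoE-Compression"
    )

    resp = requests.get(URL, timeout=10)
    resp.raise_for_status()
    repo = resp.json()

    # Field names here are assumed, not confirmed against the API docs;
    # print(sorted(repo.keys())) to see what is actually returned.
    print(repo.get("full_name"))
    print(repo.get("stargazers_count"))
    print(repo.get("language"))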

Stars: 67
Forks: 5
Open issues: 0

License: apache-2.0
Language: Python
Size: 47.1 MB
Dependencies parsed at: Pending

Created at: over 1 year ago
Updated at: 29 days ago
Pushed at: 3 months ago
Last synced at: 26 days ago

Topics: deep-learning, large-language-models, machine-learning, mixture-of-experts, model-compression, natural-language-processing
