GitHub / IntelLabs / distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/IntelLabs%2Fdistiller
PURL: pkg:github/IntelLabs/distiller
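The JSON API endpoint above serves this listing's metadata programmatically. Below is a minimal sketch of querying it with Python's `requests` package; the field names printed at the end are assumptions about the response schema, not guaranteed keys.

```python
# Sketch only: fetch the ecosyste.ms repository record for IntelLabs/distiller.
# Assumes the `requests` package is installed; field names such as
# "stargazers_count" are conventional guesses and may differ from the actual schema.
import requests

url = "http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/IntelLabs%2Fdistiller"
resp = requests.get(url, timeout=10)
resp.raise_for_status()
repo = resp.json()

# Print a few fields if present; .get() avoids KeyError if the schema differs.
for key in ("full_name", "stargazers_count", "forks_count", "language"):
    print(key, "=", repo.get(key))
```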
Stars: 4,398
Forks: 805
Open issues: 65
License: apache-2.0
Language: Jupyter Notebook
Size: 40.5 MB
Dependencies parsed at: Pending
Created at: over 7 years ago
Updated at: 30 days ago
Pushed at: over 2 years ago
Last synced at: 23 days ago
Topics: automl-for-compression, deep-neural-networks, distillation, early-exit, group-lasso, jupyter-notebook, network-compression, onnx, pruning, pruning-structures, pytorch, quantization, regularization, truncated-svd
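The topics above reflect the compression techniques the package covers (pruning, quantization, knowledge distillation, group-lasso regularization, early exit, truncated SVD). As a rough illustration, here is a minimal sketch of wiring Distiller's compression scheduler into a PyTorch training loop, following the pattern in the project's documentation; the toy model, the `schedule.yaml` file, and the exact callback signatures are assumptions and may differ across Distiller releases.

```python
# Sketch only: assumes Distiller is installed and that a YAML compression schedule
# (pruners/quantizers/regularizers plus their policies) exists at 'schedule.yaml'.
# Callback names follow the documented CompressionScheduler pattern but may vary
# between releases.
import torch
import torch.nn as nn
import distiller

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
criterion = nn.CrossEntropyLoss()

# Toy data so the loop is self-contained.
train_loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(torch.randn(512, 784), torch.randint(0, 10, (512,))),
    batch_size=64)

# Build a CompressionScheduler from the YAML schedule.
compression_scheduler = distiller.file_config(model, optimizer, 'schedule.yaml')

steps_per_epoch = len(train_loader)
for epoch in range(10):
    compression_scheduler.on_epoch_begin(epoch)
    for step, (inputs, targets) in enumerate(train_loader):
        compression_scheduler.on_minibatch_begin(epoch, step, steps_per_epoch, optimizer)
        loss = criterion(model(inputs), targets)
        # Active policies may add their own terms (e.g. a group-lasso regularizer) to the loss.
        loss = compression_scheduler.before_backward_pass(
            epoch, step, steps_per_epoch, loss, optimizer=optimizer)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        # End-of-step hook; in the documented flow this is where pruning masks are re-applied.
        compression_scheduler.on_minibatch_end(epoch, step, steps_per_epoch, optimizer)
    compression_scheduler.on_epoch_end(epoch, optimizer)
```

In this pattern the YAML schedule, not the training script, decides which layers are pruned or quantized and when, which is why the loop itself only forwards epoch and step indices to the scheduler.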