Ecosyste.ms: Repos

An open API service providing repository metadata for many open source software ecosystems.

GitHub topics: distillation

hung20gg/kg_llm

Using Knowledge Graph to Query Resume

Language: Jupyter Notebook - Size: 201 KB - Last synced: about 13 hours ago - Pushed: about 15 hours ago - Stars: 1 - Forks: 0

larry-athey/rpi-smart-still

Raspberry Pi and Arduino/ESP32 powered smart still controller system. Designed around the Still Spirits T-500 column and boiler, but can easily be added to any other gas or electric still with a dephlegmator.

Language: PHP - Size: 19 MB - Last synced: 11 days ago - Pushed: 12 days ago - Stars: 12 - Forks: 0

gojasper/flash-diffusion

Official implementation of ⚡ Flash Diffusion ⚡: Accelerating Any Conditional Diffusion Model for Few Steps Image Generation

Language: Python - Size: 18.4 MB - Last synced: about 20 hours ago - Pushed: 1 day ago - Stars: 114 - Forks: 7

K-bNd/DINOv1_implem

DINOv1 implementation in PyTorch

Language: Python - Size: 156 MB - Last synced: about 13 hours ago - Pushed: 1 day ago - Stars: 0 - Forks: 0

airaria/TextBrewer

A PyTorch-based knowledge distillation toolkit for natural language processing

Language: Python - Size: 7.54 MB - Last synced: 1 day ago - Pushed: about 1 year ago - Stars: 1,565 - Forks: 235
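TextBrewer packages many distillation strategies, but the core idea of knowledge distillation is training the student to match the teacher's temperature-softened output distribution. A minimal NumPy sketch of that soft-target loss (Hinton et al.'s formulation, not TextBrewer's API):

```python
import numpy as np

def softmax(z, T=1.0):
    z = np.asarray(z, dtype=float) / T
    z = z - z.max()                   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Soft-target loss: KL(teacher_T || student_T), scaled by T^2 so
    gradients keep a comparable magnitude to the hard-label loss."""
    p = softmax(teacher_logits, T)    # softened teacher distribution
    q = softmax(student_logits, T)    # softened student distribution
    return (T ** 2) * float(np.sum(p * (np.log(p) - np.log(q))))

teacher = [2.0, 0.5, -1.0]
print(distillation_loss(teacher, teacher))          # → 0.0 (perfect match)
print(distillation_loss([0.0, 0.0, 0.0], teacher) > 0)  # True
```

In practice this term is combined with the usual cross-entropy on ground-truth labels, weighted by a mixing coefficient.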

MCG-NJU/AMD

[CVPR 2024] Asymmetric Masked Distillation for Pre-Training Small Foundation Models

Language: Python - Size: 961 KB - Last synced: 5 days ago - Pushed: 5 days ago - Stars: 7 - Forks: 1

JuoTungChen/Adversarial_attacks_DCNN

This repository contains the implementation of three adversarial example attacks (FGSM, noise, and semantic attacks) and a defensive distillation approach to defend against the FGSM attack.

Language: Python - Size: 43.6 MB - Last synced: 5 days ago - Pushed: over 1 year ago - Stars: 1 - Forks: 0
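For reference, FGSM (one of the attacks implemented above) perturbs the input by a single signed-gradient step of size epsilon. A toy NumPy sketch using a logistic model whose input gradient is analytic, so no autograd is needed; this is an illustration, not the repository's code:

```python
import numpy as np

def fgsm_perturb(x, grad, eps=0.1):
    """Fast Gradient Sign Method: one step in the direction that
    increases the loss, bounded in L-infinity norm by eps."""
    return x + eps * np.sign(grad)

# Toy logistic model p = sigmoid(w.x), true label y = 1.
# For cross-entropy loss, dL/dx = (p - y) * w (analytic gradient).
w = np.array([1.0, -2.0, 0.5])
x = np.array([0.3, 0.1, 0.4])
y = 1.0
p = 1.0 / (1.0 + np.exp(-w @ x))
grad_x = (p - y) * w

x_adv = fgsm_perturb(x, grad_x, eps=0.1)
p_adv = 1.0 / (1.0 + np.exp(-w @ x_adv))
print(p_adv < p)  # True: the attack lowers confidence in the true class
```

Defensive distillation counters this by training the deployed network on temperature-softened outputs of a first network, which smooths the loss surface the attacker's gradient step relies on.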

qiangsiwei/bert_distill

BERT distillation (distillation experiments based on BERT)

Language: Python - Size: 28.9 MB - Last synced: 2 days ago - Pushed: almost 4 years ago - Stars: 305 - Forks: 87

huggingface/optimum-intel

🤗 Optimum Intel: Accelerate inference with Intel optimization tools

Language: Jupyter Notebook - Size: 3.08 MB - Last synced: about 2 months ago - Pushed: about 2 months ago - Stars: 316 - Forks: 88

arena-ai/arena

A place to evaluate public models

Language: Python - Size: 768 KB - Last synced: 11 days ago - Pushed: 12 days ago - Stars: 1 - Forks: 0

htqin/awesome-efficient-aigc

A list of papers, docs, and code about efficient AIGC. This repo aims to provide information for efficient AIGC research, covering both language and vision, and is continuously improving. PRs for works (papers, repositories) missed by the repo are welcome.

Size: 154 KB - Last synced: 20 days ago - Pushed: about 2 months ago - Stars: 105 - Forks: 10

Tanuki/tanuki.py

Prompt engineering for developers

Language: Python - Size: 774 KB - Last synced: 9 days ago - Pushed: 4 months ago - Stars: 641 - Forks: 23

dkozlov/awesome-knowledge-distillation

Awesome Knowledge Distillation

Size: 282 KB - Last synced: 17 days ago - Pushed: 6 months ago - Stars: 3,328 - Forks: 483

FLHonker/Awesome-Knowledge-Distillation

Awesome Knowledge-Distillation. Knowledge distillation papers (2014-2021), organized by category.

Size: 457 KB - Last synced: 17 days ago - Pushed: about 1 year ago - Stars: 2,418 - Forks: 334

AberHu/Knowledge-Distillation-Zoo

PyTorch implementation of various Knowledge Distillation (KD) methods.

Language: Python - Size: 90.8 KB - Last synced: 18 days ago - Pushed: over 2 years ago - Stars: 1,526 - Forks: 261

IntelLabs/distiller 📦

Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller

Language: Jupyter Notebook - Size: 40.5 MB - Last synced: 18 days ago - Pushed: about 1 year ago - Stars: 4,310 - Forks: 797

BioSTEAMDevelopmentGroup/biosteam

The Biorefinery Simulation and Techno-Economic Analysis Modules; Life Cycle Assessment; Chemical Process Simulation Under Uncertainty

Language: Python - Size: 1.41 GB - Last synced: 25 days ago - Pushed: about 1 month ago - Stars: 170 - Forks: 33

anarchy-ai/LLM-VM

irresponsible innovation. Try now at https://chat.dev/

Language: Python - Size: 1.74 MB - Last synced: 26 days ago - Pushed: 26 days ago - Stars: 455 - Forks: 150

Tanuki/tanuki.ts

Prompt engineering for developers

Language: TypeScript - Size: 1.69 MB - Last synced: 27 days ago - Pushed: 4 months ago - Stars: 0 - Forks: 0

leondgarse/Keras_insightface

Insightface Keras implementation

Language: Python - Size: 46.1 MB - Last synced: 27 days ago - Pushed: 27 days ago - Stars: 223 - Forks: 54

Atenrev/diffusion_continual_learning

PyTorch implementation of various distillation approaches for continual learning of Diffusion Models.

Language: Python - Size: 78.8 MB - Last synced: 25 days ago - Pushed: 2 months ago - Stars: 13 - Forks: 0

YangLing0818/VQGraph

[ICLR 2024] VQGraph: Rethinking Graph Representation Space for Bridging GNNs and MLPs

Language: Python - Size: 10.3 MB - Last synced: 26 days ago - Pushed: 3 months ago - Stars: 56 - Forks: 6

NVIDIA-AI-IOT/clip-distillation

Zero-label image classification via OpenCLIP knowledge distillation

Language: Python - Size: 2.03 MB - Last synced: 26 days ago - Pushed: 9 months ago - Stars: 92 - Forks: 14

sungnyun/ARMHuBERT

(Interspeech 2023 & ICASSP 2024) Official repository for ARMHuBERT and STaRHuBERT

Language: Python - Size: 4.51 MB - Last synced: 26 days ago - Pushed: about 1 month ago - Stars: 31 - Forks: 4

khurramjaved96/incremental-learning

PyTorch implementation of the ACCV18 paper "Revisiting Distillation and Incremental Classifier Learning."

Language: Python - Size: 2.15 MB - Last synced: about 1 month ago - Pushed: about 1 month ago - Stars: 106 - Forks: 24

vscherbo/distibot

Distibot (DISTIller roBOT) is a Python program for Raspberry Pi (Raspbian) that controls the whole distillation process

Language: Python - Size: 5.01 MB - Last synced: about 1 month ago - Pushed: about 1 month ago - Stars: 3 - Forks: 3

ViTAE-Transformer/SimDistill

The official repo for [AAAI 2024] "SimDistill: Simulated Multi-modal Distillation for BEV 3D Object Detection"

Size: 8.68 MB - Last synced: 24 days ago - Pushed: 25 days ago - Stars: 20 - Forks: 1

szq0214/FKD

Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition"

Language: Python - Size: 1.57 MB - Last synced: about 1 month ago - Pushed: about 1 month ago - Stars: 176 - Forks: 31

newfull5/NLLB-200-Distilled-350M-en-ko

NLLB-200 distilled 350M model for English-to-Korean translation

Language: Jupyter Notebook - Size: 30.3 KB - Last synced: about 1 month ago - Pushed: about 1 month ago - Stars: 1 - Forks: 0

chenllliang/Model-Compression-For-Speaker-Recognition

Distillation examples: making speaker recognition faster through different model compression techniques

Language: Python - Size: 12.7 KB - Last synced: about 1 month ago - Pushed: almost 4 years ago - Stars: 0 - Forks: 1

gyunggyung/AGI-Papers

Papers and Book to look at when starting AGI 📚

Size: 35.6 MB - Last synced: about 2 months ago - Pushed: about 2 months ago - Stars: 243 - Forks: 32

microsoft/augmented-interpretable-models

Interpretable and efficient predictors using pre-trained language models. Scikit-learn compatible.

Language: Jupyter Notebook - Size: 189 MB - Last synced: about 1 month ago - Pushed: about 2 months ago - Stars: 37 - Forks: 8

Yu-Group/adaptive-wavelets

Adaptive, interpretable wavelets across domains (NeurIPS 2021)

Language: Jupyter Notebook - Size: 268 MB - Last synced: 8 days ago - Pushed: over 2 years ago - Stars: 68 - Forks: 10

PaddlePaddle/PaddleSlim

PaddleSlim is an open-source library for deep model compression and architecture search.

Language: Python - Size: 16.3 MB - Last synced: about 2 months ago - Pushed: 2 months ago - Stars: 1,514 - Forks: 347

GMvandeVen/continual-learning

PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.

Language: Jupyter Notebook - Size: 3.14 MB - Last synced: about 2 months ago - Pushed: 3 months ago - Stars: 1,442 - Forks: 301

aioz-ai/LDR_ALDK

Light-weight Deformable Registration using Adversarial Learning with Distilling Knowledge (IEEE Transactions on Medical Imaging 2021)

Language: Python - Size: 62.5 KB - Last synced: 10 days ago - Pushed: over 2 years ago - Stars: 17 - Forks: 2

sfarhat/dapt

Code for "On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models"

Language: Python - Size: 37.1 KB - Last synced: 2 months ago - Pushed: 2 months ago - Stars: 2 - Forks: 0

ZJLAB-AMMI/LLM4Teach

Python code implementing LLM4Teach, a policy distillation approach for teaching reinforcement learning agents with Large Language Models

Language: Python - Size: 40 KB - Last synced: about 2 months ago - Pushed: about 2 months ago - Stars: 6 - Forks: 2

ViTAE-Transformer/ViTPose

The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose Estimation" and [TPAMI'23] "ViTPose++: Vision Transformer for Generic Body Pose Estimation"

Language: Python - Size: 10.5 MB - Last synced: 2 months ago - Pushed: 7 months ago - Stars: 1,145 - Forks: 156

Nota-NetsPresso/BK-SDM

A Compressed Stable Diffusion for Efficient Text-to-Image Generation [ICCV'23 Demo] [ICML'23 Workshop]

Language: Python - Size: 103 KB - Last synced: 2 months ago - Pushed: 3 months ago - Stars: 163 - Forks: 12

apguilherme/Distillation

Chemical Engineering application: Distillation calculator for McCabe-Thiele and Ponchon-Savarit methods. https://apguilherme.github.io/Distillation/

Language: HTML - Size: 632 KB - Last synced: about 2 months ago - Pushed: almost 3 years ago - Stars: 6 - Forks: 0
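The McCabe-Thiele method counts theoretical stages by stepping between the equilibrium curve and the operating lines on an x-y diagram. A minimal Python sketch of that stepping at total reflux (where the operating line coincides with the diagonal y = x), assuming constant relative volatility; this is an illustration, not the calculator's actual code:

```python
def equilibrium_x(y, alpha):
    """Invert y = alpha*x / (1 + (alpha-1)*x): the liquid composition
    in equilibrium with vapor composition y (constant relative volatility)."""
    return y / (alpha - (alpha - 1.0) * y)

def stages_total_reflux(xD, xB, alpha):
    """Step off McCabe-Thiele stages at total reflux, where the
    operating line is the diagonal y = x."""
    stages, y = 0, xD
    while True:
        x = equilibrium_x(y, alpha)   # horizontal step to equilibrium curve
        stages += 1
        if x <= xB:
            return stages
        y = x                         # vertical step back to the y = x line

# Benzene-toluene-like system: alpha ≈ 2.5, 95% distillate, 5% bottoms.
print(stages_total_reflux(0.95, 0.05, 2.5))  # → 7
```

At finite reflux the rectifying and stripping operating lines replace the diagonal, which is the general case the calculator handles graphically.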

dotchen/LAV

(CVPR 2022) A minimalist, mapless, end-to-end self-driving stack for joint perception, prediction, planning and control.

Language: Python - Size: 67.6 MB - Last synced: 2 months ago - Pushed: over 1 year ago - Stars: 377 - Forks: 61

FLHonker/ZAQ-code

CVPR 2021 : Zero-shot Adversarial Quantization (ZAQ)

Language: Python - Size: 188 KB - Last synced: about 2 months ago - Pushed: over 2 years ago - Stars: 64 - Forks: 16

briancpark/csc791-025

Computer Science 791-025: Real-Time AI & High-Performance Machine Learning

Language: TeX - Size: 77 MB - Last synced: about 1 month ago - Pushed: 4 months ago - Stars: 1 - Forks: 0

anirudhb11/LEVER

Official Code Base for ICLR 2024 paper Enhancing Tail Performance in Extreme Classifiers by Label Variance Reduction

Language: Python - Size: 294 KB - Last synced: 3 months ago - Pushed: 3 months ago - Stars: 0 - Forks: 0

szq0214/MEAL-V2

MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In NeurIPS 2020 workshop.

Language: Python - Size: 730 KB - Last synced: 18 days ago - Pushed: over 2 years ago - Stars: 684 - Forks: 69

bloomberg/minilmv2.bb

Our open source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188)

Language: Python - Size: 30.3 KB - Last synced: about 2 months ago - Pushed: 12 months ago - Stars: 60 - Forks: 6

segmind/distill-sd

Segmind Distilled diffusion

Language: Python - Size: 4.14 MB - Last synced: 3 months ago - Pushed: 8 months ago - Stars: 494 - Forks: 32

qcraftai/distill-bev

DistillBEV: Boosting Multi-Camera 3D Object Detection with Cross-Modal Knowledge Distillation (ICCV 2023)

Language: Python - Size: 22 MB - Last synced: 3 months ago - Pushed: 7 months ago - Stars: 62 - Forks: 3

elephantmipt/bert-distillation

Distillation of BERT model with catalyst framework

Language: Python - Size: 223 KB - Last synced: 3 months ago - Pushed: 12 months ago - Stars: 73 - Forks: 6

Syencil/mobile-yolov5-pruning-distillation

MobileNetV2-YOLOv5s pruning and distillation, with support for ncnn and TensorRT deployment. Ultra-light but with better performance!

Language: Jupyter Notebook - Size: 19.4 MB - Last synced: 3 months ago - Pushed: over 1 year ago - Stars: 793 - Forks: 165

tim-learn/DINE

Code for our CVPR 2022 paper "DINE: Domain Adaptation from Single and Multiple Black-box Predictors"

Language: Python - Size: 740 KB - Last synced: 4 months ago - Pushed: 4 months ago - Stars: 74 - Forks: 9

JulesBelveze/bert-squeeze

🛠️ Tools for Transformers compression using PyTorch Lightning ⚡

Language: Python - Size: 2.41 MB - Last synced: 26 days ago - Pushed: 3 months ago - Stars: 78 - Forks: 10

Sharpiless/Yolov5-distillation-train-inference

YOLOv5 distillation training; supports training on your own data

Language: Python - Size: 2.36 MB - Last synced: 3 months ago - Pushed: over 1 year ago - Stars: 193 - Forks: 31

microsoft/Lightweight-Low-Resource-NMT

Official code for "Too Brittle To Touch: Comparing the Stability of Quantization and Distillation Towards Developing Lightweight Low-Resource MT Models" to appear in WMT 2022.

Language: Python - Size: 45.9 KB - Last synced: 2 months ago - Pushed: 8 months ago - Stars: 16 - Forks: 3

Adamdad/KnowledgeFactor

[ECCV2022] Factorizing Knowledge in Neural Networks

Language: Python - Size: 781 KB - Last synced: 26 days ago - Pushed: over 1 year ago - Stars: 79 - Forks: 5

thu-ml/ares

A Python library for adversarial machine learning focusing on benchmarking adversarial robustness.

Language: Python - Size: 378 MB - Last synced: 5 months ago - Pushed: 8 months ago - Stars: 439 - Forks: 87

qlan3/MeDQN

The official implementation of MeDQN algorithm.

Language: Python - Size: 312 KB - Last synced: 4 months ago - Pushed: 4 months ago - Stars: 9 - Forks: 0

jorgecote/DIstill-column

Simulation model of a binary distillation column for a water-ethanol mix

Language: MATLAB - Size: 15.6 KB - Last synced: 5 months ago - Pushed: 5 months ago - Stars: 3 - Forks: 1

quickjkee/instruct-pix2pix-distill

InstructPix2Pix with distilled diffusion models

Size: 4.18 MB - Last synced: 5 months ago - Pushed: 5 months ago - Stars: 0 - Forks: 0

yukimasano/single-img-extrapolating

Repo for the paper "Extrapolating from a Single Image to a Thousand Classes using Distillation"

Language: Python - Size: 66.2 MB - Last synced: 19 days ago - Pushed: over 1 year ago - Stars: 35 - Forks: 3

alldbi/SuperMix

PyTorch implementation of the CVPR 2021 paper: SuperMix: Supervising the Mixing Data Augmentation

Language: Python - Size: 1.86 MB - Last synced: 27 days ago - Pushed: over 2 years ago - Stars: 89 - Forks: 21

Z7zuqer/model-compression-and-acceleration-4-DNN

model-compression-and-acceleration-4-DNN

Size: 67.7 MB - Last synced: 8 days ago - Pushed: over 5 years ago - Stars: 21 - Forks: 4

tangxyw/RecSysPapers

A collection of industry classics and cutting-edge papers in the field of recommendation/advertising/search.

Language: Python - Size: 1.4 GB - Last synced: 6 months ago - Pushed: 6 months ago - Stars: 782 - Forks: 124

THUKElab/MESED

[AAAI 2024] MESED: A Multi-modal Entity Set Expansion Dataset with Fine-grained Semantic Classes and Hard Negative Entities

Language: Python - Size: 12.7 KB - Last synced: 6 months ago - Pushed: 6 months ago - Stars: 1 - Forks: 0

autodistill/autodistill-base-model-template

A template for use in creating Autodistill Base Model packages.

Language: Python - Size: 7.81 KB - Last synced: 5 months ago - Pushed: 5 months ago - Stars: 3 - Forks: 0

daspartho/DistillClassifier

Easily generate synthetic data for classification tasks using LLMs

Language: Python - Size: 1020 KB - Last synced: about 2 months ago - Pushed: 7 months ago - Stars: 2 - Forks: 0

yonseivnl/se-cff

Official implementation of "Stereo Depth from Events Cameras: Concentrate and Focus on the Future" (CVPR 2022)

Language: Python - Size: 75.2 KB - Last synced: 6 months ago - Pushed: over 1 year ago - Stars: 30 - Forks: 5

dotchen/WorldOnRails

(ICCV 2021, Oral) RL and distillation in CARLA using a factorized world model

Language: Python - Size: 7.62 MB - Last synced: 7 months ago - Pushed: over 2 years ago - Stars: 146 - Forks: 28

aitorlucasc/uhc_distillation

Final thesis of the Msc in Data Science from University of Barcelona

Language: Jupyter Notebook - Size: 1.2 MB - Last synced: 7 months ago - Pushed: almost 3 years ago - Stars: 2 - Forks: 1

snap-research/R2L

[ECCV 2022] R2L: Distilling Neural Radiance Field to Neural Light Field for Efficient Novel View Synthesis

Language: Python - Size: 71.2 MB - Last synced: 7 months ago - Pushed: 10 months ago - Stars: 173 - Forks: 21

Zhen-Dong/HAWQ

Quantization library for PyTorch. Supports low-precision and mixed-precision quantization, with hardware implementation through TVM.

Language: Python - Size: 691 KB - Last synced: 7 months ago - Pushed: about 1 year ago - Stars: 361 - Forks: 80

monologg/DistilKoBERT

Distillation of KoBERT from SKTBrain (Lightweight KoBERT)

Language: Python - Size: 500 KB - Last synced: 7 months ago - Pushed: 9 months ago - Stars: 174 - Forks: 23

geyingli/unif

A deep learning NLP framework built on TensorFlow with a Scikit-Learn-style design. Supports more than 40 model classes, covering language modeling, text classification, NER, MRC, knowledge distillation, and more

Language: Python - Size: 6.28 MB - Last synced: 7 months ago - Pushed: about 1 year ago - Stars: 110 - Forks: 29

HoyTta0/KnowledgeDistillation

Knowledge distillation for Chinese text classification with PyTorch: teacher models BERT and XLNet, student model BiLSTM.

Language: Python - Size: 2.05 MB - Last synced: 7 months ago - Pushed: almost 2 years ago - Stars: 191 - Forks: 49

GMvandeVen/brain-inspired-replay

A brain-inspired version of generative replay for continual learning with deep neural networks (e.g., class-incremental learning on CIFAR-100; PyTorch code).

Language: Python - Size: 24.5 MB - Last synced: 8 months ago - Pushed: 11 months ago - Stars: 201 - Forks: 59

xiongma/roberta-wwm-base-distill 📦

A RoBERTa-wwm-base model distilled from RoBERTa-wwm-large

Language: Python - Size: 90.8 KB - Last synced: 8 months ago - Pushed: about 4 years ago - Stars: 64 - Forks: 11

CLUEbenchmark/CLUEPretrainedModels

A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and dedicated similarity models

Language: Python - Size: 789 KB - Last synced: 8 months ago - Pushed: almost 4 years ago - Stars: 765 - Forks: 94

DaraSamii/py-distillation

A package for solving common distillation problems

Language: Jupyter Notebook - Size: 926 KB - Last synced: 30 days ago - Pushed: almost 3 years ago - Stars: 3 - Forks: 0

JunhoKim94/TutorKD

Tutoring Helps Students Learn Better: Improving Knowledge Distillation for BERT with Tutor Network

Language: Python - Size: 66.1 MB - Last synced: 8 months ago - Pushed: 8 months ago - Stars: 4 - Forks: 0

WezSieTato/BoolBrewLator

Language: C++ - Size: 5.78 MB - Last synced: 8 months ago - Pushed: 8 months ago - Stars: 0 - Forks: 0

JmfanBU/ReachNNStar

Reachability Analysis Tool of Neural Network Controlled Systems (NNCSs)

Language: C++ - Size: 253 MB - Last synced: 7 months ago - Pushed: about 1 year ago - Stars: 16 - Forks: 4

samuelstanton/gnosis

Code to reproduce experiments from "Does Knowledge Distillation Really Work?", a paper which appeared in the NeurIPS 2021 proceedings.

Language: Python - Size: 302 MB - Last synced: 9 months ago - Pushed: 9 months ago - Stars: 29 - Forks: 4

MostHumble/Dzarabert

A tutorial on how to prune the embedding layer of a language model and crafting a suitable tokenizer

Language: Jupyter Notebook - Size: 429 KB - Last synced: 9 months ago - Pushed: 9 months ago - Stars: 0 - Forks: 0

snap-research/linkless-link-prediction

[ICML 2023] Linkless Link Prediction via Relational Distillation

Language: Python - Size: 184 KB - Last synced: 8 months ago - Pushed: 8 months ago - Stars: 7 - Forks: 1

SinaGhanbarii/Flash-Distillation

MATLAB program that models a binary flash distillation column by computing vapor-liquid equilibrium with the Antoine equation. It determines liquid and vapor product flow rates, compositions, and temperatures from given feed conditions (pressure, temperature, composition), and plots a T-x-y diagram.

Language: MATLAB - Size: 12.7 KB - Last synced: 6 months ago - Pushed: 9 months ago - Stars: 0 - Forks: 0
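The core calculation the description refers to can be sketched compactly. A Python illustration (not the repo's MATLAB code) of computing one T-x-y point for an ideal binary mixture via the Antoine equation and Raoult's law; the Antoine constants below are illustrative textbook values in mmHg and °C, and real water-ethanol behavior is non-ideal, so a faithful model would add activity coefficients:

```python
def antoine_psat(T, A, B, C):
    """Antoine equation: saturation pressure in mmHg, T in deg C."""
    return 10.0 ** (A - B / (C + T))

def txy_point(T, antoine1, antoine2, P=760.0):
    """One T-x-y point for an ideal binary mixture (Raoult's law):
    solve x1*P1sat + (1 - x1)*P2sat = P for x1, then y1 = x1*P1sat/P."""
    p1 = antoine_psat(T, *antoine1)
    p2 = antoine_psat(T, *antoine2)
    x1 = (P - p2) / (p1 - p2)         # liquid mole fraction of component 1
    y1 = x1 * p1 / P                  # vapor mole fraction of component 1
    return x1, y1

# Illustrative Antoine constants (mmHg, deg C) for ethanol and water.
ETHANOL = (8.20417, 1642.89, 230.300)
WATER = (8.07131, 1730.63, 233.426)

x1, y1 = txy_point(85.0, ETHANOL, WATER, P=760.0)
print(round(x1, 3), round(y1, 3))  # vapor is richer in the volatile component
```

Sweeping T between the two pure-component boiling points and collecting (x1, y1) pairs traces the full T-x-y diagram.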

twinkle0331/LGTM

[ACL 2023] Code for the paper "Tailoring Instructions to Student's Learning Levels Boosts Knowledge Distillation" (https://arxiv.org/abs/2305.09651)

Language: Python - Size: 536 KB - Last synced: 8 months ago - Pushed: about 1 year ago - Stars: 32 - Forks: 2

dermatologist/distilling-step-by-step Fork of google-research/distilling-step-by-step

This is a fork of the distilling-step-by-step repository with the aim of creating a task-specific LLM distillation framework for healthcare.

Language: Python - Size: 76.1 MB - Last synced: 5 months ago - Pushed: 5 months ago - Stars: 0 - Forks: 0

tonywu71/distilling-and-forgetting-in-large-pre-trained-models

Code for my dissertation on "Distilling and Forgetting in Large Pre-Trained Models" for the MPhil in Machine Learning and Machine Intelligence (MLMI) at the University of Cambridge.

Language: Jupyter Notebook - Size: 11.3 MB - Last synced: 8 months ago - Pushed: 8 months ago - Stars: 1 - Forks: 0

LazerLambda/Team09AppliedDL

Model Distillation for Unlabeled and Imbalanced Data for Amino-Acid-Strings

Language: Python - Size: 1.67 MB - Last synced: 10 months ago - Pushed: 10 months ago - Stars: 1 - Forks: 0

shriaithal/AlternusVera Fork of aarsanjani/AlternusVera

Alternus Vera Project

Language: Jupyter Notebook - Size: 45.2 MB - Last synced: 10 months ago - Pushed: over 5 years ago - Stars: 0 - Forks: 0

mrpositron/distillation 📦

Self-Distillation and Knowledge Distillation Experiments with PyTorch.

Language: Python - Size: 10.4 MB - Last synced: 10 months ago - Pushed: over 2 years ago - Stars: 7 - Forks: 2

snap-research/graphless-neural-networks

[ICLR 2022] Code for Graph-less Neural Networks: Teaching Old MLPs New Tricks via Distillation (GLNN)

Language: Python - Size: 643 KB - Last synced: 10 months ago - Pushed: over 1 year ago - Stars: 65 - Forks: 17

smsharma/consistency-models

Implementation of Consistency Models (Song et al 2023) in Jax.

Language: Jupyter Notebook - Size: 1.94 MB - Last synced: 10 months ago - Pushed: 12 months ago - Stars: 14 - Forks: 0

csiro-robotics/L3DMC

The official repository for the MICCAI 2023 paper "L3DMC: Lifelong Learning using Distillation via Mixed-Curvature Space"

Size: 1000 Bytes - Last synced: 11 months ago - Pushed: 11 months ago - Stars: 1 - Forks: 0

fxmeng/filter-grafting

Filter Grafting for Deep Neural Networks (CVPR 2020)

Language: Python - Size: 169 KB - Last synced: 7 months ago - Pushed: over 2 years ago - Stars: 138 - Forks: 23

kyaiooiayk/Cheap-ML-models

Optimising training, inference, and throughput of expensive ML models

Size: 19.5 KB - Last synced: 10 months ago - Pushed: 10 months ago - Stars: 0 - Forks: 0

akimach/tensorflow-distillation-examples

Knowledge distillation implemented in TensorFlow

Language: Jupyter Notebook - Size: 8.38 MB - Last synced: 11 months ago - Pushed: almost 7 years ago - Stars: 19 - Forks: 5

jakegrigsby/algorithm_distillation

Minimalist PyTorch replication of Algorithm Distillation (Laskin et al., 2022)

Language: Python - Size: 1.74 MB - Last synced: 11 months ago - Pushed: 11 months ago - Stars: 2 - Forks: 0

whale-ynu/LMSD Fork of qsw-code/LMSD

【NCA】Learning Metric Space with Distillation for Large-Scale Multi-Label Text Classification

Language: Python - Size: 38.1 KB - Last synced: 11 months ago - Pushed: 11 months ago - Stars: 0 - Forks: 0

Related Keywords
distillation 156 pytorch 35 deep-learning 31 knowledge-distillation 20 bert 15 nlp 15 model-compression 14 quantization 14 pruning 14 python 13 machine-learning 9 computer-vision 8 tensorflow 8 llm 7 transformer 7 knowledge 6 neural-network 6 transformers 6 continual-learning 6 language-model 5 self-supervised-learning 5 classification 5 incremental-learning 5 huggingface 4 stable-diffusion 4 inference 4 distilbert 4 ai 4 large-language-models 4 contrastive-learning 4 distillation-model 4 kd 4 reinforcement-learning 4 diffusion-models 4 text-classification 4 knowledge-transfer 4 deeplearning 3 self-distillation 3 roberta 3 deep-neural-networks 3 lightweight 3 keras 3 lifelong-learning 3 chemical-engineering 3 multi-modal 3 arduino 3 gnn 3 autonomous-driving 3 natural-language-processing 3 compression 3 adversarial-attacks 3 knowldge-distillation 3 graph-neural-networks 3 yolov5 3 cvpr2022 3 efficient-inference 3 speech-recognition 2 artificial-intelligence 2 fgsm 2 tensorflow2 2 rectification 2 pre-trained-model 2 zcls 2 matlab 2 thermodynamics 2 object-detection 2 scalability 2 efficientnet 2 feature-distillation 2 synthetic-data 2 variational-autoencoder 2 replay-through-feedback 2 replay 2 icarl 2 vision-transformer 2 elastic-weight-consolidation 2 chemistry 2 distillation-calculator 2 artificial-neural-networks 2 multilabel-classification 2 javascript 2 segmentation 2 nas 2 mccabe-thiele 2 detection 2 carla-simulator 2 statistics 2 keyword-spotting 2 kws 2 interpretability 2 explainability 2 translation 2 efficient-training 2 inductive-biases 2 3d-object-detection 2 tensorrt 2 sparsity 2 korean 2 clip 2 graph 2