GitHub topics: self-distillation
Tebmer/Awesome-Knowledge-Distillation-of-LLMs
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
Size: 18.6 MB - Last synced at: 17 days ago - Pushed at: 3 months ago - Stars: 1,015 - Forks: 60
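For orientation, the temperature-scaled soft-target objective that most of the repositories in this list build on (Hinton-style knowledge distillation) can be sketched in a few lines of plain Python; the function names and default temperature are illustrative choices, not taken from the survey:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                       # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, temperature)   # teacher (target)
    q = softmax(student_logits, temperature)   # student
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl
```

The loss is zero when student and teacher agree exactly, and higher temperatures expose more of the teacher's "dark knowledge" in the non-argmax classes.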

kisonho/magnet
Modality-Agnostic Learning for Medical Image Segmentation Using Multi-modality Self-distillation
Language: Python - Size: 362 KB - Last synced at: 20 days ago - Pushed at: 5 months ago - Stars: 6 - Forks: 0

naver-ai/augsub
[CVPR 2025] Official PyTorch implementation of MaskSub, "Masking meets Supervision: A Strong Learning Alliance"
Language: Python - Size: 251 KB - Last synced at: 2 months ago - Pushed at: 2 months ago - Stars: 34 - Forks: 1

emnzn/DINO
Self-distillation with no labels
Language: Jupyter Notebook - Size: 15.8 MB - Last synced at: 2 months ago - Pushed at: 2 months ago - Stars: 1 - Forks: 0
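The DINO recipe implemented by this and the other DINO repositories in this list combines two ingredients: a teacher whose weights are an exponential moving average (EMA) of the student's, and a cross-entropy between the centered, sharpened teacher distribution and the student distribution, with no labels anywhere. A minimal sketch (parameter values and names are mine, not from any of these repos):

```python
import math

def softmax(logits, temp):
    """Temperature-scaled softmax, max-subtracted for stability."""
    m = max(l / temp for l in logits)
    exps = [math.exp(l / temp - m) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

def ema_update(teacher_params, student_params, momentum=0.996):
    """Teacher weights track an exponential moving average of the student's;
    no gradient ever flows into the teacher."""
    return [momentum * t + (1 - momentum) * s
            for t, s in zip(teacher_params, student_params)]

def dino_loss(student_out, teacher_out, center,
              student_temp=0.1, teacher_temp=0.04):
    """Cross-entropy between the sharpened, centered teacher distribution
    and the student distribution. Centering and a low teacher temperature
    are what prevent collapse to a trivial solution."""
    t = softmax([o - c for o, c in zip(teacher_out, center)], teacher_temp)
    s = softmax(student_out, student_temp)
    return -sum(ti * math.log(si) for ti, si in zip(t, s))
```

In the full method the student and teacher see different augmented crops of the same image, and the center is itself an EMA of teacher outputs.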

d-f/dino-vit-pcam
Self-supervised learning via self-distillation with no labels (DINO), using Vision Transformers on the PCam dataset.
Language: Python - Size: 90.8 KB - Last synced at: about 2 months ago - Pushed at: 4 months ago - Stars: 0 - Forks: 0

sail-sg/sdft
[ACL 2024] The official codebase for the paper "Self-Distillation Bridges Distribution Gap in Language Model Fine-tuning".
Language: Shell - Size: 18.7 MB - Last synced at: 7 months ago - Pushed at: 7 months ago - Stars: 94 - Forks: 4

filipbasara0/simple-ijepa
A simple and efficient implementation of Self-Supervised Learning from Images with a Joint-Embedding Predictive Architecture (I-JEPA)
Language: Python - Size: 19.5 KB - Last synced at: about 2 months ago - Pushed at: 9 months ago - Stars: 1 - Forks: 1

youngkyunJang/Deep-Hash-Distillation
Deep Hash Distillation for Image Retrieval - ECCV 2022
Language: Python - Size: 19.4 MB - Last synced at: 12 months ago - Pushed at: 12 months ago - Stars: 41 - Forks: 6

sooperset/boss
Bayesian Optimization Meets Self-Distillation, ICCV 2023
Language: Python - Size: 767 KB - Last synced at: about 2 months ago - Pushed at: over 1 year ago - Stars: 8 - Forks: 1

YeongHyeon/DINO_MNIST-PyTorch
PyTorch implementation of "Emerging Properties in Self-Supervised Vision Transformers" (a.k.a. DINO)
Language: Python - Size: 3.33 MB - Last synced at: 28 days ago - Pushed at: over 1 year ago - Stars: 9 - Forks: 1

dongkyunk/DLB-Pytorch
A minimalist unofficial implementation of "Self-Distillation from the Last Mini-Batch for Consistency Regularization"
Language: Python - Size: 5.86 KB - Last synced at: 10 months ago - Pushed at: about 3 years ago - Stars: 8 - Forks: 3
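The idea behind DLB is that the model's own softened predictions from the previous iteration serve as the teacher for the same samples in the current one: half of each mini-batch carries over from the last batch, and a KL consistency term pulls the new predictions toward the cached soft labels. A stdlib-only skeleton (the `predict` callable and cache layout are my illustrative stand-ins, not the repo's API):

```python
import math

def softmax(logits, temp=3.0):
    """Temperature-softened softmax, max-subtracted for stability."""
    m = max(logits)
    exps = [math.exp((l - m) / temp) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

def consistency_loss(current_logits, cached_probs, temp=3.0):
    """KL between last iteration's soft predictions (acting as teacher)
    and the current prediction on the same sample (student)."""
    q = softmax(current_logits, temp)
    return sum(p * math.log(p / qi) for p, qi in zip(cached_probs, q) if p > 0)

# Training-loop skeleton: samples repeated from the previous batch find
# their cached soft predictions and incur a consistency penalty.
cache = {}                      # sample id -> soft prediction from last step
def dlb_step(batch, predict):   # `predict` maps a sample to its logits
    loss = 0.0
    for sample_id, x in batch:
        logits = predict(x)
        if sample_id in cache:                 # seen in the previous batch
            loss += consistency_loss(logits, cache[sample_id])
        cache[sample_id] = softmax(logits)     # becomes next step's target
    return loss
```

In the paper this term is added to the ordinary cross-entropy loss; only the soft labels are cached, so the memory overhead is tiny compared with keeping a second model.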

luanyunteng/pytorch-be-your-own-teacher
A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation" (https://arxiv.org/abs/1905.08094)
Language: Python - Size: 92.1 MB - Last synced at: over 1 year ago - Pushed at: over 3 years ago - Stars: 123 - Forks: 24
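In Be Your Own Teacher, auxiliary classifiers are attached to intermediate layers, and the deepest classifier acts as an in-network teacher for the shallower exits: each exit is trained on the hard label plus a soft-target KL term from the deepest exit. A rough sketch of the combined loss (function name, `alpha`, and temperature are my assumptions; the paper also uses a feature-level hint term omitted here):

```python
import math

def softmax(logits, temp):
    """Temperature-scaled softmax, max-subtracted for stability."""
    m = max(logits)
    exps = [math.exp((l - m) / temp) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

def byot_loss(exit_logits, label, alpha=0.3, temp=3.0):
    """Each exit pays a cross-entropy to the hard label plus a KL
    soft-target term from the deepest exit (the in-network teacher)."""
    teacher = softmax(exit_logits[-1], temp)
    total = 0.0
    for logits in exit_logits:
        probs = softmax(logits, 1.0)
        ce = -math.log(probs[label])               # hard-label cross-entropy
        q = softmax(logits, temp)
        kd = sum(p * math.log(p / qi)              # KL(teacher || exit)
                 for p, qi in zip(teacher, q) if p > 0)
        total += (1 - alpha) * ce + alpha * kd
    return total
```

Because the teacher lives inside the network being trained, no separate pre-trained teacher is needed, and the early exits double as anytime predictors at inference.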

marcomoldovan/multimodal-self-distillation
A generalized self-supervised training paradigm for unimodal and multimodal alignment and fusion.
Language: Python - Size: 526 KB - Last synced at: almost 2 years ago - Pushed at: almost 2 years ago - Stars: 4 - Forks: 2

mrpositron/distillation 📦
Self-Distillation and Knowledge Distillation Experiments with PyTorch.
Language: Python - Size: 10.4 MB - Last synced at: almost 2 years ago - Pushed at: over 3 years ago - Stars: 7 - Forks: 2

Kennethborup/gaussian_process_self_distillation
Official implementation of Self-Distillation for Gaussian Processes
Language: Python - Size: 279 KB - Last synced at: almost 2 years ago - Pushed at: about 2 years ago - Stars: 3 - Forks: 1

Kennethborup/self_distillation
Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression
Language: Jupyter Notebook - Size: 1.43 MB - Last synced at: almost 2 years ago - Pushed at: over 3 years ago - Stars: 15 - Forks: 0

youngerous/ddgsd-pytorch
(Unofficial) Data-Distortion Guided Self-Distillation for Deep Neural Networks (AAAI 2019)
Language: Python - Size: 261 KB - Last synced at: about 2 years ago - Pushed at: about 4 years ago - Stars: 12 - Forks: 3
