GitHub topics: distillation-model
dkozlov/awesome-knowledge-distillation
Awesome Knowledge Distillation
Size: 171 KB - Last synced at: 5 days ago - Pushed at: about 2 months ago - Stars: 3,651 - Forks: 509
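This list covers knowledge distillation, where a small "student" model is trained to match a larger "teacher". As a hedged illustration of the core idea (a minimal sketch of the classic soft-target loss, not code from this repository), the combined hard-label and temperature-softened objective can be written in plain Python:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution.
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(student_logits, teacher_logits, true_label, T=2.0, alpha=0.5):
    """Soft-target distillation loss:
    alpha * cross-entropy with the hard label
    + (1 - alpha) * T^2 * KL(teacher_soft || student_soft).
    The T^2 factor keeps gradient magnitudes comparable across temperatures."""
    p_student = softmax(student_logits)
    hard = -math.log(p_student[true_label])
    ps_T = softmax(student_logits, T)
    pt_T = softmax(teacher_logits, T)
    soft = sum(pt * math.log(pt / ps) for pt, ps in zip(pt_T, ps_T))
    return alpha * hard + (1 - alpha) * (T * T) * soft
```

The temperature `T` and mixing weight `alpha` are the usual tuning knobs; their values here are illustrative defaults.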

Agents4Good/MasterChef-AI
Acesse: https://agents4good.github.io/MasterChef-AI/
(Translated from Portuguese: "Acesse" = "Visit".)
Language: Jupyter Notebook - Size: 346 KB - Last synced at: 9 days ago - Pushed at: 9 days ago - Stars: 1 - Forks: 0

anarchy-ai/LLM-VM
Irresponsible innovation. Try it now at https://chat.dev/
Language: Python - Size: 1.74 MB - Last synced at: 9 days ago - Pushed at: 12 months ago - Stars: 488 - Forks: 142

AnanyaP-WDW/PepEmb
A transformer-based masked language model for learning amino acid sequence representations. The model uses self-attention mechanisms with custom gating and incorporates protein features for enhanced sequence understanding. Trained using BERT-style masking on peptide sequences to learn contextual amino acid embeddings.
Language: Python - Size: 47.9 KB - Last synced at: about 2 months ago - Pushed at: about 2 months ago - Stars: 2 - Forks: 1
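The entry above describes BERT-style masking applied to peptide sequences. A minimal sketch of that masking scheme (an assumption about the standard 80/10/10 BERT recipe, not code from the PepEmb repository):

```python
import random

AMINO_ACIDS = list("ACDEFGHIKLMNPQRSTVWY")  # 20 standard residues
MASK = "<mask>"

def mask_sequence(seq, mask_prob=0.15, rng=None):
    """BERT-style masking: each residue is selected with probability
    mask_prob; of selected positions, 80% become <mask>, 10% a random
    residue, 10% stay unchanged. Returns (tokens, labels), where labels
    holds the original residue at masked positions and None elsewhere."""
    rng = rng or random.Random()
    tokens, labels = [], []
    for aa in seq:
        if rng.random() < mask_prob:
            labels.append(aa)
            r = rng.random()
            if r < 0.8:
                tokens.append(MASK)
            elif r < 0.9:
                tokens.append(rng.choice(AMINO_ACIDS))
            else:
                tokens.append(aa)
        else:
            tokens.append(aa)
            labels.append(None)
    return tokens, labels
```

The model is then trained to predict the original residue at every position where `labels` is not None.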

JULITHCH/azure_embedded_lmm
A step-by-step tutorial on embedding content into GPT-4 using Azure Web Services
Size: 124 KB - Last synced at: about 2 months ago - Pushed at: about 2 months ago - Stars: 0 - Forks: 0

NiklasSlager/equdist
This repository combines the bubble-point algorithm with the Naphtali-Sandholm algorithm to compute steady-state distillation separations with partial condensers
Language: Jupyter Notebook - Size: 959 KB - Last synced at: 3 months ago - Pushed at: 3 months ago - Stars: 0 - Forks: 0
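The bubble-point algorithm mentioned above finds the temperature at which a liquid of known composition first boils. A minimal ideal-solution sketch (Raoult's law with standard Antoine constants for a benzene/toluene example; this is an illustration, not the repository's rigorous multicomponent implementation):

```python
# Antoine constants: log10(Psat[mmHg]) = A - B / (C + T[degC])
ANTOINE = {
    "benzene": (6.90565, 1211.033, 220.790),
    "toluene": (6.95464, 1344.800, 219.482),
}

def psat(comp, T):
    A, B, C = ANTOINE[comp]
    return 10 ** (A - B / (C + T))  # vapor pressure in mmHg

def bubble_point(x, P=760.0, T=80.0, tol=1e-6):
    """Solve sum_i x_i * Psat_i(T) = P for T (Raoult's law) with a
    Newton iteration using a numerical derivative. x maps component
    name to liquid mole fraction; T is the initial guess in degC."""
    for _ in range(100):
        f = sum(xi * psat(c, T) for c, xi in x.items()) - P
        dT = 1e-4
        df = (sum(xi * psat(c, T + dT) for c, xi in x.items()) - P - f) / dT
        T_new = T - f / df
        if abs(T_new - T) < tol:
            return T_new
        T = T_new
    return T
```

For an equimolar benzene/toluene mixture at 760 mmHg this converges to roughly 92 degC, the textbook value.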

forestry-labs/distillML
An R package providing functions for interpreting and distilling machine learning models
Language: R - Size: 9.76 MB - Last synced at: 6 days ago - Pushed at: about 2 years ago - Stars: 7 - Forks: 2

illidanlab/ABD
[ICML2023] Revisiting Data-Free Knowledge Distillation with Poisoned Teachers
Language: Python - Size: 145 KB - Last synced at: 10 months ago - Pushed at: 10 months ago - Stars: 22 - Forks: 1

yk7333/DRND
[ICML 2024]Exploration and Anti-exploration with Distributional Random Network Distillation
Language: Python - Size: 141 KB - Last synced at: 12 months ago - Pushed at: 12 months ago - Stars: 1 - Forks: 0

mjmaher987/Robustness---CISPA
CISPA Summer Internship
Language: Jupyter Notebook - Size: 8.57 MB - Last synced at: about 1 year ago - Pushed at: over 1 year ago - Stars: 0 - Forks: 0

szq0214/MEAL-V2
MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. Presented at a NeurIPS 2020 workshop.
Language: Python - Size: 730 KB - Last synced at: 11 months ago - Pushed at: over 3 years ago - Stars: 684 - Forks: 69

daspartho/DistillClassifier
Easily generate synthetic data for classification tasks using LLMs
Language: Python - Size: 1020 KB - Last synced at: 26 days ago - Pushed at: over 1 year ago - Stars: 2 - Forks: 0
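Generating synthetic classification data with an LLM typically means prompting for labeled examples and robustly parsing the reply. A hedged sketch of that loop (the prompt template and JSON-lines format are assumptions for illustration, not DistillClassifier's actual interface):

```python
import json

def build_prompt(label, n):
    # Ask the model for n labeled examples, one JSON object per line.
    return (
        f"Generate {n} short, diverse example sentences for the text "
        f"classification label '{label}'. Return one JSON object per "
        'line, like {"text": "...", "label": "' + label + '"}.'
    )

def parse_examples(raw):
    """Parse a JSON-lines reply from the model, silently skipping
    blank or malformed lines so one bad generation does not abort
    the whole batch."""
    examples = []
    for line in raw.splitlines():
        line = line.strip()
        if not line:
            continue
        try:
            obj = json.loads(line)
            if "text" in obj and "label" in obj:
                examples.append(obj)
        except json.JSONDecodeError:
            continue
    return examples
```

The actual LLM call is provider-specific and omitted; the key design point is tolerant parsing, since model output rarely conforms perfectly to the requested format.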

Bencosme039498/Distillation-of-a-C2-C7-mixture-
The goal is to remove these light constituents by distillation (flash or stripping). A preliminary study of the process operating conditions can be carried out on a pseudo-binary basis: the C7 cut is approximated as n-heptane and the light ends as ethane. The [T-x-y], [x-y], and [h-x-y] diagrams of the ethane/n-heptane binary are then constructed at a fixed pressure of 13.78 bar.
Language: Python - Size: 1.07 MB - Last synced at: almost 2 years ago - Pushed at: over 2 years ago - Stars: 2 - Forks: 0
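The [x-y] diagram mentioned above plots vapor composition against liquid composition for the light component. A minimal sketch using a constant relative volatility (a rough assumption; ethane/n-heptane at 13.78 bar is strongly non-ideal, so a rigorous calculation would use an equation of state as the repository presumably does):

```python
def xy_curve(alpha, n=11):
    """Equilibrium curve y(x) for a binary mixture under the
    constant-relative-volatility assumption:
        y = alpha * x / (1 + (alpha - 1) * x)
    Returns n evenly spaced (x, y) points on [0, 1]."""
    points = []
    for i in range(n):
        x = i / (n - 1)
        y = alpha * x / (1 + (alpha - 1) * x)
        points.append((x, y))
    return points
```

Plotting these points against the y = x diagonal gives the familiar McCabe-Thiele equilibrium diagram; the value of `alpha` here would have to be fitted or computed from the thermodynamic model.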

Junxiao-Zhao/Distillation_Decision_Tree
Code reproduction of the paper "Distillation Decision Tree"
Language: Python - Size: 2.69 MB - Last synced at: about 2 years ago - Pushed at: almost 3 years ago - Stars: 0 - Forks: 0
