GitHub: sfarhat/dapt
Code for "On the Surprising Efficacy of Distillation as an Alternative to Pre-Training Small Models"
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/sfarhat%2Fdapt
PURL: pkg:github/sfarhat/dapt
Stars: 2
Forks: 0
Open issues: 0
License: MIT
Language: Python
Size: 37.1 KB
Dependencies parsed at: Pending
Created at: about 2 years ago
Updated at: over 1 year ago
Pushed at: over 1 year ago
Last synced at: over 1 year ago
Topics: contrastive-learning, distillation, pre-training, small-models, synthetic-data
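
Given the repository's focus on distillation as an alternative to pre-training small models, the sketch below shows a conventional knowledge-distillation objective (temperature-softened KL against a teacher plus cross-entropy on hard labels). This is an illustrative assumption for context only, not code taken from sfarhat/dapt; the function name, parameters, and loss weighting are hypothetical.

```python
# Illustrative sketch only: a standard knowledge-distillation loss
# (soft-target KL + hard-label cross-entropy). NOT code from this repo.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend KL(teacher || student) on temperature-softened logits
    with ordinary cross-entropy on the ground-truth labels."""
    # Soft targets from the (frozen) teacher.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # Scale the KL term by T^2 so gradient magnitudes stay comparable
    # across temperatures.
    kd = F.kl_div(log_soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```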