GitHub / zjunlp / DART
[ICLR 2022] Differentiable Prompt Makes Pre-trained Language Models Better Few-shot Learners
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/zjunlp%2FDART
PURL: pkg:github/zjunlp/DART
Stars: 132
Forks: 17
Open issues: 0
License: MIT
Language: Python
Size: 74.2 KB
Dependencies parsed at: Pending
Created at: over 3 years ago
Updated at: about 2 months ago
Pushed at: over 2 years ago
Last synced at: about 1 month ago
Topics: dart, few-shot-learning, iclr, iclr2022, language-models, pre-trained-language-models, prompt, prompt-learning, prompt-tuning, pytorch