An open API service providing repository metadata for many open source software ecosystems.

GitHub topics: parameter-efficient-learning

adapter-hub/adapters

A Unified Library for Parameter-Efficient and Modular Transfer Learning

Language: Jupyter Notebook - Size: 96.9 MB - Last synced at: 1 day ago - Pushed at: 1 day ago - Stars: 2,684 - Forks: 359
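
For context, a minimal sketch of how an adapter is typically attached and trained with this library, based on its documented AutoAdapterModel interface; the checkpoint and adapter name below are illustrative placeholders, not the library's recommended setup.

```python
# Minimal sketch (assumes the `adapters` and `transformers` packages are installed).
# The checkpoint and the "sst2_adapter" name are placeholders.
from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("bert-base-uncased")

# Add a new bottleneck adapter plus a task head, then freeze everything else
# so only the adapter (and head) weights are trained.
model.add_adapter("sst2_adapter")
model.add_classification_head("sst2_adapter", num_labels=2)
model.train_adapter("sst2_adapter")

# Activate the adapter for forward passes.
model.set_active_adapters("sst2_adapter")
```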

HenryHZY/Awesome-Multimodal-LLM

Research Trends in LLM-guided Multimodal Learning.

Size: 17.6 KB - Last synced at: about 14 hours ago - Pushed at: over 1 year ago - Stars: 358 - Forks: 16

huggingface/peft

🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.

Language: Python - Size: 15.9 MB - Last synced at: 6 days ago - Pushed at: 7 days ago - Stars: 18,115 - Forks: 1,819
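
A minimal LoRA setup with PEFT illustrates what "parameter-efficient" means in practice: only a small low-rank adapter is trained on top of a frozen base model. The checkpoint and hyperparameters below are placeholders, not recommendations.

```python
# Minimal sketch: wrap a Hugging Face model with a LoRA adapter via PEFT.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")  # placeholder checkpoint

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,               # low-rank dimension
    lora_alpha=16,     # scaling factor
    lora_dropout=0.05,
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # typically well under 1% of the weights are trainable
```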

calpt/awesome-adapter-resources

Collection of Tools and Papers related to Adapters / Parameter-Efficient Transfer Learning / Fine-Tuning

Language: Python - Size: 213 KB - Last synced at: 3 days ago - Pushed at: 12 months ago - Stars: 189 - Forks: 11

THUDM/P-tuning

A novel method to tune language models. Code and datasets for the paper "GPT Understands, Too".

Language: Python - Size: 5.98 MB - Last synced at: about 21 hours ago - Pushed at: over 2 years ago - Stars: 929 - Forks: 111
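
The core idea of P-tuning is learning continuous prompt embeddings while the language model itself stays frozen. The sketch below is a conceptual plain-PyTorch illustration of soft prompts, not code from this repository.

```python
# Conceptual sketch of continuous ("soft") prompts, not code from THUDM/P-tuning:
# a small set of trainable embeddings is prepended to the frozen model's
# token embeddings before the forward pass.
import torch
import torch.nn as nn

class SoftPrompt(nn.Module):
    def __init__(self, embed_dim: int, prompt_len: int = 20):
        super().__init__()
        self.prompt = nn.Parameter(torch.randn(prompt_len, embed_dim) * 0.02)

    def forward(self, token_embeds: torch.Tensor) -> torch.Tensor:
        # token_embeds: (batch, seq_len, embed_dim)
        batch = token_embeds.size(0)
        prompt = self.prompt.unsqueeze(0).expand(batch, -1, -1)
        return torch.cat([prompt, token_embeds], dim=1)

# Usage: only the soft prompt receives gradients; the language model stays frozen.
soft_prompt = SoftPrompt(embed_dim=768)
dummy = torch.randn(2, 16, 768)
extended = soft_prompt(dummy)  # shape (2, 36, 768)
```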

thunlp/OpenDelta

A plug-and-play library for parameter-efficient-tuning (Delta Tuning)

Language: Python - Size: 42 MB - Last synced at: 6 days ago - Pushed at: 7 months ago - Stars: 1,022 - Forks: 82
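
A rough sketch of the plug-and-play pattern OpenDelta describes: attach a delta module to an existing backbone, then freeze everything except the delta parameters. The call and argument names here are assumptions based on the project's documentation and may differ between versions.

```python
# Rough sketch of the OpenDelta usage pattern; names are assumptions from its
# documentation and may differ between versions.
from transformers import AutoModelForSeq2SeqLM
from opendelta import AdapterModel

backbone = AutoModelForSeq2SeqLM.from_pretrained("t5-small")  # placeholder checkpoint

# Attach adapter (delta) modules to the backbone.
delta_model = AdapterModel(backbone_model=backbone)

# Freeze all parameters except the newly added delta modules.
delta_model.freeze_module(exclude=["deltas"])
delta_model.log()  # report which parameters remain trainable
```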

ga642381/Speech-Prompts-Adapters

This repository surveys papers on prompting and adapters for speech processing.

Size: 45.9 KB - Last synced at: 3 days ago - Pushed at: over 1 year ago - Stars: 108 - Forks: 5

thunlp/Prompt-Transferability

On Transferability of Prompt Tuning for Natural Language Processing

Language: Python - Size: 629 MB - Last synced at: 17 days ago - Pushed at: 12 months ago - Stars: 99 - Forks: 11

juyongjiang/CodeUp

CodeUp: A Multilingual Code Generation Llama-X Model with Parameter-Efficient Instruction-Tuning

Language: Python - Size: 18.6 MB - Last synced at: 1 day ago - Pushed at: 4 months ago - Stars: 124 - Forks: 9

giaminh03112011/Fine-Tuning

End-to-end fine-tuning of Hugging Face models using LoRA, QLoRA, quantization, and PEFT techniques. Optimized for low-memory environments and efficient model deployment. A sketch of such a setup follows.

Size: 1000 Bytes - Last synced at: about 2 months ago - Pushed at: about 2 months ago - Stars: 0 - Forks: 0
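
A minimal QLoRA-style setup combines a 4-bit quantized base model with LoRA adapters via transformers, bitsandbytes, and PEFT; the checkpoint and settings below are placeholders, not this repository's own configuration.

```python
# Sketch of a QLoRA-style setup: load the base model in 4-bit and train only
# LoRA adapters on top. Checkpoint and settings are illustrative placeholders.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, TaskType, get_peft_model, prepare_model_for_kbit_training

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",  # placeholder checkpoint
    quantization_config=bnb_config,
    device_map="auto",
)

base = prepare_model_for_kbit_training(base)
model = get_peft_model(base, LoraConfig(task_type=TaskType.CAUSAL_LM, r=16, lora_alpha=32))
model.print_trainable_parameters()
```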

declare-lab/domadapter

Code for EACL'23 paper "Udapter: Efficient Domain Adaptation Using Adapters"

Language: Python - Size: 538 KB - Last synced at: 7 days ago - Pushed at: about 2 years ago - Stars: 10 - Forks: 1

OpenBMB/CPM-Live

Live Training for Open-source Big Models

Language: Python - Size: 1.11 MB - Last synced at: 5 months ago - Pushed at: almost 2 years ago - Stars: 511 - Forks: 40

THUDM/P-tuning-v2

An optimized deep prompt tuning strategy comparable to fine-tuning across scales and tasks

Language: Python - Size: 1.41 MB - Last synced at: 6 months ago - Pushed at: over 1 year ago - Stars: 1,974 - Forks: 201
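
P-tuning v2 inserts trainable prompt vectors at every transformer layer rather than only at the input, which is essentially deep prefix tuning. A comparable configuration is available in PEFT, sketched below; this is not the repository's own implementation.

```python
# Sketch: deep prompt / prefix tuning via PEFT, analogous in spirit to
# P-tuning v2 but not the repository's own code.
from transformers import AutoModelForSequenceClassification
from peft import PrefixTuningConfig, TaskType, get_peft_model

base = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

config = PrefixTuningConfig(
    task_type=TaskType.SEQ_CLS,
    num_virtual_tokens=20,  # trainable prefix length injected at each layer
)

model = get_peft_model(base, config)
model.print_trainable_parameters()
```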

Paranioar/SHERL

[ECCV2024] The code of "SHERL: Synthesizing High Accuracy and Efficient Memory for Resource-Limited Transfer Learning"

Size: 8.79 KB - Last synced at: 7 months ago - Pushed at: 7 months ago - Stars: 2 - Forks: 0

jianghaojun/Awesome-Parameter-Efficient-Transfer-Learning

A collection of parameter-efficient transfer learning papers focusing on computer vision and multimodal domains.

Size: 165 KB - Last synced at: 9 months ago - Pushed at: 9 months ago - Stars: 372 - Forks: 23

SALT-NLP/Adaptive-Compositional-Modules

Code for the ACL 2022 paper "Continual Sequence Generation with Adaptive Compositional Modules"

Language: Python - Size: 2.85 MB - Last synced at: 3 months ago - Pushed at: about 3 years ago - Stars: 38 - Forks: 2

Srijith-rkr/KAUST-Whisper-Adapter

INTERSPEECH 2023 - Repurposing Whisper to recognize new tasks with adapters!

Language: Python - Size: 5.26 MB - Last synced at: 11 months ago - Pushed at: over 1 year ago - Stars: 28 - Forks: 2

Paranioar/UniPT

[CVPR2024] The code of "UniPT: Universal Parallel Tuning for Transfer Learning with Efficient Parameter and Memory"

Language: Python - Size: 15.6 MB - Last synced at: 12 months ago - Pushed at: 12 months ago - Stars: 49 - Forks: 0

joaopauloschuler/kEffNetV1

This repository contains the source code for the paper "Grouped Pointwise Convolutions Reduce Parameters in Convolutional Neural Networks".

Language: Jupyter Notebook - Size: 131 MB - Last synced at: about 1 year ago - Pushed at: about 1 year ago - Stars: 6 - Forks: 1
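
The paper's core idea is replacing a dense pointwise (1x1) convolution with grouped pointwise convolutions to cut the parameter count. Below is a conceptual PyTorch illustration of that saving, not the repository's Keras code.

```python
# Conceptual illustration (PyTorch, not the repository's Keras code):
# a grouped 1x1 convolution uses far fewer weight parameters than a dense one.
import torch.nn as nn

dense = nn.Conv2d(256, 256, kernel_size=1)               # 256*256 weights + 256 biases
grouped = nn.Conv2d(256, 256, kernel_size=1, groups=8)   # 8x fewer weight parameters

print(sum(p.numel() for p in dense.parameters()))    # 65792
print(sum(p.numel() for p in grouped.parameters()))  # 8448
```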

joaopauloschuler/k-neural-api

K-CAI NEURAL API - a Keras-based neural network API for creating parameter-efficient, memory-efficient, FLOPs-efficient multipath models with new layer types. Plenty of examples and documentation are included.

Language: Python - Size: 15.4 MB - Last synced at: about 1 year ago - Pushed at: about 1 year ago - Stars: 130 - Forks: 110

cliang1453/CAMERO

CAMERO: Consistency Regularized Ensemble of Perturbed Language Models with Weight Sharing (ACL 2022)

Language: Python - Size: 230 KB - Last synced at: about 2 years ago - Pushed at: about 3 years ago - Stars: 6 - Forks: 0

LeapLabTHU/Cross-Modal-Adapter

[arXiv] Cross-Modal Adapter for Text-Video Retrieval

Size: 3.39 MB - Last synced at: about 2 years ago - Pushed at: over 2 years ago - Stars: 29 - Forks: 2

Related Keywords
parameter-efficient-learning (22), parameter-efficient-tuning (10), natural-language-processing (6), pretrained-language-model (5), nlp (5), pytorch (5), deep-learning (5), transfer-learning (4), adapter (4), prompt-tuning (3), machine-learning (3), llm (3), transformers (3), lora (3), adapters (3), keras-deep-learning (2), cross-modal (2), keras-models (2), keras-neural-network (2), memory-efficient-learning (2), fine-tuning (2), keras-tutorials (2), parameter-efficient (2), prompt (2), memory-efficient-tuning (2), p-tuning (2), instruction-tuning (2), ensemble-learning (1), knowledge-distillation (1), weight-sharing (1), keras-model (1), keras-mobilenet (1), keras-jupyter-notebook (1), keras-efficientnet (1), clip (1), keras-cnn (1), automatic-speech-recognition (1), sequence-generation (1), video-text-retrieval (1), vision-and-language (1), question-answering (1), lifelong-learning (1), continual-learning (1), keras-visualization (1), keras-tutorial (1), keras-tensorflow (1), keras-neural-networks (1), machine-learning-api (1), keras-layer (1), keras-implementations (1), keras-image-classifier (1), keras-image-augmentation (1), adversarial-perturbations (1), keras-generators (1), keras-examples (1), keras-deep-dream (1), keras-classification-models (1), consistency-regularization (1), distillation (1), dropouts (1), pretrained-models (1), pretrained-language-models (1), speech (1), reprogramming (1), papers (1), awesome-list (1), nlp-library (1), pre-trained-language-models (1), few-shot-learning (1), peft (1), awesome (1), python (1), diffusion (1), multimodal-learning (1), multimodal-large-language-models (1), multimodal (1), large-language-models (1), in-context-learning (1), bert (1), multimodal-deep-learning (1), computer-vision (1), parameter-efficient-fine-tuning (1), natural-language-understanding (1), natural-language-generation (1), multi-task-learning (1), domain-adaptation (1), windows-debloat (1), windows (1), stable-diffusion (1), rlhf (1), llm-training (1), llama (1), ai (1), agent (1), multilingual (1), llama2 (1), consumer-hardware (1), code-generation (1)