GitHub topics: distilbert-fine-tuning
SayamAlt/Mental-Health-Classification-using-fine-tuned-DistilBERT
Successfully built a multiclass text classification model by fine-tuning a pretrained DistilBERT transformer to classify distinct mental health statuses such as anxiety, stress, and personality disorder, reaching 77% accuracy. A minimal fine-tuning sketch is shown below this entry.
Language: Jupyter Notebook - Size: 2.07 MB - Last synced at: 7 months ago - Pushed at: 8 months ago - Stars: 0 - Forks: 0
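
The repository's notebook is not reproduced here; as a rough, hedged sketch of the approach it describes (fine-tuning DistilBERT for multiclass text classification with Hugging Face Transformers), assuming the public distilbert-base-uncased checkpoint and placeholder texts and labels:

```python
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

# Placeholder label set and toy in-memory data; the repo uses a real
# mental-health statements corpus with more classes.
labels = ["anxiety", "stress", "personality disorder"]
label2id = {l: i for i, l in enumerate(labels)}
data = Dataset.from_dict({
    "text": ["I can't stop worrying", "Work has been overwhelming",
             "I feel detached from everyone around me"],
    "label": [0, 1, 2],
})

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)
data = data.map(tokenize, batched=True)

# Sequence-classification head on top of the pretrained DistilBERT encoder.
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased",
    num_labels=len(labels),
    id2label={i: l for l, i in label2id.items()},
    label2id=label2id,
)

args = TrainingArguments(output_dir="out", num_train_epochs=3,
                         per_device_train_batch_size=16, learning_rate=2e-5)
Trainer(model=model, args=args, train_dataset=data).train()
```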

SayamAlt/Luxury-Apparel-Product-Category-Classification-using-fine-tuned-DistilBERT
Successfully developed a multiclass text classification model by fine-tuning a pretrained DistilBERT transformer to classify luxury apparel products into their respective categories, e.g. pants, accessories, underwear, and shoes.
Language: Jupyter Notebook - Size: 3.7 MB - Last synced at: 7 months ago - Pushed at: 8 months ago - Stars: 0 - Forks: 0

unnamed-catalyst/Fine-Tuning-Transformers
Code for a comparative analysis of the performance of fine-tuned transformer models on climate change data. The transformer models compared were BERT, DistilBERT, and RoBERTa; a sketch of such a comparison loop follows below.
Language: Jupyter Notebook - Size: 229 KB - Last synced at: 3 months ago - Pushed at: 10 months ago - Stars: 0 - Forks: 0
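
A hedged sketch of one way to run such a comparison: fine-tune each checkpoint on the same dataset and record held-out accuracy. The checkpoint IDs are standard Hub names, but the dataset, hyperparameters, and metric here are illustrative assumptions, not the repository's actual setup.

```python
import numpy as np
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

# Placeholder two-class climate data; the repo uses its own labelled corpus.
train_ds = Dataset.from_dict({"text": ["warming is accelerating", "no change observed"],
                              "label": [1, 0]})
eval_ds = Dataset.from_dict({"text": ["sea levels keep rising"], "label": [1]})

def accuracy(eval_pred):
    preds = np.argmax(eval_pred.predictions, axis=-1)
    return {"accuracy": float((preds == eval_pred.label_ids).mean())}

results = {}
for ckpt in ["bert-base-uncased", "distilbert-base-uncased", "roberta-base"]:
    tok = AutoTokenizer.from_pretrained(ckpt)
    tokenize = lambda b: tok(b["text"], truncation=True, padding="max_length", max_length=64)
    trainer = Trainer(
        model=AutoModelForSequenceClassification.from_pretrained(ckpt, num_labels=2),
        args=TrainingArguments(output_dir=f"out-{ckpt}", num_train_epochs=1,
                               per_device_train_batch_size=8),
        train_dataset=train_ds.map(tokenize, batched=True),
        eval_dataset=eval_ds.map(tokenize, batched=True),
        compute_metrics=accuracy,
    )
    trainer.train()
    results[ckpt] = trainer.evaluate()["eval_accuracy"]

print(results)  # per-checkpoint accuracy on the held-out split
```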

abirmondal/multi-label-hate-speech-classification
In this project we attempt multi-label hate-speech classification in Bengali and Hindi using fill-mask (masked language model) transformer checkpoints as the underlying encoders. A minimal multi-label sketch follows below.
Language: Jupyter Notebook - Size: 2.12 MB - Last synced at: over 1 year ago - Pushed at: over 1 year ago - Stars: 0 - Forks: 3
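
A minimal sketch of the multi-label setup, assuming a multilingual masked-LM checkpoint (bert-base-multilingual-cased stands in for whichever fill-mask models the repository actually uses) and toy placeholder labels and texts:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

labels = ["hate", "offensive", "targeted"]  # placeholder label set
tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased",
    num_labels=len(labels),
    problem_type="multi_label_classification",  # per-label BCE-with-logits loss
)

texts = ["<Bengali or Hindi sentence>"]
# Multi-hot float targets: each text can carry several labels at once.
targets = torch.tensor([[1.0, 1.0, 0.0]])

enc = tokenizer(texts, truncation=True, padding=True, return_tensors="pt")
out = model(**enc, labels=targets)
out.loss.backward()  # a full fine-tuning loop with an optimizer would follow

# At inference, sigmoid + threshold gives independent per-label decisions.
pred = (torch.sigmoid(out.logits) > 0.5).int()
```

With problem_type set to multi_label_classification the model uses a BCE-with-logits loss, so labels are not mutually exclusive, which is what distinguishes this setup from the single-label classifiers in the entries above.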
