GitHub topics: distillation-loss
AnanyaP-WDW/PepEmb
A transformer-based masked language model for learning amino acid sequence representations. The model uses self-attention mechanisms with custom gating and incorporates protein features for enhanced sequence understanding. Trained using BERT-style masking on peptide sequences to learn contextual amino acid embeddings.
Language: Python - Size: 47.9 KB - Last synced at: 5 months ago - Pushed at: 5 months ago - Stars: 2 - Forks: 1
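The "BERT-style masking" this entry mentions is the standard 80/10/10 corruption scheme applied to residues instead of word pieces. A minimal sketch of that idea for amino acid sequences follows; the vocabulary, mask rate, and function names here are illustrative assumptions, not PepEmb's actual code.

```python
import random

# The 20 standard amino acids plus a mask token (illustrative;
# PepEmb's real vocabulary and special tokens may differ).
AMINO_ACIDS = list("ACDEFGHIKLMNPQRSTVWY")
MASK_TOKEN = "<mask>"

def bert_mask(sequence, mask_prob=0.15, seed=None):
    """Apply BERT-style masking to a peptide sequence.

    Each residue is selected with probability `mask_prob`; a selected
    residue is replaced by <mask> 80% of the time, swapped for a random
    amino acid 10% of the time, and left unchanged 10% of the time.
    Returns the corrupted tokens and the indices the model must predict.
    """
    rng = random.Random(seed)
    tokens = list(sequence)
    targets = []  # positions whose original residue is the training label
    for i, _ in enumerate(tokens):
        if rng.random() < mask_prob:
            targets.append(i)
            roll = rng.random()
            if roll < 0.8:
                tokens[i] = MASK_TOKEN
            elif roll < 0.9:
                tokens[i] = rng.choice(AMINO_ACIDS)
            # else: keep the original residue (the model still predicts it)
    return tokens, targets
```

The 10% "keep" branch matters: it forces the model to produce useful representations even for unmasked positions, since any position might be a prediction target.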

iamilyasedunov/key_word_spotting
Language: Python - Size: 884 KB - Last synced at: about 2 years ago - Pushed at: over 3 years ago - Stars: 5 - Forks: 2

szq0214/S2-BNN
S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021)
Language: Python - Size: 240 KB - Last synced at: over 2 years ago - Pushed at: almost 4 years ago - Stars: 53 - Forks: 11
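S2-BNN trains a binary student by matching its output distribution to a real-valued teacher's, which is the classic temperature-softened distillation loss at its core. The sketch below shows that loss in its standard form (Hinton-style KL with a T² scale); it is a plain-Python illustration of the general technique, not S2-BNN's actual calibration objective.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) between temperature-softened distributions.

    The T^2 factor compensates for the 1/T^2 shrinkage of gradients
    when logits are divided by the temperature.
    """
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

A higher temperature exposes more of the teacher's "dark knowledge" (the relative probabilities of wrong classes), which is exactly the signal a heavily quantized student needs.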
