Topic: "lr-scheduling"
wangg12/flat_anneal_scheduler.pytorch
A flat-and-anneal learning rate scheduler for PyTorch, with optional warmup and cyclic modes
Language: Python - Size: 6.84 KB - Last synced at: 22 days ago - Pushed at: over 4 years ago - Stars: 6 - Forks: 0
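
The flat-and-anneal pattern (warm up, hold the base LR flat, then decay) can be sketched with stock PyTorch's LambdaLR; the warmup length, anneal split point, and cosine decay below are illustrative assumptions, not this repo's actual interface.

```python
# Minimal sketch of a warmup -> flat -> cosine-anneal schedule using PyTorch's
# LambdaLR. Split points and step counts are placeholders, not the repo's API.
import math
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

total_steps = 1000
warmup_steps = 50     # linear warmup phase
anneal_start = 0.72   # fraction of training after which annealing begins

def lr_lambda(step):
    if step < warmup_steps:
        return (step + 1) / warmup_steps     # ramp linearly up to the base LR
    if step < anneal_start * total_steps:
        return 1.0                           # flat phase at the base LR
    # cosine-anneal from the base LR down to 0 over the remaining steps
    remaining = total_steps - anneal_start * total_steps
    progress = (step - anneal_start * total_steps) / remaining
    return 0.5 * (1.0 + math.cos(math.pi * progress))

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda)

for step in range(total_steps):
    optimizer.step()
    scheduler.step()
```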

NaquibAlam/LightGBM-and-Xgboost-advanced-examples
Contains examples covering how to train incrementally, implement a learning-rate scheduler, and implement custom objective and evaluation functions for LightGBM/XGBoost models.
Language: Jupyter Notebook - Size: 242 KB - Last synced at: about 1 year ago - Pushed at: over 4 years ago - Stars: 3 - Forks: 0
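
In LightGBM, a per-iteration learning-rate schedule can be attached through the reset_parameter callback; the sketch below uses that documented callback, but the decay factor and the random data are placeholders, not taken from this repo's notebooks.

```python
# Minimal sketch of a learning-rate schedule in LightGBM via the
# reset_parameter callback. Decay rate and data are illustrative only.
import lightgbm as lgb
import numpy as np

X = np.random.rand(500, 10)
y = np.random.rand(500)
train_set = lgb.Dataset(X, label=y)

# Decay the learning rate by 1% each boosting round, starting from 0.1.
lr_schedule = lgb.reset_parameter(
    learning_rate=lambda round_num: 0.1 * (0.99 ** round_num)
)

booster = lgb.train(
    params={"objective": "regression", "learning_rate": 0.1},
    train_set=train_set,
    num_boost_round=100,
    callbacks=[lr_schedule],
)
```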

csvance/onecycle-cosine
Cosine Annealed 1cycle Policy for PyTorch
Language: Python - Size: 142 KB - Last synced at: about 1 year ago - Pushed at: almost 5 years ago - Stars: 1 - Forks: 0
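
This repo ships its own cosine-annealed 1cycle implementation; for context, a comparable schedule is available in stock PyTorch through OneCycleLR with anneal_strategy="cos". The hyperparameters below are illustrative, not the repo's defaults.

```python
# Sketch of a cosine-annealed 1cycle schedule using PyTorch's built-in
# OneCycleLR, shown only as a reference point for the policy this repo implements.
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer,
    max_lr=0.1,              # peak learning rate at the top of the cycle
    total_steps=1000,        # one scheduler.step() per optimizer.step()
    pct_start=0.3,           # fraction of the cycle spent increasing the LR
    anneal_strategy="cos",   # cosine annealing on the way back down
)

for _ in range(1000):
    optimizer.step()
    scheduler.step()
```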

kardasbart/MultiLR
A method for assigning separate learning rate schedulers to different parameter groups in a model.
Language: Python - Size: 7.81 KB - Last synced at: almost 2 years ago - Pushed at: almost 2 years ago - Stars: 0 - Forks: 0
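
The underlying idea, different schedules for different parameter groups, can also be sketched with stock PyTorch, since LambdaLR accepts one lambda per parameter group; this is only an illustration of the concept, not the MultiLR API.

```python
# Sketch of per-parameter-group scheduling with stock PyTorch: the backbone and
# head below follow different schedules. Groups and schedules are placeholders.
import torch

backbone = torch.nn.Linear(10, 10)
head = torch.nn.Linear(10, 2)

optimizer = torch.optim.SGD(
    [
        {"params": backbone.parameters(), "lr": 0.01},
        {"params": head.parameters(), "lr": 0.1},
    ]
)

scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer,
    lr_lambda=[
        lambda epoch: 0.95 ** epoch,               # slow exponential decay for the backbone
        lambda epoch: 1.0 if epoch < 10 else 0.1,  # step drop for the head after 10 epochs
    ],
)

for _ in range(20):
    optimizer.step()
    scheduler.step()
```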

gokul-pv/AdvancedTrainingConcepts
Class activation maps, Weight Updates, Optimizers & LR Schedulers
Language: Jupyter Notebook - Size: 3.23 MB - Last synced at: about 2 years ago - Pushed at: over 2 years ago - Stars: 0 - Forks: 0
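
As background for the "Optimizers & LR Schedulers" topic covered there, the sketch below shows the usual interaction in a PyTorch training loop: the optimizer steps every batch, the scheduler steps once per epoch. The model, data, and schedule are placeholders, not taken from the notebooks.

```python
# Illustrative optimizer/scheduler interaction: per-batch optimizer steps,
# per-epoch scheduler steps. All hyperparameters and data are placeholders.
import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.5)
loss_fn = torch.nn.MSELoss()

data = [(torch.randn(32, 10), torch.randn(32, 2)) for _ in range(8)]

for epoch in range(15):
    for x, y in data:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()      # update weights every batch
    scheduler.step()          # halve the LR every 5 epochs
    print(epoch, scheduler.get_last_lr())
```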

NaquibAlam/TheMisfits
Language: Jupyter Notebook - Size: 1.08 MB - Last synced at: about 1 year ago - Pushed at: almost 4 years ago - Stars: 0 - Forks: 0
