An open API service providing repository metadata for many open source software ecosystems.

GitHub / Lance0218 / Pytorch-DistributedDataParallel-Training-Tricks

A guide that integrates PyTorch DistributedDataParallel, Apex, warmup, and a learning-rate scheduler, and also covers setting up early stopping and random seeds.
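The warmup plus learning-rate-scheduling combination the guide names can be illustrated with a small, framework-free sketch. This is my own illustration of the general pattern (linear warmup followed by cosine decay), not code from the repository; the function name and default values are hypothetical.

```python
import math

def lr_at_step(step, base_lr=0.1, warmup_steps=100, total_steps=1000):
    """Learning rate at a given step: linear warmup, then cosine decay.

    A generic sketch of the warmup + scheduler pattern; names and
    defaults are illustrative, not taken from the repository.
    """
    if step < warmup_steps:
        # Linear warmup: ramp from base_lr/warmup_steps up to base_lr.
        return base_lr * (step + 1) / warmup_steps
    # Cosine decay from base_lr down to 0 over the remaining steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

In PyTorch, the same shape can be handed to `torch.optim.lr_scheduler.LambdaLR` as a multiplicative factor, so the warmup and decay live in one scheduler object.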

JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Lance0218%2FPytorch-DistributedDataParallel-Training-Tricks
PURL: pkg:github/Lance0218/Pytorch-DistributedDataParallel-Training-Tricks

Stars: 45
Forks: 8
Open issues: 0

License: MIT
Language: Python
Size: 101 MB
Dependencies parsed at: Pending

Created at: almost 5 years ago
Updated at: over 2 years ago
Pushed at: about 3 years ago
Last synced at: over 2 years ago

Topics: apex, distributed, early-stopping, learning-rate-scheduling, pytorch, pytorch-distributeddataparallel, random-seeds, warmup
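Among the topics listed, early stopping is the most self-contained to sketch. A minimal, framework-free version of the usual patience-based setup is shown below; the class name and parameters are illustrative assumptions, not the repository's implementation.

```python
class EarlyStopping:
    """Stop training when the monitored validation loss stops improving.

    A minimal sketch of the early-stopping setup the guide mentions;
    the class name and parameters here are hypothetical.
    """

    def __init__(self, patience=5, min_delta=0.0):
        self.patience = patience    # epochs to wait after the last improvement
        self.min_delta = min_delta  # minimum decrease that counts as improvement
        self.best = float("inf")
        self.counter = 0
        self.should_stop = False

    def step(self, val_loss):
        """Record one epoch's validation loss; return True once patience runs out."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.counter = 0
        else:
            self.counter += 1
            if self.counter >= self.patience:
                self.should_stop = True
        return self.should_stop
```

Typical usage is to call `step(val_loss)` at the end of each epoch and break out of the training loop when it returns True.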
