GitHub / SuperbTUM / Faster-Distributed-Training
Faster large mini-batch distributed training without squeezing devices
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SuperbTUM%2FFaster-Distributed-Training
Stars: 0
Forks: 1
Open issues: 1
License: None
Language: Python
Size: 601 KB
Dependencies parsed at: Pending
Created at: over 2 years ago
Updated at: almost 2 years ago
Pushed at: almost 2 years ago
Last synced at: 7 days ago
Topics: apex, distributed-training, fairscale, fusion, mixup, natural-gradients, onnx-runtime
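The topics list names mixup among the techniques this repo builds on. Since the dependency parse is still pending, here is a minimal, framework-agnostic sketch of mixup itself; the function name, shapes, and `alpha` default are illustrative assumptions, not taken from the repo:

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=0.2, rng=None):
    # Blend two samples and their one-hot labels with a Beta(alpha, alpha) weight.
    # (Hypothetical helper for illustration; not the repo's API.)
    rng = rng or np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)
    x = lam * x1 + (1.0 - lam) * x2
    y = lam * y1 + (1.0 - lam) * y2
    return x, y, lam

# Toy usage: two 4-pixel "images" with 3-class one-hot labels.
x_a, y_a = np.ones(4), np.array([1.0, 0.0, 0.0])
x_b, y_b = np.zeros(4), np.array([0.0, 0.0, 1.0])
x, y, lam = mixup(x_a, y_a, x_b, y_b)
assert np.allclose(x, lam * x_a + (1 - lam) * x_b)
assert np.isclose(y.sum(), 1.0)  # mixed label still sums to 1
```

In large mini-batch training, mixup is commonly paired with mixed precision (apex) and sharded training (fairscale) to keep generalization while scaling the batch size.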