GitHub topics: iterative-back-translation
wrmthorne/cycleformers
A Python library for efficient and flexible cycle-consistency training of transformer models via iterative back-translation. Memory- and compute-efficient techniques such as PEFT adapter switching allow models up to 7.5x larger to be trained on the same hardware.
Language: Python - Size: 2.43 MB - Last synced at: about 1 month ago - Pushed at: 4 months ago - Stars: 11 - Forks: 0
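The description refers to iterative back-translation: two translation models generate synthetic training data for each other from monolingual text, in alternating directions. A minimal, dependency-free sketch of that loop is below; the function names and toy "models" are illustrative only, not the cycleformers API.

```python
def iterative_back_translation(model_ab, model_ba, mono_a, mono_b,
                               train_step, rounds=2):
    """Alternate directions: each model back-translates monolingual
    target-side text into synthetic (source, target) pairs that are
    used to train the other model (cycle-consistency training)."""
    for _ in range(rounds):
        # B -> A model generates synthetic sources; A -> B model trains.
        for b in mono_b:
            train_step(model_ab, src=model_ba(b), tgt=b)
        # A -> B model generates synthetic sources; B -> A model trains.
        for a in mono_a:
            train_step(model_ba, src=model_ab(a), tgt=a)

# Toy stand-ins: "translation" is just uppercasing / lowercasing.
model_ab = str.upper  # "A -> B" model
model_ba = str.lower  # "B -> A" model
updates = []          # record the supervised pairs each model sees

def train_step(model, src, tgt):
    updates.append((model, src, tgt))

iterative_back_translation(model_ab, model_ba,
                           mono_a=["cat"], mono_b=["DOG"],
                           train_step=train_step, rounds=2)
print(len(updates))  # 2 rounds x 2 monolingual sentences = 4 steps
```

In the actual library, the two "models" can share a single base transformer with two PEFT adapters that are switched between directions, which is where the memory savings come from.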
