GitHub / arnavdantuluri / long-context-transformers
A repository of methods for training transformers to handle longer contexts in causal language models; most of these methods are still being tested. Try them out if you'd like, but please let me know your results so we don't duplicate work :)
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/arnavdantuluri%2Flong-context-transformers
PURL: pkg:github/arnavdantuluri/long-context-transformers
Stars: 5
Forks: 2
Open issues: 0
License: None
Language: Python
Size: 188 KB
Dependencies parsed at: Pending
Created at: about 2 years ago
Updated at: over 1 year ago
Pushed at: about 2 years ago
Last synced at: over 1 year ago
Topics: attention-mechanisms, finetuning, long-context-attention, long-context-transformers
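As a convenience, here is a minimal sketch of how the JSON API endpoint listed above could be queried using only Python's standard library. The field names printed at the end (`full_name`, `stargazers_count`, `forks_count`, `language`) are assumptions about the ecosyste.ms response schema, not documented guarantees.

```python
import json
import urllib.request

# ecosyste.ms repository metadata endpoint listed above.
API_URL = (
    "http://repos.ecosyste.ms/api/v1/hosts/GitHub/"
    "repositories/arnavdantuluri%2Flong-context-transformers"
)

def fetch_repo_metadata(url: str = API_URL) -> dict:
    """Fetch the repository metadata record and return it as a dict."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

if __name__ == "__main__":
    meta = fetch_repo_metadata()
    # Print a few fields; the exact key names are an assumption about the
    # ecosyste.ms response and may differ in practice.
    for key in ("full_name", "stargazers_count", "forks_count", "language"):
        print(key, "->", meta.get(key))
```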