GitHub / InternLM / InternEvo
InternEvo is an open-source, lightweight training framework that aims to support model pre-training without requiring extensive dependencies.
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/InternLM%2FInternEvo
PURL: pkg:github/InternLM/InternEvo
Stars: 404
Forks: 70
Open issues: 70
License: apache-2.0
Language: Python
Size: 6.79 MB
Dependencies parsed at: Pending
Created at: over 1 year ago
Updated at: 14 days ago
Pushed at: 14 days ago
Last synced at: 14 days ago
Topics: 910b, deepspeed-ulysses, flash-attention, gemma, internlm, internlm2, llama3, llava, llm-framework, llm-training, multi-modal, pipeline-parallelism, pytorch, ring-attention, sequence-parallelism, tensor-parallelism, transformers-models, zero3