An open API service providing repository metadata for many open source software ecosystems.

Topic: "flash-mla"

xlite-dev/Awesome-LLM-Inference

📚A curated list of Awesome LLM/VLM Inference Papers with code: WINT8/4, FlashAttention, PagedAttention, Parallelism, MLA, etc.

Language: Python - Size: 115 MB - Last synced at: about 1 hour ago - Pushed at: about 2 hours ago - Stars: 4,100 - Forks: 283

xlite-dev/ffpa-attn

📚FFPA (Split-D): extends FlashAttention with Split-D for large headdim, achieving O(1) GPU SRAM complexity; 1.8x~3x↑🎉 faster than SDPA EA.

Language: Cuda - Size: 4.21 MB - Last synced at: 7 days ago - Pushed at: 30 days ago - Stars: 183 - Forks: 8
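The core idea behind the Split-D approach named above is that attention scores can be accumulated over chunks of the head dimension, since Q·Kᵀ decomposes as a sum of partial products along D. A minimal NumPy sketch of that decomposition follows; this is an illustrative assumption about the technique, not the repository's actual CUDA kernel, and the function name and chunk size are hypothetical:

```python
import numpy as np

def attention_split_d(Q, K, V, d_chunk=32):
    """Naive attention with scores accumulated over head-dim chunks.

    Illustrative only: shows that S = Q @ K.T can be built as a sum of
    partial products over slices of the head dimension D, so each step
    only needs a D-independent slice of Q and K in fast memory.
    """
    N, D = Q.shape
    S = np.zeros((N, N))
    # Accumulate partial scores chunk by chunk along the head dimension.
    for start in range(0, D, d_chunk):
        end = min(start + d_chunk, D)
        S += Q[:, start:end] @ K[:, start:end].T
    S /= np.sqrt(D)
    # Standard numerically stable softmax over the key axis.
    P = np.exp(S - S.max(axis=-1, keepdims=True))
    P /= P.sum(axis=-1, keepdims=True)
    return P @ V
```

The chunked result is mathematically identical to computing the full Q·Kᵀ in one shot; the point of the split is that each inner step touches only a `d_chunk`-wide slice, which is what keeps per-step on-chip memory use independent of the total head dimension.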