PipeInfer: Accelerating LLM Inference using Asynchronous Pipelined Speculation
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/AutonomicPerfectionist%2FPipeInfer
PURL: pkg:github/AutonomicPerfectionist/PipeInfer
Stars: 29
Forks: 4
Open issues: 1
License: MIT
Language: C++
Size: 17.5 MB
Dependencies parsed at: Pending
Created at: over 1 year ago
Updated at: 5 months ago
Pushed at: 10 months ago
Last synced at: 5 months ago
Commit Stats
Commits: 1641
Authors: 411
Mean commits per author: 3.99
Development Distribution Score: 0.749
More commit stats: https://commits.ecosyste.ms/hosts/GitHub/repositories/AutonomicPerfectionist/PipeInfer
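The aggregate stats above can be reproduced from raw commit counts. As a minimal sketch: the mean is simply total commits over authors, and the Development Distribution Score is assumed here to follow the commonly used definition DDS = 1 - (commits by the most active author / total commits); the top contributor's commit count (~412) is a hypothetical value back-solved to match the listed score of 0.749.

```python
def mean_commits_per_author(total_commits: int, authors: int) -> float:
    """Average commits per author, rounded as displayed in the listing."""
    return round(total_commits / authors, 2)

def development_distribution_score(top_author_commits: int, total_commits: int) -> float:
    """Assumed formula: 1 minus the share of commits by the busiest author.

    A score near 1 means work is spread across many contributors;
    near 0 means one author dominates.
    """
    return round(1 - top_author_commits / total_commits, 3)

print(mean_commits_per_author(1641, 411))            # 3.99, as listed above
# Hypothetical top-contributor count chosen to illustrate the listed 0.749:
print(development_distribution_score(412, 1641))     # 0.749
```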
Topics: inference, llamacpp, llm, speculative-decoding
Funding Links: https://github.com/sponsors/AutonomicPerfectionist