GitHub / SqueezeAILab / LLMCompiler
[ICML 2024] LLMCompiler: An LLM Compiler for Parallel Function Calling
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SqueezeAILab%2FLLMCompiler
PURL: pkg:github/SqueezeAILab/LLMCompiler
Stars: 1,715
Forks: 123
Open issues: 5
License: MIT
Language: Python
Size: 375 KB
Dependencies parsed at: Pending
Created at: over 1 year ago
Updated at: 5 days ago
Pushed at: about 1 year ago
Last synced at: 4 days ago
Commit Stats
Commits: 34
Authors: 4
Mean commits per author: 8.5
Development Distribution Score: 0.088
More commit stats: https://commits.ecosyste.ms/hosts/GitHub/repositories/SqueezeAILab/LLMCompiler
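The Development Distribution Score above can be reproduced from the commit stats. A common definition (used by ecosyste.ms-style tooling) is 1 minus the top author's share of commits, so a score near 0 means one author wrote almost everything. The per-author split below is an assumption chosen to be consistent with the listed numbers (34 commits, 4 authors, score 0.088 implies the top author made roughly 31 commits):

```python
def development_distribution_score(per_author_commits):
    """1 - (top author's commits / total commits); 0 means a single author wrote everything."""
    total = sum(per_author_commits)
    return 1 - max(per_author_commits) / total

# Hypothetical split across the 4 authors, consistent with the stats above.
commits = [31, 1, 1, 1]
print(round(development_distribution_score(commits), 3))  # → 0.088
```

A score of 0.088 indicates development is heavily concentrated in one contributor, which matches the mean of 8.5 commits per author being skewed by a single dominant committer.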
Topics: efficient-inference, function-calling, large-language-models, llama, llama2, llm, llm-agent, llm-agents, llm-framework, llms, natural-language-processing, nlp, parallel-function-call, transformer
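The project's core idea, per its description, is parallel function calling: dispatching independent LLM tool calls concurrently rather than one at a time. A minimal sketch of that dispatch pattern follows; the tools (`search`, `add`) and the call list are hypothetical stand-ins, not LLMCompiler's actual API:

```python
# Sketch of parallel function-call dispatch: tool calls with no data
# dependency between them run concurrently instead of sequentially.
# The tools below are hypothetical illustrations, not LLMCompiler's API.
from concurrent.futures import ThreadPoolExecutor

def search(query):  # hypothetical tool: pretend web search
    return f"results for {query!r}"

def add(a, b):      # hypothetical tool: arithmetic
    return a + b

# Two independent calls an LLM planner might emit in one step.
calls = [(search, ("LLMCompiler",)), (add, (2, 3))]

with ThreadPoolExecutor() as pool:
    futures = [pool.submit(fn, *args) for fn, args in calls]
    results = [f.result() for f in futures]

print(results)  # → ["results for 'LLMCompiler'", 5]
```

The actual system plans a dependency graph over calls and only parallelizes the independent ones; this sketch shows just the concurrent-execution half of that idea.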