GitHub / algorithmicsuperintelligence / optillm
Optimizing inference proxy for LLMs
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/algorithmicsuperintelligence%2Foptillm
PURL: pkg:github/algorithmicsuperintelligence/optillm
Stars: 3,118
Forks: 243
Open issues: 12
License: Apache-2.0
Language: Python
Size: 3.08 MB
Dependencies parsed at: Pending
Created at: about 1 year ago
Updated at: 3 days ago
Pushed at: 3 days ago
Last synced at: 3 days ago
Topics: agent, agentic-ai, agentic-framework, agentic-workflow, agents, api-gateway, chain-of-thought, genai, large-language-models, llm, llm-inference, llmapi, mixture-of-experts, moa, monte-carlo-tree-search, openai, openai-api, optimization, prompt-engineering, proxy-server
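Given the description ("Optimizing inference proxy for LLMs") and the `openai-api` / `proxy-server` topics, the proxy presumably accepts OpenAI-style chat-completion requests. A minimal sketch of building such a request payload follows; the local endpoint URL and the convention of selecting an optimization approach by prefixing it to the model slug (e.g. `moa-gpt-4o-mini`) are assumptions for illustration, not details taken from this listing:

```python
import json

# Hypothetical local proxy endpoint -- an assumption based on the
# proxy-server topic, not verified from this listing.
OPTILLM_URL = "http://localhost:8000/v1/chat/completions"

def build_request(approach: str, model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completions payload.

    Assumption: the optimization approach (e.g. "moa" for
    mixture-of-agents) is chosen by prefixing it to the model name.
    """
    return {
        "model": f"{approach}-{model}",
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("moa", "gpt-4o-mini", "Summarize chain-of-thought prompting.")
print(json.dumps(payload, indent=2))
```

Any OpenAI-compatible client could then POST this payload to the proxy, which would apply the selected inference-time optimization before forwarding to the upstream model.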