GitHub / bentoml / BentoML
The easiest way to serve AI apps and models - Build Model Inference APIs, Job queues, LLM apps, Multi-model pipelines, and more!
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/bentoml%2FBentoML
PURL: pkg:github/bentoml/BentoML
Stars: 7,914
Forks: 859
Open issues: 139
License: apache-2.0
Language: Python
Size: 97.6 MB
Dependencies parsed at: Pending
Created at: over 6 years ago
Updated at: about 23 hours ago
Pushed at: about 13 hours ago
Last synced at: about 11 hours ago
Commit Stats
Commits: 2382
Authors: 164
Mean commits per author: 14.52
Development Distribution Score: 0.77
More commit stats: https://commits.ecosyste.ms/hosts/GitHub/repositories/bentoml/BentoML
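The summary statistics above can be reproduced from the raw totals on this page. A minimal sketch follows; note that the Development Distribution Score formula used here, 1 - (top author's commits / total commits), is ecosyste.ms' published definition, and since the top author's commit count is not listed on this page, the script only back-solves the value that the listed score implies rather than asserting the real one.

```python
# Raw totals taken from the commit stats listed above.
total_commits = 2382
total_authors = 164
listed_dds = 0.77  # Development Distribution Score shown on this page

# Mean commits per author is a simple ratio of the two totals.
mean_commits_per_author = total_commits / total_authors
print(f"{mean_commits_per_author:.2f}")  # → 14.52

# DDS = 1 - (top author's commits / total commits), so the listed score of
# 0.77 implies the most active author contributed roughly 23% of commits.
# This is an inferred figure, not a value reported on this page.
implied_top_author_commits = round((1 - listed_dds) * total_commits)
print(implied_top_author_commits)  # → 548
```

A higher DDS means commits are spread more evenly across contributors; 0.77 indicates that no single author dominates the project's history.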
Topics: ai-inference, deep-learning, generative-ai, inference-platform, llm, llm-inference, llm-serving, llmops, machine-learning, ml-engineering, mlops, model-inference-service, model-serving, multimodal, python