GitHub / substratusai / kubeai
AI Inference Operator for Kubernetes. The easiest way to serve ML models in production. Supports VLMs, LLMs, embeddings, and speech-to-text.
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/substratusai%2Fkubeai
PURL: pkg:github/substratusai/kubeai
Stars: 1,021
Forks: 97
Open issues: 110
License: apache-2.0
Language: Go
Size: 16.7 MB
Dependencies parsed at: Pending
Created at: almost 2 years ago
Updated at: 16 days ago
Pushed at: 15 days ago
Last synced at: 14 days ago
Commit Stats
Commits: 178
Authors: 11
Mean commits per author: 16.18
Development Distribution Score: 0.449
More commit stats: https://commits.ecosyste.ms/hosts/GitHub/repositories/substratusai/kubeai
Topics: ai, autoscaler, faster-whisper, inference-operator, k8s, kubernetes, llm, ollama, ollama-operator, openai-api, vllm, vllm-operator, whisper
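The description and the openai-api, vllm, and ollama topics above indicate that KubeAI serves deployed models behind an OpenAI-compatible HTTP API inside the cluster. Below is a minimal sketch of what a client call might look like, assuming an in-cluster base URL of http://kubeai/openai/v1 and a deployed model named "my-llm"; both names are illustrative assumptions, not values taken from this page or the repository.

```go
// Minimal sketch of calling an OpenAI-compatible chat-completions
// endpoint such as the one KubeAI advertises (openai-api topic).
// The base URL and model name are assumptions for illustration.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"log"
	"net/http"
)

// chatRequest mirrors the minimal fields of an OpenAI-style
// /v1/chat/completions request body.
type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
}

type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

func main() {
	// Hypothetical in-cluster base URL; adjust to wherever the
	// KubeAI Service is actually exposed in your cluster.
	baseURL := "http://kubeai/openai/v1"

	body, err := json.Marshal(chatRequest{
		Model: "my-llm", // placeholder: the name of a model you deployed
		Messages: []chatMessage{
			{Role: "user", Content: "Say hello from Kubernetes."},
		},
	})
	if err != nil {
		log.Fatal(err)
	}

	resp, err := http.Post(baseURL+"/chat/completions", "application/json", bytes.NewReader(body))
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// Print the raw JSON response; a real client would decode the
	// choices[].message.content field instead.
	out, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(out))
}
```

Because the endpoint follows the OpenAI wire format, an existing OpenAI SDK can typically be pointed at the same base URL instead of hand-rolling HTTP as above.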