GitHub / varunvasudeva1 / llm-server-docs
Documentation on setting up an LLM server on Debian from scratch, using Ollama/vLLM, Open WebUI, OpenedAI Speech/Kokoro FastAPI, and ComfyUI.
Stars: 435
Forks: 37
Open issues: 1
License: MIT
Size: 32.2 KB
Created at: about 1 year ago
Updated at: 6 days ago
Pushed at: 6 days ago
Topics: comfyui, debian, linux, llm, ollama, open-webui, openedai-speech, server, vllm
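The repository documents serving local models with Ollama or vLLM behind Open WebUI. As an illustration of what such a server exposes (a minimal sketch, not taken from the repo itself), the Python snippet below queries a default Ollama install on localhost:11434; the model name is an assumed example.

```python
# Minimal sketch: send a prompt to a local Ollama server's generate endpoint.
# Assumes a default Ollama install listening on localhost:11434 and that a
# model (here "llama3", an assumed example name) has already been pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama port

payload = json.dumps({
    "model": "llama3",              # assumption: replace with a model you have pulled
    "prompt": "Say hello in one sentence.",
    "stream": False,                # return one JSON object instead of a stream
}).encode("utf-8")

request = urllib.request.Request(
    OLLAMA_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    body = json.loads(response.read())

print(body["response"])  # the generated text
```

Because Open WebUI and the other components in the stack talk to the same Ollama (or OpenAI-compatible vLLM) endpoint, a quick request like this is a simple way to confirm the backend is up before wiring in the front-end services.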