GitHub / EmbeddedLLM / embeddedllm
EmbeddedLLM: API server for Embedded Device Deployment. Currently supports CUDA/OpenVINO/IpexLLM/DirectML/CPU
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/EmbeddedLLM%2Fembeddedllm (see the fetch example after the metadata below)
Stars: 37
Forks: 1
Open issues: 8
License: None
Language: Python
Size: 12.6 MB
Dependencies parsed at: Pending
Created at: 10 months ago
Updated at: 9 days ago
Pushed at: 7 months ago
Last synced at: 2 days ago
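
The metadata above can be retrieved programmatically from the ecosyste.ms JSON API endpoint listed earlier. The following is a minimal sketch using the Python requests library; the response field names (full_name, stargazers_count, language) are assumptions about the schema and should be checked against the actual JSON returned.

import requests

# JSON API endpoint for this repository, as listed above.
API_URL = (
    "http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/"
    "EmbeddedLLM%2Fembeddedllm"
)

def fetch_repo_metadata(url: str = API_URL) -> dict:
    """Fetch the repository metadata record from the ecosyste.ms JSON API."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    data = fetch_repo_metadata()
    # Field names below are assumed, not verified; print data.keys()
    # to inspect what the API actually returns.
    print(data.get("full_name"))
    print(data.get("stargazers_count"), "stars")
    print(data.get("language"))
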
Commit Stats
Commits: 24
Authors: 4
Mean commits per author: 6.0
Development Distribution Score: 0.583
More commit stats: https://commits.ecosyste.ms/hosts/GitHub/repositories/EmbeddedLLM/embeddedllm
Topics: aipc, cpu, directml, directx-12, gemma, ipexllm, llama, llm, llm-inference, llm-serving, mistral, model-inference, npu, open-source-llm, openvino, openvino-inference-engine, phi-3, windows