GitHub / pcompieta / basic-llm-wrapper-cli-flask
A basic LLM wrapper with three consumers: a CLI, a Flask web service, and a load/scaling test. Works with Hugging Face models such as Llama 2 and TinyLlama (among others).
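To make the "Flask consumer wrapping a Hugging Face model" idea concrete, here is a minimal sketch of what such a wrapper could look like. The model ID, route, and payload shape are illustrative assumptions, not code taken from this repository.

```python
# Minimal sketch of a Flask wrapper around a Hugging Face text-generation model.
# Model ID, route, and request/response shapes are assumptions for illustration only.
from flask import Flask, jsonify, request
from transformers import pipeline

app = Flask(__name__)

# TinyLlama is small enough to run on CPU; any compatible causal-LM ID would work here.
generator = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

@app.route("/generate", methods=["POST"])
def generate():
    prompt = request.get_json(force=True).get("prompt", "")
    # Generate a completion and return both the prompt and the generated text.
    output = generator(prompt, max_new_tokens=128)[0]["generated_text"]
    return jsonify({"prompt": prompt, "completion": output})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

A consumer would then POST a JSON body like `{"prompt": "..."}` to `/generate`; a CLI consumer could reuse the same `pipeline` object in-process instead of going over HTTP.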
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/pcompieta%2Fbasic-llm-wrapper-cli-flask
PURL: pkg:github/pcompieta/basic-llm-wrapper-cli-flask
Stars: 0
Forks: 0
Open issues: 0
License: None
Language: Python
Size: 17.6 KB
Dependencies parsed at: Pending
Created at: over 1 year ago
Updated at: over 1 year ago
Pushed at: over 1 year ago
Last synced at: 4 days ago
Commit Stats
Commits: 11
Authors: 2
Mean commits per author: 5.5
Development Distribution Score: 0.273
More commit stats: https://commits.ecosyste.ms/hosts/GitHub/repositories/pcompieta/basic-llm-wrapper-cli-flask
Topics: cli, flask, llama2, llm, loadtest, performance-testing, python, stress-test, tinyllama, wrapper
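The loadtest, performance-testing, and stress-test topics suggest a harness that fires concurrent requests at the wrapper and measures latency. A minimal sketch of that idea, assuming the hypothetical `/generate` endpoint from the earlier example (URL, payload, and concurrency level are illustrative, not the repository's actual test):

```python
# Illustrative load/stress-test sketch against a local Flask LLM endpoint.
# URL, payload, and worker count are assumptions for demonstration only.
import time
from concurrent.futures import ThreadPoolExecutor

import requests

URL = "http://localhost:5000/generate"  # hypothetical endpoint from the sketch above

def one_request(i: int) -> float:
    """Send one prompt and return its end-to-end latency in seconds."""
    start = time.perf_counter()
    requests.post(URL, json={"prompt": f"Request {i}: say hello"}, timeout=120)
    return time.perf_counter() - start

if __name__ == "__main__":
    # Fire 32 requests with 8 concurrent workers and report simple latency stats.
    with ThreadPoolExecutor(max_workers=8) as pool:
        latencies = list(pool.map(one_request, range(32)))
    print(f"mean latency: {sum(latencies) / len(latencies):.2f}s, max: {max(latencies):.2f}s")
```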