An open API service providing repository metadata for many open source software ecosystems.

GitHub / withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
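
The JSON-schema enforcement mentioned in the description constrains token generation so the model can only produce output matching a given schema. Below is a minimal TypeScript sketch assuming the library's v3-style API (`getLlama`, `createGrammarForJsonSchema`, `LlamaChatSession`); the model path and the schema are placeholders, and exact signatures may differ by version, so consult the project's documentation.

```typescript
// Minimal sketch, assuming node-llama-cpp's v3-style API; paths and schema are placeholders.
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: "models/my-model.gguf" // placeholder path to a local GGUF model
});
const context = await model.createContext();
const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});

// Constrain generation so the output always conforms to this JSON schema
const grammar = await llama.createGrammarForJsonSchema({
    type: "object",
    properties: {
        summary: {type: "string"},
        sentiment: {enum: ["positive", "neutral", "negative"]}
    }
});

const response = await session.prompt("Summarize: the build finished without errors.", {
    grammar
});
const parsed = grammar.parse(response); // object guaranteed to match the schema
console.log(parsed);
```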

JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/withcatai%2Fnode-llama-cpp
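The endpoint above returns the metadata shown on this page as JSON. A small sketch using the global fetch available in Node.js 18+; the field names read from the response (`full_name`, `stargazers_count`) are assumptions about the payload shape, so inspect the raw object for the actual keys.

```typescript
// Fetch this repository's metadata from the ecosyste.ms JSON API (Node.js 18+ global fetch).
const url =
    "http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/withcatai%2Fnode-llama-cpp";

const response = await fetch(url);
if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
}

const repo = await response.json();
// Field names below are assumptions about the response shape; log `repo` to see the full payload.
console.log(repo.full_name, repo.stargazers_count);
```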

Stars: 1,490
Forks: 125
Open issues: 11

License: MIT
Language: TypeScript
Size: 21.8 MB
Dependencies parsed at: Pending

Created at: almost 2 years ago
Updated at: 6 days ago
Pushed at: 5 days ago
Last synced at: 4 days ago

Commit Stats

Commits: 174
Authors: 6
Mean commits per author: 29.0
Development Distribution Score: 0.063
More commit stats: https://commits.ecosyste.ms/hosts/GitHub/repositories/withcatai/node-llama-cpp
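
The Development Distribution Score above is commonly computed as 1 minus the share of commits made by the most active author, so a score of 0.063 across 174 commits implies the top contributor made roughly 163 of them. A small sketch of that calculation, assuming this definition; the per-author commit counts are illustrative, not the repository's actual breakdown.

```typescript
// Development Distribution Score: 1 - (commits by the most active author / total commits).
// The per-author counts are illustrative only; they sum to the reported 174 commits by 6 authors.
const commitsPerAuthor = [163, 5, 2, 2, 1, 1];

const totalCommits = commitsPerAuthor.reduce((sum, n) => sum + n, 0);
const topAuthorCommits = Math.max(...commitsPerAuthor);
const dds = 1 - topAuthorCommits / totalCommits;

console.log(totalCommits);   // 174
console.log(dds.toFixed(3)); // "0.063" (1 - 163/174 ≈ 0.0632)
```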

Topics: ai, bindings, catai, cmake, cmake-js, cuda, embedding, function-calling, gguf, gpu, grammar, json-schema, llama, llama-cpp, llm, metal, nodejs, prebuilt-binaries, self-hosted, vulkan

Funding Links: https://github.com/sponsors/giladgd