GitHub / gbaptista / ollama-ai
A Ruby gem for interacting with Ollama's API, allowing you to run open-source LLMs (Large Language Models) locally.
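Typical usage, as a minimal sketch: it assumes the gem exposes an Ollama client wrapping the local server's REST endpoints (such as generate), and the address, model name, and options shown are illustrative, not taken from this page.

require 'ollama-ai'

# Point the client at a locally running Ollama server (default port 11434).
client = Ollama.new(
  credentials: { address: 'http://localhost:11434' },
  options: { server_sent_events: true }
)

# Ask a locally hosted model to complete a prompt.
result = client.generate(
  { model: 'llama2',
    prompt: 'Hello!' }
)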
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/gbaptista%2Follama-ai
PURL: pkg:github/gbaptista/ollama-ai
Stars: 234
Forks: 10
Open issues: 4
License: MIT
Language: Ruby
Size: 123 KB
Dependencies parsed at: Pending
Created at: over 1 year ago
Updated at: 8 days ago
Pushed at: about 1 year ago
Last synced at: 2 days ago
Commit Stats
Commits: 16
Authors: 2
Mean commits per author: 8.0
Development Distribution Score: 0.063
More commit stats: https://commits.ecosyste.ms/hosts/GitHub/repositories/gbaptista/ollama-ai
Topics: ai, alpaca, bakllava, dolphin, llama, llama2, llava, llm, mistral, mistral-ai, mixtral, nano-bots, ollama, ollama-api, openorca, vicuna