GitHub / jakobhoeg / nextjs-ollama-llm-ui
Fully-featured web interface for Ollama LLMs
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/jakobhoeg%2Fnextjs-ollama-llm-ui (see the fetch sketch below the metadata)
Stars: 1,184
Forks: 284
Open issues: 16
License: MIT
Language: TypeScript
Size: 5.87 MB
Dependencies parsed at: Pending
Created at: over 1 year ago
Updated at: about 1 month ago
Pushed at: 3 months ago
Last synced at: 30 days ago
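The metadata above can also be retrieved programmatically from the JSON API endpoint listed earlier. The sketch below assumes a standard JSON response from the ecosyste.ms repositories endpoint; the field names (full_name, stargazers_count, forks_count, language) are assumptions based on typical GitHub-style payloads and may differ from the actual schema.

```typescript
// Sketch: fetch repository metadata from the ecosyste.ms JSON API.
// Field names in the response are assumed, not confirmed against the real schema.
const url =
  "http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/jakobhoeg%2Fnextjs-ollama-llm-ui";

async function fetchRepoMetadata(): Promise<void> {
  const res = await fetch(url);
  if (!res.ok) {
    throw new Error(`Request failed: ${res.status} ${res.statusText}`);
  }
  const repo = await res.json();
  // Log a few of the fields shown above (assumed property names).
  console.log(repo.full_name, repo.stargazers_count, repo.forks_count, repo.language);
}

fetchRepoMetadata().catch(console.error);
```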
Commit Stats
Commits: 84
Authors: 9
Mean commits per author: 9.33
Development Distribution Score: 0.155
More commit stats: https://commits.ecosyste.ms/hosts/GitHub/repositories/jakobhoeg/nextjs-ollama-llm-ui
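As a point of reference, the sketch below shows how a score like the one above can be derived, assuming the Development Distribution Score is 1 minus the share of commits made by the most active author (the exact ecosyste.ms formula may differ). The per-author breakdown is hypothetical; only the totals (84 commits, 9 authors) and the resulting score (~0.155) match the stats above.

```typescript
// Sketch: Development Distribution Score as 1 - (top author's commits / total commits).
function developmentDistributionScore(commitsPerAuthor: number[]): number {
  const total = commitsPerAuthor.reduce((sum, n) => sum + n, 0);
  const topAuthor = Math.max(...commitsPerAuthor);
  return 1 - topAuthor / total;
}

// Hypothetical split: 9 authors, 84 commits, with the leading author at 71.
const commits = [71, 4, 2, 2, 1, 1, 1, 1, 1];
console.log(developmentDistributionScore(commits).toFixed(3)); // ≈ 0.155
```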
Topics: ai, chatbot, gemma, llm, local, localstorage, mistral, mistral-7b, nextjs, nextjs14, offline, ollama, openai, react, shadcn, tailwindcss, typescript
Funding Links: https://github.com/sponsors/jakobhoeg