GitHub / Maximilian-Winter / llama-cpp-agent
The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs). It allows users to chat with LLMs, execute structured function calls, and get structured output. It also works with models that are not fine-tuned for JSON output or function calling.
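As a rough illustration of the chat use case described above, here is a minimal sketch assuming the LlamaCppAgent, LlamaCppPythonProvider, and MessagesFormatterType names shown in the project's documentation; the model path is a placeholder and constructor parameters may differ between versions.

```python
# Minimal sketch: chatting with a local GGUF model through llama-cpp-agent.
# Names follow the project's documented API, but exact signatures can vary
# by version, so treat this as illustrative rather than a drop-in snippet.
from llama_cpp import Llama
from llama_cpp_agent import LlamaCppAgent, MessagesFormatterType
from llama_cpp_agent.providers import LlamaCppPythonProvider

# Load a local GGUF model with llama-cpp-python (path is a placeholder).
llama_model = Llama(model_path="models/model.Q4_K_M.gguf", n_ctx=4096)

# Wrap the model in a provider and create the agent with a chat template.
provider = LlamaCppPythonProvider(llama_model)
agent = LlamaCppAgent(
    provider,
    system_prompt="You are a helpful assistant.",
    predefined_messages_formatter_type=MessagesFormatterType.CHATML,
)

# Ask a question and print the model's reply.
print(agent.get_chat_response("Explain structured output in one sentence."))
```

The same agent object is what the framework builds on for structured output and function calling, where a schema or a set of Python functions constrains the model's response.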
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Maximilian-Winter%2Fllama-cpp-agent
Stars: 559
Forks: 61
Open issues: 22
License: other
Language: Python
Size: 5.64 MB
Dependencies parsed at: Pending
Created at: over 1 year ago
Updated at: 5 days ago
Pushed at: 3 months ago
Last synced at: 4 days ago
Commit Stats
Commits: 468
Authors: 6
Mean commits per author: 78.0
Development Distribution Score: 0.079
More commit stats: https://commits.ecosyste.ms/hosts/GitHub/repositories/Maximilian-Winter/llama-cpp-agent
Topics: agents, function-calling, llamacpp, llm, llm-agent, llm-framework, llms, parallel-function-call