# gemma-2-2b-it.cs

This project implements int8 CPU inference for Gemma 2 2B (instruction-tuned) in pure C#. It is a port of a Rust repository, produced with Gemini 2.5 Pro Preview, and it can be built and run with the provided batch files. 🐙💻
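The repository's details aren't shown here, but the core idea of int8 inference is to store weights and activations as signed bytes plus a floating-point scale, and to accumulate dot products in 32-bit integers. A minimal sketch of symmetric per-tensor int8 quantization, assuming nothing about this repo's actual layout (the class and method names below are illustrative, not taken from the project):

```csharp
using System;

public static class Int8Quant
{
    // Symmetric per-tensor quantization: scale = max|x| / 127,
    // so values map into the sbyte range [-127, 127].
    public static (sbyte[] q, float scale) Quantize(float[] x)
    {
        float maxAbs = 0f;
        foreach (var v in x) maxAbs = MathF.Max(maxAbs, MathF.Abs(v));
        float scale = maxAbs > 0f ? maxAbs / 127f : 1f; // guard against all-zero input
        var q = new sbyte[x.Length];
        for (int i = 0; i < x.Length; i++)
            q[i] = (sbyte)Math.Round(x[i] / scale);
        return (q, scale);
    }

    // int8 dot product accumulated in int32, then rescaled back to float.
    public static float Dot(sbyte[] a, float sa, sbyte[] b, float sb)
    {
        int acc = 0;
        for (int i = 0; i < a.Length; i++) acc += a[i] * b[i];
        return acc * sa * sb;
    }

    public static void Main()
    {
        var x = new float[] { 0.5f, -1.0f, 0.25f, 0.75f };
        var w = new float[] { 1.0f, 0.5f, -0.5f, 0.25f };
        var (qx, sx) = Quantize(x);
        var (qw, sw) = Quantize(w);
        // Result is close to the float dot product (0.0625), within quantization error.
        Console.WriteLine(Dot(qx, sx, qw, sw));
    }
}
```

Accumulating in `int` avoids overflow for realistic vector lengths (127 × 127 × n stays within `int` for n up to ~133 000), and keeping one scale per tensor is the simplest scheme; per-row or per-block scales trade a little memory for noticeably better accuracy.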
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/agoSantiago97%2Fgemma-2-2b-it.cs
PURL: pkg:github/agoSantiago97/gemma-2-2b-it.cs
Stars: 2
Forks: 0
Open issues: 0
License: None
Language: C#
Size: 15.6 KB
Dependencies parsed at: Pending
Created at: about 1 month ago
Updated at: 16 days ago
Pushed at: 16 days ago
Last synced at: 16 days ago
Topics: cpu-inference, csharp, gemma, gemma2, gemma2-2b-it, inference, inference-engine, int8, int8-inference, int8-quantization, llm, llm-inference, llm-serving, quantization