Topic: "inference-library"
yas-sim/openvino-ep-enabled-onnxruntime
Describes how to enable the OpenVINO Execution Provider for ONNX Runtime
Language: C++ - Size: 18.6 MB - Last synced at: 17 days ago - Pushed at: almost 5 years ago - Stars: 19 - Forks: 1

amrmorsey/Latte
Latte is a convolutional neural network (CNN) inference engine written in C++ that uses AVX to vectorize operations. The engine runs on Windows 10, Linux, and macOS Sierra.
Language: C++ - Size: 239 KB - Last synced at: about 2 years ago - Pushed at: almost 7 years ago - Stars: 12 - Forks: 0

loreweaver-xyz/llm-weaver
Rust library for managing long conversations with any LLM
Language: Rust - Size: 1.59 MB - Last synced at: 29 days ago - Pushed at: about 1 month ago - Stars: 8 - Forks: 1

TianyouLi/Inference-Library-for-JavaScript
Unified JavaScript API for scoring via various DL frameworks
Language: Jupyter Notebook - Size: 70.1 MB - Last synced at: almost 2 years ago - Pushed at: over 6 years ago - Stars: 8 - Forks: 5

yas-sim/pyopenvino
Experimental Python implementation of the OpenVINO Inference Engine (very slow, limited functionality). All code is written in Python, making it easy to read and modify.
Language: Python - Size: 63 MB - Last synced at: 17 days ago - Pushed at: about 3 years ago - Stars: 7 - Forks: 1

tinyBigGAMES/Logan
Local generative AI, unleashed. It signals a fundamental shift: no more cloud lock-in or opaque black-box services. You run powerful generative AI models entirely on your own machine, giving you full control over latency, privacy, and customization.
Size: 498 KB - Last synced at: 1 day ago - Pushed at: 9 days ago - Stars: 5 - Forks: 1

pfnet-research/node-menoh
Node.js binding for the Menoh DNN inference library
Language: JavaScript - Size: 6.3 MB - Last synced at: 26 days ago - Pushed at: about 4 years ago - Stars: 5 - Forks: 0

rcodin/cnn_code
CNN project practice code
Language: C++ - Size: 86.9 KB - Last synced at: about 2 years ago - Pushed at: about 7 years ago - Stars: 0 - Forks: 0
