Ecosyste.ms: Repos
An open API service providing repository metadata for many open source software ecosystems.
GitHub / intel-analytics / ipex-llm
Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, Baichuan, Mixtral, Gemma, etc.) on Intel CPU and GPU (e.g., local PC with iGPU, discrete GPU such as Arc, Flex and Max). A PyTorch LLM library that seamlessly integrates with llama.cpp, Ollama, HuggingFace, LangChain, LlamaIndex, DeepSpeed, vLLM, FastChat, etc.
JSON API: https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/intel-analytics%2Fipex-llm
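The endpoint above follows the pattern `hosts/{host}/repositories/{owner%2Fname}`, with the slash in the repository's full name percent-encoded. A minimal sketch of building and fetching that URL with the standard library (the field layout of the returned JSON is not shown here, so no specific keys are assumed):

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

API_ROOT = "https://repos.ecosyste.ms/api/v1"

def repo_metadata_url(host: str, full_name: str) -> str:
    # "owner/name" must be percent-encoded so the slash becomes %2F,
    # matching the JSON API link above.
    return f"{API_ROOT}/hosts/{host}/repositories/{quote(full_name, safe='')}"

def fetch_metadata(host: str, full_name: str) -> dict:
    # Plain GET; the endpoint returns a JSON object of repository fields.
    with urlopen(repo_metadata_url(host, full_name)) as resp:
        return json.load(resp)

print(repo_metadata_url("GitHub", "intel-analytics/ipex-llm"))
# → https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/intel-analytics%2Fipex-llm
```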
Stars: 6,069
Forks: 1,208
Open Issues: 1,055
License: apache-2.0
Language: Python
Repo Size: 228 MB
Dependencies: 0
Created: over 7 years ago
Updated: 1 day ago
Last pushed: 1 day ago
Last synced: about 24 hours ago
Commit Stats
Commits: 18,964
Authors: 206
Mean commits per author: 92.06
Development Distribution Score: 0.9
More commit stats: https://commits.ecosyste.ms/hosts/GitHub/repositories/intel-analytics/ipex-llm
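The Development Distribution Score above is commonly computed as one minus the most active author's share of total commits, so 0 means a single author wrote everything and values near 1 mean work is spread across many contributors. A minimal sketch under that assumption (the per-author counts below are hypothetical, not the real ipex-llm data):

```python
def development_distribution_score(commits_per_author: list[int]) -> float:
    # DDS = 1 - (commits by the most active author / total commits),
    # assuming the usual definition used for this metric.
    total = sum(commits_per_author)
    return 1 - max(commits_per_author) / total

# Hypothetical per-author commit counts for illustration only:
sample = [1896, 950, 700] + [100] * 30
print(round(development_distribution_score(sample), 2))
```

Note the metric ignores how the remaining commits are distributed; it only measures concentration in the single top author.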
Topics: gpu, llm, pytorch, transformers