Topic: "graph-knowledge-distillation"
LirongWu/FF-G2M
Code for AAAI 2023 (Oral) paper "Extracting Low-/High-Frequency Knowledge from Graph Neural Networks and Injecting it into MLPs: An Effective GNN-to-MLP Distillation Framework"
Language: Python - Size: 69.3 KB - Last synced at: 11 days ago - Pushed at: 9 months ago - Stars: 24 - Forks: 3
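
The repo implements the paper's full-frequency (low- plus high-pass) distillation; as orientation, here is a minimal sketch of the generic GNN-to-MLP objective such frameworks build on — a temperature-scaled KL term against the GNN teacher's soft labels plus cross-entropy on ground truth. The paper's frequency-decomposed terms are its own addition and are not shown. The function and argument names (`distill_loss`, `tau`, `lam`) are illustrative assumptions, not the repo's API.

```python
import torch.nn.functional as F

def distill_loss(student_logits, teacher_logits, labels, tau=2.0, lam=0.5):
    """Generic GNN-to-MLP distillation loss (sketch, not FF-G2M itself)."""
    # Soft-label term: KL between temperature-softened teacher and student,
    # rescaled by tau^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / tau, dim=-1),
        F.softmax(teacher_logits / tau, dim=-1),
        reduction="batchmean",
    ) * tau * tau
    # Hard-label term: ordinary cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return lam * soft + (1 - lam) * hard
```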

LirongWu/KRD
Code for ICML 2023 paper "Quantifying the Knowledge in GNNs for Reliable Distillation into MLPs"
Language: Python - Size: 7.81 MB - Last synced at: 21 days ago - Pushed at: almost 2 years ago - Stars: 17 - Forks: 4
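
KRD's contribution is a principled measure of how much reliable knowledge each node carries before distilling it into the MLP. A rough stand-in for that idea, assuming nothing about the paper's actual estimator: weight each node's distillation term by the teacher's normalized confidence (one minus normalized entropy).

```python
import math
import torch.nn.functional as F

def confidence_weights(teacher_logits, eps=1e-8):
    # Stand-in reliability score: 1 for a fully confident (near one-hot)
    # teacher prediction, 0 for a uniform one. KRD's actual estimator differs.
    p = F.softmax(teacher_logits, dim=-1)
    entropy = -(p * (p + eps).log()).sum(dim=-1)
    return 1.0 - entropy / math.log(p.size(-1))

def weighted_kd_loss(student_logits, teacher_logits, tau=2.0):
    w = confidence_weights(teacher_logits)
    # Per-node KL, then a reliability-weighted mean instead of a flat one.
    kl = F.kl_div(
        F.log_softmax(student_logits / tau, dim=-1),
        F.softmax(teacher_logits / tau, dim=-1),
        reduction="none",
    ).sum(dim=-1) * tau * tau
    return (w * kl).mean()
```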

LirongWu/TGS
Code for TKDE paper "A Teacher-Free Graph Knowledge Distillation Framework with Dual Self-Distillation"
Language: Python - Size: 806 KB - Last synced at: 11 days ago - Pushed at: 9 months ago - Stars: 7 - Forks: 4
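
Teacher-free here means the model supervises itself, with no separately pretrained GNN. One plausible minimal version of such a signal, purely as an assumed illustration (the paper's dual self-distillation design is more involved): treat the detached mean of a node's neighbors' predictions as that node's soft target.

```python
import torch
import torch.nn.functional as F

def neighbor_self_distill(logits, edge_index, tau=1.0):
    """Teacher-free target: mean of neighbors' detached predictions (sketch)."""
    src, dst = edge_index  # assumed COO edge list of shape [2, num_edges]
    p = F.softmax(logits.detach() / tau, dim=-1)
    # Average incoming neighbors' class distributions per node.
    agg = torch.zeros_like(p)
    deg = torch.zeros(p.size(0), device=p.device)
    agg.index_add_(0, dst, p[src])
    deg.index_add_(0, dst, torch.ones(src.size(0), device=p.device))
    agg = agg / deg.clamp(min=1).unsqueeze(-1)
    return F.kl_div(
        F.log_softmax(logits / tau, dim=-1), agg, reduction="batchmean"
    ) * tau * tau
```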

LirongWu/HGMD
Code for CIKM 2024 paper "Teach Harder, Learn Poorer: Rethinking Hard Sample Distillation for GNN-to-MLP Knowledge Distillation"
Language: Python - Size: 7.9 MB - Last synced at: 2 months ago - Pushed at: 2 months ago - Stars: 2 - Forks: 1
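
The title's point is that the hardest teacher samples can hurt the MLP student. A crude way to act on that observation, with hardness proxied by teacher entropy (the paper's hardness-aware mechanism is different and is what the repo actually implements): keep only the easiest fraction of nodes in the distillation loss.

```python
import torch.nn.functional as F

def hardness_filtered_kd(student_logits, teacher_logits, keep_ratio=0.8, tau=2.0):
    # Proxy hardness by teacher predictive entropy, then distill only on the
    # easiest keep_ratio fraction of nodes.
    p = F.softmax(teacher_logits, dim=-1)
    hardness = -(p * p.clamp_min(1e-8).log()).sum(dim=-1)
    k = max(1, int(keep_ratio * hardness.numel()))
    keep = hardness.topk(k, largest=False).indices  # indices of easiest nodes
    return F.kl_div(
        F.log_softmax(student_logits[keep] / tau, dim=-1),
        F.softmax(teacher_logits[keep] / tau, dim=-1),
        reduction="batchmean",
    ) * tau * tau
```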

snoai/magi-markdown
MAGI: Markdown for Agent Guidance & Instruction - A markdown extension designed for AI systems. MAGI augments standard markdown with structured metadata, embedded AI instructions, and explicit document relationships, bridging human-readable content and LLM/agent processing. Well suited to RAG and KAG.
Language: TypeScript - Size: 512 KB - Last synced at: 3 days ago - Pushed at: 3 days ago - Stars: 0 - Forks: 0
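
MAGI's concrete syntax is defined in the repo (which is TypeScript); purely to illustrate the general pattern it describes — machine-readable metadata and instructions embedded in an otherwise human-readable markdown document — here is a hypothetical frontmatter example with a minimal parser. The field names (`ai-instruction`, `related`) are invented for the example, not MAGI's.

```python
import re

# Hypothetical metadata-enriched markdown; not MAGI's actual syntax.
DOC = """---
title: Onboarding Guide
ai-instruction: summarize for a new engineer
related: ./architecture.md
---
# Onboarding
Welcome to the project.
"""

def split_frontmatter(text):
    # Peel off a leading '---'-fenced key: value block from the body.
    m = re.match(r"^---\n(.*?)\n---\n(.*)$", text, re.DOTALL)
    if not m:
        return {}, text
    pairs = (line.split(":", 1) for line in m.group(1).splitlines() if ":" in line)
    return {k.strip(): v.strip() for k, v in pairs}, m.group(2)

meta, body = split_frontmatter(DOC)
print(meta["ai-instruction"])  # -> summarize for a new engineer
```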
