GitHub: sathishkumar67/Knowledge-Distillation-Implementation
Model Compression using Knowledge Distillation
Stars: 0
Forks: 0
Open issues: 0
License: MIT
Language: Python
Size: 9.57 MB
Dependencies parsed at: Pending
Created at: 10 months ago
Updated at: 10 months ago
Pushed at: 10 months ago
Last synced at: 10 months ago
Topics: deep-learning, knowledge-distillation, model-compression
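The repository's contents are not shown here, so as an illustration only, below is a minimal sketch of the standard knowledge-distillation loss (Hinton et al., 2015) that a "model compression using knowledge distillation" project in PyTorch would typically use. The function name, temperature, and alpha weighting are assumptions, not the repository's actual code.

```python
# Minimal knowledge-distillation loss sketch (assumed, not taken from the repo).
# Model definitions, data loading, and the training loop are omitted.
import torch
import torch.nn.functional as F


def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 4.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Blend soft-target KL divergence with hard-label cross-entropy."""
    # Soften both distributions with the temperature before comparing them.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradients stay comparable across temperatures.
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
    # Standard cross-entropy against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

In a typical setup, the teacher runs in eval mode under torch.no_grad() to produce teacher_logits, and only the smaller student model is updated with this combined loss.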