GitHub / dwiaskor99 / contrastive-distillation
CAST is a semi-supervised instance segmentation method that efficiently trains a compact model using both labeled and unlabeled data. This repository contains the implementation of our three-stage pipeline, which combines contrastive adaptation and distillation techniques.
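The description pairs a distillation objective with contrastive learning. As a rough illustration of how such a combined objective can look, here is a minimal NumPy sketch of a teacher-student distillation loss plus an InfoNCE-style contrastive loss. The function names, temperature values, and formulation below are illustrative assumptions, not the repository's actual API or the CAST method itself:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between temperature-softened teacher and student
    # distributions, scaled by T^2 as in standard knowledge distillation.
    p = softmax(teacher_logits / T)
    q = softmax(student_logits / T)
    kl = np.sum(p * (np.log(p + 1e-9) - np.log(q + 1e-9)), axis=-1)
    return float(np.mean(kl) * T * T)

def info_nce_loss(student_emb, teacher_emb, tau=0.1):
    # InfoNCE: each student embedding should match its own teacher
    # counterpart (the diagonal) against all other teacher embeddings.
    s = student_emb / np.linalg.norm(student_emb, axis=1, keepdims=True)
    t = teacher_emb / np.linalg.norm(teacher_emb, axis=1, keepdims=True)
    logits = s @ t.T / tau  # (N, N) cosine-similarity matrix
    log_probs = logits - np.log(np.sum(np.exp(logits), axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))

def combined_loss(student_logits, teacher_logits,
                  student_emb, teacher_emb, alpha=0.5):
    # Hypothetical weighted sum of the two objectives.
    return (alpha * distillation_loss(student_logits, teacher_logits)
            + (1.0 - alpha) * info_nce_loss(student_emb, teacher_emb))
```

In a semi-supervised setting, a loss of this shape can be applied to unlabeled images, since it needs only teacher predictions rather than ground-truth masks; consult the repository code for the actual objectives used.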
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/dwiaskor99%2Fcontrastive-distillation
PURL: pkg:github/dwiaskor99/contrastive-distillation
Stars: 0
Forks: 0
Open issues: 0
License: MIT
Size: 3.2 MB
Dependencies parsed at: Pending
Created at: about 2 months ago
Updated at: about 1 month ago
Pushed at: about 1 month ago
Last synced at: about 1 month ago
Topics: 3d-representation-learning, benchmark, computer-vision, contrastive-cot-prompting, data-free, dense-contrastive-learning, distillation-contrastive-decoding, graph-neural-networks, grounding-dino, image-based-person-re-id, knowledge-dist, model-compression, neurips-2023, person-re-identification, sam2, self-supervised-learning, semi-supervised-knowledge-distillation, vision-foundation-model