GitHub topics: lxmert
hila-chefer/Transformer-MM-Explainability
[ICCV 2021 Oral] Official PyTorch implementation of "Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers", a novel method for visualizing any Transformer-based network. Includes examples for DETR and VQA.
Language: Jupyter Notebook - Size: 25.3 MB - Last synced at: 28 days ago - Pushed at: over 1 year ago - Stars: 847 - Forks: 110

Muennighoff/vilio
🥶Vilio: State-of-the-art VL models in PyTorch & PaddlePaddle
Language: Python - Size: 10.4 MB - Last synced at: about 1 month ago - Pushed at: almost 2 years ago - Stars: 88 - Forks: 28

phiyodr/plxmert Fork of airsplay/lxmert
PyTorch code for the NAACL 2022 Findings paper "Probing the Role of Positional Information in Vision-Language Models".
Language: Python - Size: 433 KB - Last synced at: 12 months ago - Pushed at: almost 3 years ago - Stars: 0 - Forks: 0

itsShnik/adaptively-finetuning-transformers
Adaptively fine-tuning transformer-based models for multiple domains and multiple tasks
Language: Python - Size: 7.79 MB - Last synced at: about 2 years ago - Pushed at: over 2 years ago - Stars: 3 - Forks: 2

guoyang9/LXMERT-VQACP
An adaptation of LXMERT for both the VQA-CP and VQA datasets.
Language: Python - Size: 3.39 MB - Last synced at: about 2 years ago - Pushed at: over 2 years ago - Stars: 3 - Forks: 1
