gazelle93/Attention-Various-Positional-Encoding
This project implements the scaled dot-product attention layer and the multi-head attention layer using various positional encoding methods.
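As a minimal sketch of the two layers the description names, the following PyTorch snippet implements scaled dot-product attention, a multi-head wrapper, and a fixed sinusoidal positional encoding standing in for one of the encoding methods. All function and class names are hypothetical illustrations, not the repository's actual API; the repo's topics suggest it also covers relative positional representations (e.g. T5-style relative biases), which this sketch omits.

```python
# Hypothetical sketch of scaled dot-product / multi-head attention with
# absolute sinusoidal positional encoding; names do not come from this repo.
import math
import torch
import torch.nn.functional as F


def sinusoidal_positional_encoding(seq_len: int, dim: int) -> torch.Tensor:
    """Fixed sine/cosine positional encoding ("Attention Is All You Need")."""
    position = torch.arange(seq_len).unsqueeze(1)                    # (seq_len, 1)
    div_term = torch.exp(torch.arange(0, dim, 2) * (-math.log(10000.0) / dim))
    pe = torch.zeros(seq_len, dim)
    pe[:, 0::2] = torch.sin(position * div_term)                     # even dims
    pe[:, 1::2] = torch.cos(position * div_term)                     # odd dims
    return pe                                                        # (seq_len, dim)


def scaled_dot_product_attention(q, k, v):
    """softmax(Q K^T / sqrt(d_k)) V"""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)                # (..., seq, seq)
    weights = F.softmax(scores, dim=-1)
    return weights @ v                                               # (..., seq, d_k)


class MultiHeadAttention(torch.nn.Module):
    """Split the model dimension into num_heads parallel attention heads."""

    def __init__(self, dim: int, num_heads: int):
        super().__init__()
        assert dim % num_heads == 0, "dim must be divisible by num_heads"
        self.num_heads = num_heads
        self.q_proj = torch.nn.Linear(dim, dim)
        self.k_proj = torch.nn.Linear(dim, dim)
        self.v_proj = torch.nn.Linear(dim, dim)
        self.out_proj = torch.nn.Linear(dim, dim)

    def forward(self, x):
        b, t, d = x.shape
        h = self.num_heads
        # Project, then reshape (b, t, d) -> (b, h, t, d/h) per head.
        q = self.q_proj(x).view(b, t, h, d // h).transpose(1, 2)
        k = self.k_proj(x).view(b, t, h, d // h).transpose(1, 2)
        v = self.v_proj(x).view(b, t, h, d // h).transpose(1, 2)
        out = scaled_dot_product_attention(q, k, v)                  # (b, h, t, d/h)
        # Merge heads back to (b, t, d) and apply the output projection.
        return self.out_proj(out.transpose(1, 2).reshape(b, t, d))


# Usage: add positional information to token embeddings, then attend.
torch.manual_seed(0)
seq_len, dim = 10, 16
x = torch.randn(1, seq_len, dim)                                     # token embeddings
x = x + sinusoidal_positional_encoding(seq_len, dim)                 # absolute PE
mha = MultiHeadAttention(dim, num_heads=4)
print(mha(x).shape)                                                  # torch.Size([1, 10, 16])
```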
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/gazelle93%2FAttention-Various-Positional-Encoding
PURL: pkg:github/gazelle93/Attention-Various-Positional-Encoding
Stars: 5
Forks: 0
Open issues: 1
License: None
Language: Python
Size: 82 KB
Dependencies parsed: pending
Created at: about 3 years ago
Updated at: 8 months ago
Pushed at: about 3 years ago
Last synced at: 3 months ago
Topics: attention-mechanism, gensim, multi-head-attention, natural-language-processing, nlp, nltk, pytorch, relative-positional-encoding, relative-positional-representation, scaled-dot-product, spacy, t5, wordembeddings