GitHub / lloydaxeph / multi_head_attention_transformer
Simple implementation of the paper "Attention Is All You Need" (https://arxiv.org/abs/1706.03762).
Stars: 0
Forks: 0
Open issues: 0
License: None
Language: Python
Size: 7.81 KB
Created at: about 1 year ago
Updated at: about 1 year ago
Pushed at: about 1 year ago
Topics: artificial-intelligence, deep-learning, large-language-models, llm, nlp-machine-learning, transformer-architecture, transformers
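The repository implements the multi-head attention mechanism from "Attention Is All You Need". Since the repo's own source is not shown here, the following is a minimal NumPy sketch of that mechanism, not the repository's actual code; all names (`multi_head_attention`, the weight matrices `wq`/`wk`/`wv`/`wo`) are illustrative assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(q, k, v, wq, wk, wv, wo, num_heads):
    """Scaled dot-product attention computed in parallel over num_heads heads.

    q: (seq_q, d_model); k, v: (seq_k, d_model); all weights: (d_model, d_model).
    """
    seq_q, d_model = q.shape
    seq_k = k.shape[0]
    d_head = d_model // num_heads
    # Project, then split the model dimension into (heads, seq, d_head)
    Q = (q @ wq).reshape(seq_q, num_heads, d_head).transpose(1, 0, 2)
    K = (k @ wk).reshape(seq_k, num_heads, d_head).transpose(1, 0, 2)
    V = (v @ wv).reshape(seq_k, num_heads, d_head).transpose(1, 0, 2)
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_head)) V, per head
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)
    weights = softmax(scores, axis=-1)
    heads = weights @ V                                  # (heads, seq_q, d_head)
    # Concatenate heads back to (seq_q, d_model) and apply the output projection
    concat = heads.transpose(1, 0, 2).reshape(seq_q, d_model)
    return concat @ wo

# Self-attention over a toy sequence with random weights
rng = np.random.default_rng(0)
d_model, num_heads, seq = 8, 2, 5
x = rng.standard_normal((seq, d_model))
wq, wk, wv, wo = (rng.standard_normal((d_model, d_model)) for _ in range(4))
out = multi_head_attention(x, x, x, wq, wk, wv, wo, num_heads)
print(out.shape)  # (5, 8): one d_model-sized vector per input position
```

The transpose-reshape dance is the standard way to run all heads in one batched matrix multiply rather than looping per head.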