GitHub / SayamAlt / English-to-Spanish-Language-Translation-using-Seq2Seq-and-Attention
Successfully built a Seq2Seq model with attention that performs English-to-Spanish translation with an accuracy of almost 97%.
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/SayamAlt%2FEnglish-to-Spanish-Language-Translation-using-Seq2Seq-and-Attention
PURL: pkg:github/SayamAlt/English-to-Spanish-Language-Translation-using-Seq2Seq-and-Attention
Stars: 0
Forks: 0
Open issues: 0
License: MIT
Language: Jupyter Notebook
Size: 1.18 MB
Dependencies parsed at: Pending
Created at: about 1 year ago
Updated at: about 1 year ago
Pushed at: about 1 year ago
Last synced at: 2 months ago
Topics: attention-is-all-you-need, attention-model, bert-transformer, exploratory-data-analysis, fine-tuning-bert, hugging-face-transformers, language-translation, luong-attention, model-architecture-and-implementation, model-inference, model-training-and-evaluation, natural-language-processing, neural-machine-translation, seq2seq-modeling, text-generation, text-preprocessing, text-tokenization
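The description and topics above name a Seq2Seq model with Luong attention. As a minimal sketch of the core attention step only (not the repository's actual implementation), the following NumPy snippet assumes the dot-product variant of Luong scoring: score each encoder state against the current decoder state, softmax the scores, and form a context vector as the weighted sum of encoder states.

```python
import numpy as np

def luong_dot_attention(decoder_hidden, encoder_outputs):
    """Luong (multiplicative) dot-product attention, illustrative only.

    decoder_hidden:  (hidden_dim,)          current decoder state
    encoder_outputs: (src_len, hidden_dim)  encoder states for the source sentence
    Returns (context, weights).
    """
    # Alignment scores: dot product of the decoder state with each encoder state.
    scores = encoder_outputs @ decoder_hidden            # (src_len,)
    scores = scores - scores.max()                       # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()      # softmax over source positions
    # Context vector: attention-weighted sum of encoder states.
    context = weights @ encoder_outputs                  # (hidden_dim,)
    return context, weights

# Toy example: three source positions, hidden size 2.
enc = np.array([[1.0, 0.0],
                [0.0, 1.0],
                [1.0, 1.0]])
dec = np.array([1.0, 0.0])
ctx, w = luong_dot_attention(dec, enc)
```

In a full decoder, the context vector would be concatenated with the decoder state before predicting the next target token; the `general` and `concat` Luong variants differ only in how the alignment scores are computed.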