GitHub / musty-ess / Masked-Language-Model-Using-BERT
This project implements a Masked Language Model using BERT, a transformer-based model developed by Google, to predict masked words in text sequences.
Stars: 0
Forks: 0
Open issues: 0
License: None
Language: Python
Size: 124 KB
Created at: 8 months ago
Updated at: 8 months ago
Pushed at: 8 months ago
Last synced at: about 2 months ago
Topics: ai, artificial-intelligence, bert, bert-model, language-model, masked-language-models, masked-word-prediction, natural-language-processing, nlp, python, tensorflow, transformers, transformers-models, visualization
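The repository's topics suggest masked-word prediction built on BERT with TensorFlow and the transformers library. The snippet below is a minimal sketch of that idea, not the project's actual code: the checkpoint name, example sentence, and top-k count are assumptions for illustration.

```python
# Hedged sketch: masked-word prediction with BERT via Hugging Face transformers
# (TensorFlow backend). Model name and inputs are illustrative assumptions.
import tensorflow as tf
from transformers import AutoTokenizer, TFBertForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = TFBertForMaskedLM.from_pretrained("bert-base-uncased")

# Sentence containing a single [MASK] token to be filled in by the model.
text = f"The capital of France is {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="tf")

# Forward pass: logits have shape (batch, sequence_length, vocab_size).
logits = model(**inputs).logits

# Locate the [MASK] position and take the top-3 vocabulary predictions there.
mask_positions = tf.where(inputs["input_ids"][0] == tokenizer.mask_token_id)
mask_index = int(mask_positions[0][0])
top_k = tf.math.top_k(logits[0, mask_index], k=3)

for token_id, score in zip(top_k.indices.numpy(), top_k.values.numpy()):
    print(tokenizer.decode([token_id]).strip(), float(score))
```

Running the sketch prints the three most likely fill-ins for the masked slot (for this sentence, "paris" typically ranks first) along with their raw logit scores.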