GitHub / musty-ess / Masked-Language-Model-Using-BERT
This project implements a masked language model using BERT, Google's transformer-based model, to predict masked (hidden) words in text sequences.
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/musty-ess%2FMasked-Language-Model-Using-BERT
PURL: pkg:github/musty-ess/Masked-Language-Model-Using-BERT
Stars: 0
Forks: 0
Open issues: 0
License: None
Language: Python
Size: 124 KB
Dependencies parsed at: Pending
Created at: 8 months ago
Updated at: 8 months ago
Pushed at: 8 months ago
Last synced at: 2 months ago
Topics: ai, artificial-intelligence, bert, bert-model, language-model, masked-language-models, masked-word-prediction, natural-language-processing, nlp, python, tensorflow, transformers, transformers-models, visualization
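The masked-word prediction this repository describes can be sketched with the Hugging Face `transformers` library. This is a minimal illustration, not the repository's own code: the `fill-mask` pipeline and the `bert-base-uncased` checkpoint are assumptions chosen to match the project's topics (bert, transformers), not details taken from its source.

```python
# Hedged sketch of masked-word prediction with a pretrained BERT model.
# Assumption: the Hugging Face transformers library and the public
# "bert-base-uncased" checkpoint; this repo's actual model/config may differ.
from transformers import pipeline

# The fill-mask pipeline tokenizes the input, runs BERT, and returns the
# highest-scoring replacements for the [MASK] token.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

predictions = fill_mask("The capital of France is [MASK].")
for p in predictions[:3]:
    # token_str is the predicted word; score is its softmax probability.
    print(f"{p['token_str']}: {p['score']:.3f}")
```

Each prediction is a dict with the filled-in token and its probability, so the caller can pick the top candidate or inspect the full ranked list.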