GitHub / Kyubyong / neural_tokenizer
Tokenize English sentences using neural networks.
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Kyubyong%2Fneural_tokenizer (see the example request after the metadata below)
Stars: 63
Forks: 9
Open issues: 0
License: MIT
Language: Python
Size: 177 KB
Dependencies parsed at: Pending
Created at: over 8 years ago
Updated at: 3 months ago
Pushed at: over 7 years ago
Last synced at: 18 days ago
Topics: language, neural-network, tokenizer
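A minimal sketch of querying the ecosyste.ms JSON API linked above for this repository's metadata, assuming the Python requests library is installed; the exact field names in the response are not shown here, so the example simply prints whatever the API returns.

import requests

# JSON API endpoint for this repository, as listed above.
URL = "http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Kyubyong%2Fneural_tokenizer"

resp = requests.get(URL, timeout=10)
resp.raise_for_status()
repo = resp.json()

# Print every metadata field the API returns (stars, forks, license, etc.).
for key, value in sorted(repo.items()):
    print(f"{key}: {value}")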