GitHub / xujiajun / gotokenizer
A tokenizer for Go based on dictionary and Bigram language models. (Currently only supports Chinese segmentation.)
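Dictionary-based Chinese segmentation typically works by forward maximum matching: scan the text, greedily take the longest dictionary word starting at the current position, and fall back to a single character when nothing matches. The sketch below illustrates that technique only; it is not gotokenizer's actual API, and the dictionary entries and function names are invented for the example.

```go
package main

import "fmt"

// segment performs forward maximum matching against a small in-memory
// dictionary. Hypothetical illustration of dictionary-based segmentation,
// not gotokenizer's real interface.
func segment(text string, dict map[string]bool, maxLen int) []string {
	runes := []rune(text) // index by rune, not byte, for CJK text
	var tokens []string
	for i := 0; i < len(runes); {
		n := maxLen
		if len(runes)-i < n {
			n = len(runes) - i
		}
		// Try the longest candidate first, shrinking until a dictionary
		// hit; fall back to a single rune if nothing matches.
		matched := 1
		for l := n; l > 1; l-- {
			if dict[string(runes[i:i+l])] {
				matched = l
				break
			}
		}
		tokens = append(tokens, string(runes[i:i+matched]))
		i += matched
	}
	return tokens
}

func main() {
	dict := map[string]bool{"自然": true, "语言": true, "自然语言": true}
	fmt.Println(segment("自然语言处理", dict, 4)) // longest match wins: 自然语言, then 处, 理
}
```

A Bigram language model, as the description mentions, would additionally score competing segmentations by word-pair probabilities and pick the most likely one; the greedy matcher above is only the dictionary half of that approach.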
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/xujiajun%2Fgotokenizer
PURL: pkg:github/xujiajun/gotokenizer
Stars: 21
Forks: 7
Open issues: 0
License: apache-2.0
Language: Go
Size: 10.1 MB
Dependencies parsed at: Pending
Created at: almost 7 years ago
Updated at: 5 months ago
Pushed at: over 6 years ago
Last synced at: 28 days ago
Topics: golang, segmentation, tokenizer