A Neural Joint Model with BERT for Burmese Syllable Segmentation, Word Segmentation, and POS Tagging
ACM Transactions on Asian and Low-Resource Language Information Processing  (IF1.413),  Pub Date : 2021-05-26, DOI: 10.1145/3436818
Cunli Mao, Zhibo Man, Zhengtao Yu, Shengxiang Gao, Zhenhan Wang, Hongbin Wang

The smallest semantic unit of Burmese is the syllable. In this study, we propose the first neural joint learning model for Burmese syllable segmentation, word segmentation, and part-of-speech (POS) tagging with BERT. The joint model alleviates the error propagation caused by pipelined syllable segmentation. Specifically, it extends the neural joint model for Vietnamese word segmentation, POS tagging, and dependency parsing [28] with pre-trained Burmese character, syllable, and word embeddings and BiLSTM-CRF-based neural layers. To evaluate the proposed model, we conduct experiments on Burmese benchmark datasets and fine-tune multilingual BERT. The results show that the proposed joint model achieves excellent performance.
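The joint formulation described above can be illustrated with a toy label scheme: each syllable receives a single label that jointly encodes its word-boundary position (B/I) and the POS tag of its word, so one sequence labeler (e.g. a CRF layer) predicts both tasks at once instead of feeding segmentation errors into a downstream tagger. This is a minimal sketch of that idea; the function names, label format, and example syllables are our own illustration, not taken from the paper.

```python
# Sketch of a joint word-segmentation + POS label scheme over syllables.
# Labels like "B-NOUN" / "I-NOUN" encode both tasks in one tag sequence,
# which is what lets a single model avoid pipeline error propagation.

def make_joint_labels(words):
    """words: list of (syllables, pos) pairs -> per-syllable joint labels."""
    labels = []
    for syllables, pos in words:
        for i, _ in enumerate(syllables):
            boundary = "B" if i == 0 else "I"  # B = word-initial syllable
            labels.append(f"{boundary}-{pos}")
    return labels

def decode_joint_labels(syllables, labels):
    """Recover (word, pos) pairs from per-syllable joint labels."""
    words, current, current_pos = [], [], None
    for syl, lab in zip(syllables, labels):
        boundary, pos = lab.split("-", 1)
        if boundary == "B" and current:        # close the previous word
            words.append(("".join(current), current_pos))
            current = []
        current.append(syl)
        current_pos = pos
    if current:
        words.append(("".join(current), current_pos))
    return words

# Toy two-word Burmese sentence (syllable strings are placeholders).
sentence = [(["သ", "မီး"], "NOUN"), (["သွား", "သည်"], "VERB")]
labels = make_joint_labels(sentence)
# labels == ["B-NOUN", "I-NOUN", "B-VERB", "I-VERB"]
```

Because encoding and decoding are lossless, a model that predicts these joint labels well solves both segmentation and tagging together; the paper's BiLSTM-CRF layers operate over exactly this kind of per-syllable label sequence, with BERT and pre-trained embeddings supplying the input representations.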