NCHLT isiNdebele RoBERTa language model

Date

2023-05-01

Authors

Roald Eiselen

Publisher

North-West University; Centre for Text Technology (CTexT)

Description

Contextual masked language model based on the RoBERTa architecture (Liu et al., 2019). The model is trained as a masked language model and is not fine-tuned for any downstream task. It can be used either as a masked LM or as an embedding model that provides real-valued vector representations of words or string sequences for isiNdebele text.
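When used as an embedding model, token-level hidden states from the encoder are typically reduced to a single fixed-size vector per sequence. Below is a minimal sketch of the common masked mean-pooling step; the `transformers` loading snippet in the comments and the model path are illustrative assumptions, not the repository's documented API, and the pooling runs here on dummy arrays standing in for real model output.

```python
import numpy as np

# Hypothetical loading with the Hugging Face transformers library (assumed
# usage; the model identifier below is a placeholder, not a published name):
#
#   from transformers import AutoTokenizer, AutoModel
#   tokenizer = AutoTokenizer.from_pretrained("path/to/nchlt-isindebele-roberta")
#   model = AutoModel.from_pretrained("path/to/nchlt-isindebele-roberta")
#   outputs = model(**tokenizer("umbhalo wesiNdebele", return_tensors="pt"))
#   hidden = outputs.last_hidden_state  # shape: (batch, seq_len, hidden_dim)

def mean_pool(hidden_states: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token vectors, ignoring padding positions, to get one
    fixed-size embedding per input sequence."""
    mask = attention_mask[..., None].astype(hidden_states.dtype)  # (batch, seq, 1)
    summed = (hidden_states * mask).sum(axis=1)                   # (batch, dim)
    counts = mask.sum(axis=1).clip(min=1.0)                       # avoid div by zero
    return summed / counts

# Dummy hidden states standing in for encoder output: batch 2, seq len 4, dim 3.
hidden = np.arange(24, dtype=np.float64).reshape(2, 4, 3)
mask = np.array([[1, 1, 0, 0], [1, 1, 1, 1]])  # second half of seq 0 is padding
emb = mean_pool(hidden, mask)
print(emb.shape)  # (2, 3)
```

Padding tokens are excluded from the average so that short sequences are not dragged toward the padding vectors; other pooling choices (e.g. taking the first token's vector) are also common.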

License

Creative Commons Attribution 4.0 International (CC-BY 4.0)

Verification status

Level 0