xlm-roberta-large-ner-hrl (Model 42/100) via “cross-lingual transfer learning via transformer embeddings”
Token-classification model. 582,028 downloads.
Unique: Explicitly fine-tuned on African languages (Hausa, Yoruba, Igbo), which are underrepresented in most multilingual models; this improves transfer to other low-resource languages in the same linguistic families. XLM-RoBERTa's pre-training on Common Crawl includes these languages, but fine-tuning on HRL-specific data amplifies their representation in the task-specific classifier.
vs others: Achieves better zero-shot performance on African and other low-resource languages than mBERT or language-specific models, while remaining competitive on high-resource languages. This makes it a practical single-model option for globally multilingual NER.
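Since the model is a token classifier, its raw output is a per-token tag sequence that still has to be merged into entity spans. The helper below is a minimal sketch of that standard BIO (begin/inside/outside) aggregation step; the function name, tag scheme details, and example sentence are illustrative assumptions, not code from this model.

```python
# Hypothetical sketch: merging token-level B-/I-/O tags from a
# token-classification NER model into (entity_text, entity_type) spans.

def merge_bio_tags(tokens, tags):
    """Merge BIO-tagged tokens into a list of (entity_text, entity_type)."""
    entities = []
    current_tokens, current_type = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A B- tag starts a new entity, closing any open one first.
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            # An I- tag of the same type continues the open entity.
            current_tokens.append(token)
        else:
            # "O" or an inconsistent I- tag closes any open entity.
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [], None
    if current_tokens:
        entities.append((" ".join(current_tokens), current_type))
    return entities

tokens = ["Ngozi", "Okonjo-Iweala", "visited", "Lagos", "."]
tags = ["B-PER", "I-PER", "O", "B-LOC", "O"]
print(merge_bio_tags(tokens, tags))
# → [('Ngozi Okonjo-Iweala', 'PER'), ('Lagos', 'LOC')]
```

In practice the `tokens`/`tags` pair would come from running the fine-tuned model over a sentence; libraries such as Hugging Face `transformers` can perform this aggregation internally, but the logic is essentially the above.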