Pre-Trained Transformer-Based Models for Text Classification Using Low-Resourced Ewe Language
About this item
Publisher
Basel: MDPI AG
Journal title
Systems
Language
English
Scope and Contents
Despite a few attempts to automatically crawl Ewe text from online news portals and magazines, the African Ewe language remains underdeveloped even though it has a rich morphology and a complex, unique structure. This is due to the poor quality, imbalance, and predominantly religious nature of the crawled Ewe texts, which makes them challenging to preprocess and perf...
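The abstract describes applying pre-trained transformer-based models to Ewe text classification. A minimal sketch of what such a fine-tuning pipeline can look like is given below, assuming the Hugging Face transformers library, an XLM-RoBERTa backbone, and a toy two-label dataset; none of these choices (model, labels, or sentences) are taken from the article itself.

# Minimal sketch: fine-tuning a pre-trained multilingual transformer for
# Ewe text classification. The backbone ("xlm-roberta-base"), the two-class
# label scheme, and the sample sentences are illustrative assumptions; the
# article's actual models, labels, and corpus are not given in this record.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "xlm-roberta-base"  # assumed backbone, not confirmed by the record
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

# Placeholder Ewe sentences with invented labels (0 = news, 1 = religious).
texts = ["Dukɔ la ƒe takpekpe va yi egbe.", "Mawu ƒe lɔlɔ̃ li tegbee."]
labels = torch.tensor([0, 1])

# One gradient step; a real run would iterate over a labeled Ewe corpus.
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
out = model(**batch, labels=labels)
out.loss.backward()
optimizer.step()
print(f"training loss after one step: {out.loss.item():.4f}")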
Identifiers
Primary Identifiers
Record Identifier
TN_cdi_doaj_primary_oai_doaj_org_article_3d9b12512dfc430d8bc3de2dc4fcd41b
Permalink
https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_doaj_primary_oai_doaj_org_article_3d9b12512dfc430d8bc3de2dc4fcd41b
Other Identifiers
ISSN
2079-8954
E-ISSN
2079-8954
DOI
10.3390/systems12010001