Pre-Trained Transformer-Based Models for Text Classification Using Low-Resourced Ewe Language


About this item

Full title

Pre-Trained Transformer-Based Models for Text Classification Using Low-Resourced Ewe Language

Publisher

Basel: MDPI AG

Journal title

Systems (Basel), 2024-01, Vol.12 (1), p.1

Language

English

More information

Scope and Contents

Although there have been a few attempts to automatically crawl Ewe text from online news portals and magazines, the African Ewe language remains underdeveloped despite its rich morphology and complex "unique" structure. This is due to the poor quality, unbalanced, and religious-based nature of the crawled Ewe texts, making it challenging to preprocess and perf...

Alternative Titles

Full title

Pre-Trained Transformer-Based Models for Text Classification Using Low-Resourced Ewe Language

Identifiers

Primary Identifiers

Record Identifier

TN_cdi_doaj_primary_oai_doaj_org_article_3d9b12512dfc430d8bc3de2dc4fcd41b

Permalink

https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_doaj_primary_oai_doaj_org_article_3d9b12512dfc430d8bc3de2dc4fcd41b

Other Identifiers

ISSN

2079-8954

E-ISSN

2079-8954

DOI

10.3390/systems12010001
