Unsupervised Word Sense Disambiguation Using Transformer’s Attention Mechanism
Publisher
Basel: MDPI AG
Journal title
Machine Learning and Knowledge Extraction
Language
English
Scope and Contents
Transformer models produce rich text representations that have driven breakthroughs in natural language understanding. Using the Transformer’s attention mechanism, which acts as a language memory trained on tens of billions of words, a word sense disambiguation (WSD) algorithm can now construct a more faithful...
Identifiers
Primary Identifiers
Record Identifier
TN_cdi_doaj_primary_oai_doaj_org_article_f5417d627b274095a4adda82c8171304
Permalink
https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_doaj_primary_oai_doaj_org_article_f5417d627b274095a4adda82c8171304
Other Identifiers
ISSN
2504-4990
E-ISSN
2504-4990
DOI
10.3390/make7010010