Unsupervised Word Sense Disambiguation Using Transformer’s Attention Mechanism

About this item

Full title

Unsupervised Word Sense Disambiguation Using Transformer’s Attention Mechanism

Publisher

Basel: MDPI AG

Journal title

Machine Learning and Knowledge Extraction, 2025-03, Vol. 7 (1), p. 10

Language

English

Scope and Contents

Transformer models produce advanced text representations that have been used to break through the hard challenge of natural language understanding. Using the Transformer’s attention mechanism, which acts as a language learning memory, trained on tens of billions of words, a word sense disambiguation (WSD) algorithm can now construct a more faithful...
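
The abstract above (truncated in this record) describes using contextual representations from a pretrained Transformer for word sense disambiguation. As a rough illustration only, and not the paper's actual algorithm, the sketch below picks the WordNet sense whose gloss embedding is closest to the target word's sentence context; the model name, the mean-pooling choice, and the gloss-similarity scoring are all assumptions standing in for the paper's attention-based construction.

# Illustrative sketch only -- not the paper's algorithm. Assumes:
#   pip install torch transformers nltk   and   nltk.download("wordnet")
# Sense selection uses mean-pooled contextual embeddings and gloss similarity
# as a stand-in for the attention-based representation described in the paper.
import torch
from transformers import AutoModel, AutoTokenizer
from nltk.corpus import wordnet as wn

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed model choice
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(text: str) -> torch.Tensor:
    """Mean-pooled last-hidden-state vector for a piece of text."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state
    return hidden.mean(dim=1).squeeze(0)

def disambiguate(sentence: str, target: str):
    """Return the WordNet synset whose gloss is closest to the sentence context."""
    context = embed(sentence)
    best, best_sim = None, float("-inf")
    for synset in wn.synsets(target):
        sim = torch.cosine_similarity(context, embed(synset.definition()), dim=0).item()
        if sim > best_sim:
            best, best_sim = synset, sim
    return best

# Example: the river-side sense of "bank" should score above the financial sense.
print(disambiguate("She sat on the bank of the river and watched the water.", "bank"))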

Identifiers

Primary Identifiers

Record Identifier

TN_cdi_doaj_primary_oai_doaj_org_article_f5417d627b274095a4adda82c8171304

Permalink

https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_doaj_primary_oai_doaj_org_article_f5417d627b274095a4adda82c8171304

Other Identifiers

ISSN

2504-4990

E-ISSN

2504-4990

DOI

10.3390/make7010010
