Prompt Learning with Structured Semantic Knowledge Makes Pre-Trained Language Models Better
About this item
Full title
Prompt Learning with Structured Semantic Knowledge Makes Pre-Trained Language Models Better
Author / Creator
Publisher
Basel: MDPI AG
Journal title
Electronics
Language
English
Subjects
More information
Scope and Contents
Contents
Pre-trained language models with structured semantic knowledge have demonstrated remarkable performance in a variety of downstream natural language processing tasks. Typical methods of integrating such knowledge involve designing different pre-training tasks and training from scratch, which requires high-end hardware, massive storage resources, and long...
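The abstract is truncated, but the title and summary point to prompt learning over an existing pre-trained language model rather than knowledge integration via pre-training from scratch. As a rough illustration only, and not the paper's actual method (which incorporates structured semantic knowledge), a minimal cloze-style prompt-learning sketch using the Hugging Face transformers library might look like the following; the template, label words, and choice of model are assumptions made purely for illustration.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumption: any masked language model works for this illustration.
model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)
model.eval()

# A hand-written template recasts sentiment classification as a cloze (fill-in-the-blank) task.
sentence = "The movie was a delightful surprise."
template = f"{sentence} Overall, it was {tokenizer.mask_token}."

# Verbalizer: label words that map the model's [MASK] prediction back to class labels.
verbalizer = {"positive": "great", "negative": "terrible"}

inputs = tokenizer(template, return_tensors="pt")
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]

with torch.no_grad():
    logits = model(**inputs).logits[0, mask_pos]  # vocabulary logits at the [MASK] position

# Score each class by the logit of its label word and pick the highest-scoring label.
scores = {
    label: logits[0, tokenizer.convert_tokens_to_ids(word)].item()
    for label, word in verbalizer.items()
}
print(max(scores, key=scores.get))  # expected to print "positive"
```

The appeal of this family of methods, as the abstract notes, is that the pre-trained model is reused as-is instead of being re-trained from scratch, which avoids the hardware and storage costs of full pre-training.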
Alternative Titles
Full title
Prompt Learning with Structured Semantic Knowledge Makes Pre-Trained Language Models Better
Authors, Artists and Contributors
Author / Creator
Identifiers
Primary Identifiers
Record Identifier
TN_cdi_proquest_journals_2849027020
Permalink
https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_proquest_journals_2849027020
Other Identifiers
ISSN
2079-9292
E-ISSN
2079-9292
DOI
10.3390/electronics12153281