Can Transformers Learn Sequential Function Classes In Context?
Author / Creator
Campbell, Ryan; Guo, Emma; Hu, Evan; Vir, Reya; Hsiao, Ethan
Publisher
Ithaca: Cornell University Library, arXiv.org
Language
English
Scope and Contents
In-context learning (ICL) has revolutionized the capabilities of transformer models in NLP. In our project, we extend the understanding of the mechanisms underpinning ICL by exploring whether transformers can learn from sequential, non-textual function class data distributions. We introduce a novel sliding window sequential function class and emplo...
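The abstract's "sliding window sequential function class" is not specified in this record (the text is truncated). A minimal illustrative sketch, assuming each target depends on a fixed-size window of the most recent inputs through shared linear weights, might look like the following; the function name, window size, and linear form are all assumptions, not the authors' definition:

```python
import numpy as np

def sliding_window_sequence(n_steps=20, window=3, dim=5, seed=0):
    """Hypothetical sketch of a sliding-window sequential function:
    each output y_t is a fixed linear function of the last `window` inputs."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=(window, dim))    # weights shared across all time steps
    xs = rng.normal(size=(n_steps, dim))  # i.i.d. input sequence
    ys = []
    for t in range(n_steps):
        ctx = xs[max(0, t - window + 1): t + 1]   # last `window` inputs
        pad = np.zeros((window - len(ctx), dim))  # zero-pad the early steps
        ctx = np.vstack([pad, ctx])
        ys.append(float(np.sum(w * ctx)))         # y_t = <w, windowed inputs>
    return xs, np.array(ys)

xs, ys = sliding_window_sequence()
```

In an ICL setup, (x_t, y_t) pairs drawn this way would form the in-context prompt, and the model would be asked to predict y for a held-out query input.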
Identifiers
Primary Identifiers
Record Identifier
TN_cdi_proquest_journals_2904764160
Permalink
https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_proquest_journals_2904764160
Other Identifiers
E-ISSN
2331-8422