Can Transformers Learn Sequential Function Classes In Context?

About this item

Full title

Can Transformers Learn Sequential Function Classes In Context?

Publisher

Ithaca: Cornell University Library, arXiv.org

Journal title

arXiv.org, 2023-12

Language

English

More information

Scope and Contents

Contents

In-context learning (ICL) has revolutionized the capabilities of transformer models in NLP. In our project, we extend the understanding of the mechanisms underpinning ICL by exploring whether transformers can learn from sequential, non-textual function class data distributions. We introduce a novel sliding window sequential function class and emplo...
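The abstract refers to a "sliding window sequential function class" without defining it here (the text is truncated). As an illustration only, the following is a minimal sketch of what such a sequential data distribution might look like, assuming each target depends linearly on a window of the most recent inputs; the function name, the linear form, and the weight vector are assumptions for this sketch, not the paper's definition:

```python
import numpy as np

def sample_sequential_task(seq_len=20, window=3, seed=0):
    """Sketch of a sliding-window sequential function class (illustrative).

    Each target y[t] is a fixed linear function of the last `window`
    inputs x[t-window+1 .. t]; the weights w define one task drawn
    from the class.
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(window)      # task-specific weights
    x = rng.standard_normal(seq_len)     # input sequence
    y = np.zeros(seq_len)
    for t in range(seq_len):
        lo = max(0, t - window + 1)      # window is shorter at the start
        win = x[lo:t + 1]
        y[t] = w[-len(win):] @ win       # linear function of the window
    return x, y

x, y = sample_sequential_task()
```

In an in-context learning setup, a transformer would be shown (x, y) prefixes from many tasks of this form and evaluated on predicting y[t] for held-out positions of a new task.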

Identifiers

Primary Identifiers

Record Identifier

TN_cdi_proquest_journals_2904764160

Permalink

https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_proquest_journals_2904764160

Other Identifiers

E-ISSN

2331-8422
