Beyond Self-attention: External Attention using Two Linear Layers for Visual Tasks

About this item

Full title

Beyond Self-attention: External Attention using Two Linear Layers for Visual Tasks

Publisher

Ithaca: Cornell University Library, arXiv.org

Journal title

arXiv.org, 2021-05

Language

English

Scope and Contents

Attention mechanisms, especially self-attention, have played an increasingly important role in deep feature representation for visual tasks. Self-attention updates the feature at each position by computing a weighted sum of features using pair-wise affinities across all positions to capture the long-range dependency within a single sample. However,...
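The abstract contrasts self-attention's pairwise affinities across all N positions (quadratic cost) with the external-attention alternative named in the title, which attends to two small learnable memories realizable as two linear layers. A minimal NumPy sketch of both, assuming random stand-ins `m_k` and `m_v` for what would be learned parameters, and the double normalization (softmax over positions, then l1 over memory slots) associated with this method:

```python
import numpy as np

def softmax(z, axis):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    """Simplified self-attention (queries = keys = values = x).
    Pairwise affinities over all N positions -> O(N^2) cost."""
    attn = softmax(x @ x.T, axis=-1)   # (N, N) pairwise affinities
    return attn @ x                    # (N, d) updated features

def external_attention(x, m_k, m_v):
    """Sketch of external attention: the input attends to small
    external memories m_k, m_v of S slots each, so the cost is
    O(N*S) with S << N.  m_k and m_v are illustrative stand-ins
    for parameters that would be learned in practice."""
    attn = x @ m_k.T                   # (N, S) affinities to the memory
    attn = softmax(attn, axis=0)       # normalize over positions
    attn = attn / (attn.sum(axis=1, keepdims=True) + 1e-9)  # l1 over slots
    return attn @ m_v                  # (N, d) updated features

rng = np.random.default_rng(0)
N, d, S = 196, 64, 8                   # positions, channels, memory slots
x = rng.standard_normal((N, d))
m_k = rng.standard_normal((S, d))
m_v = rng.standard_normal((S, d))
print(self_attention(x).shape)         # (196, 64)
print(external_attention(x, m_k, m_v).shape)  # (196, 64)
```

Both variants map (N, d) features to (N, d) features; the external version only ever forms an (N, S) affinity map, which is what makes its cost linear in the number of positions.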

Identifiers

Primary Identifiers

Record Identifier

TN_cdi_proquest_journals_2522975831

Permalink

https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_proquest_journals_2522975831

Other Identifiers

E-ISSN

2331-8422
