Beyond Self-attention: External Attention using Two Linear Layers for Visual Tasks
About this item
Full title
Beyond Self-attention: External Attention using Two Linear Layers for Visual Tasks
Publisher
Ithaca: Cornell University Library, arXiv.org
Language
English
Scope and Contents
Attention mechanisms, especially self-attention, have played an increasingly important role in deep feature representation for visual tasks. Self-attention updates the feature at each position by computing a weighted sum of features using pair-wise affinities across all positions to capture the long-range dependency within a single sample. However,...
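The abstract describes self-attention's weighted sum over pairwise affinities, whose cost grows quadratically with the number of positions; the title names the paper's alternative, an external attention built from two linear layers. Below is a minimal PyTorch sketch of that idea for orientation only: the class and parameter names (ExternalAttention, d_model, s) and the double-normalization step are illustrative assumptions, not details taken from this record.

```python
import torch
import torch.nn as nn


class ExternalAttention(nn.Module):
    """Sketch: two linear layers act as small learnable external
    key/value memories shared across samples, replacing the per-sample
    pairwise affinities of self-attention. Names and sizes are
    illustrative assumptions, not taken from this record."""

    def __init__(self, d_model: int = 512, s: int = 64):
        super().__init__()
        self.mk = nn.Linear(d_model, s, bias=False)  # external key memory
        self.mv = nn.Linear(s, d_model, bias=False)  # external value memory

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_tokens, d_model)
        attn = self.mk(x)                  # (batch, n_tokens, s) affinities
        attn = torch.softmax(attn, dim=1)  # normalize over tokens
        attn = attn / (1e-9 + attn.sum(dim=2, keepdim=True))  # l1 over memory slots
        return self.mv(attn)               # (batch, n_tokens, d_model)


# Usage: a 14x14 feature map flattened into 196 tokens of width 512.
x = torch.randn(2, 196, 512)
print(ExternalAttention()(x).shape)  # torch.Size([2, 196, 512])
```

Because the memory size s is fixed, the cost of this sketch grows linearly with the number of tokens, in contrast to the quadratic pairwise computation the abstract attributes to self-attention.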
Identifiers
Primary Identifiers
Record Identifier
TN_cdi_proquest_journals_2522975831
Permalink
https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_proquest_journals_2522975831
Other Identifiers
E-ISSN
2331-8422