Shift-and-Balance Attention

About this item

Full title

Shift-and-Balance Attention

Publisher

Ithaca: Cornell University Library, arXiv.org

Journal title

arXiv.org, 2021-03

Language

English

More information

Scope and Contents

Attention is an effective mechanism for improving the capability of deep models. Squeeze-and-Excite (SE) introduces a lightweight attention branch that enhances the network's representational power. The attention branch is gated with the Sigmoid function and multiplied into the feature map's trunk branch. It is too sensitive to coordinate and balance the tru...
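The SE gating described above (global average pooling, a small bottleneck, a Sigmoid gate multiplied channel-wise into the trunk) can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation; the weight shapes, the reduction ratio, and the function name `squeeze_and_excite` are assumptions for the sake of the example.

```python
import numpy as np

def squeeze_and_excite(feature_map, w1, w2):
    """Minimal SE-style gating sketch.

    feature_map: array of shape (C, H, W) -- the trunk branch.
    w1: (C // r, C) bottleneck weights (r is the reduction ratio).
    w2: (C, C // r) expansion weights.
    """
    # Squeeze: global average pool over the spatial dimensions -> (C,)
    squeezed = feature_map.mean(axis=(1, 2))
    # Excite: bottleneck MLP with ReLU, then a Sigmoid gate in (0, 1)
    hidden = np.maximum(w1 @ squeezed, 0.0)
    gate = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))        # (C,)
    # Multiply the gate into the trunk branch, channel-wise
    return feature_map * gate[:, None, None]
```

Because the Sigmoid gate lies strictly inside (0, 1), the output can only shrink each channel of the trunk, which is the sensitivity the abstract alludes to: the gate rescales but cannot shift or negate the trunk's features.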

Identifiers

Primary Identifiers

Record Identifier

TN_cdi_proquest_journals_2505021078

Permalink

https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_proquest_journals_2505021078

Other Identifiers

E-ISSN

2331-8422
