NAM: Normalization-based Attention Module

About this item

Full title

NAM: Normalization-based Attention Module

Publisher

Ithaca: Cornell University Library, arXiv.org

Journal title

arXiv.org, 2021-11

Language

English

Scope and Contents

Recognizing less salient features is key to model compression. However, this has not been investigated in prevailing attention mechanisms. In this work, we propose a novel normalization-based attention module (NAM), which suppresses less salient weights. It applies a weight sparsity penalty to the attention modules, thus making them more computationally efficient while retaining similar performance. A comparison with three other attention mechanisms on both ResNet and MobileNet indicates that our method results in higher accuracy. Code for this paper can be publicly accessed at https://github.com/Christian-lyc/NAM.
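
To make the abstract's mechanism concrete, below is a minimal, illustrative PyTorch sketch of a normalization-based channel-attention block in the spirit of NAM: the batch-normalization scaling factors (gamma) are treated as a measure of per-channel salience and normalized into attention weights, so less salient channels are suppressed. The class and variable names here are assumptions for illustration, not the authors' code; the authoritative implementation is in the linked repository, and the weight sparsity penalty mentioned in the abstract would additionally be applied to gamma in the training loss.

import torch
import torch.nn as nn

class NAMChannelAttention(nn.Module):
    """Illustrative channel attention: reweight features by normalized
    batch-norm scale factors so that less salient channels are suppressed."""

    def __init__(self, channels: int):
        super().__init__()
        self.bn = nn.BatchNorm2d(channels, affine=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        residual = x
        x = self.bn(x)
        # |gamma| reflects how much variance each channel carries after BN;
        # normalizing it yields per-channel attention weights summing to 1.
        gamma = self.bn.weight.abs()
        weights = gamma / gamma.sum()
        x = x * weights.view(1, -1, 1, 1)
        # Gate the original features with the sigmoid of the reweighted map.
        return residual * torch.sigmoid(x)

# Hypothetical usage. A sparsity penalty on gamma (e.g. adding
# p * attn.bn.weight.abs().sum() to the task loss) would push the weights
# of less salient channels toward zero during training.
attn = NAMChannelAttention(64)
out = attn(torch.randn(2, 64, 32, 32))
assert out.shape == (2, 64, 32, 32)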

Identifiers

Primary Identifiers

Record Identifier

TN_cdi_proquest_journals_2602339013

Permalink

https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_proquest_journals_2602339013

Other Identifiers

E-ISSN

2331-8422
