A novel quantization method combined with knowledge distillation for deep neural networks
About this item
Publisher
Bristol: IOP Publishing
Language
English
Scope and Contents
The massive parameter counts and intensive computation of deep neural networks limit their deployment on embedded devices with limited storage and computing power. To address this problem, a novel quantization algorithm combined with Knowledge Distillation (KD) is proposed to reduce model size and speed up the inference of deep models. The proposed metho...
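The abstract combines two standard ideas: quantizing network weights to low bit-widths, and training the compressed (student) model against a full-precision teacher's soft outputs. The sketch below illustrates both in generic form with NumPy; `quantize_uniform`, `kd_loss`, and all parameter choices (`bits`, temperature `T`, mixing weight `alpha`) are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def quantize_uniform(w, bits=8):
    # Symmetric uniform quantization: map weights onto 2**bits integer
    # levels, then dequantize. A generic sketch, not the paper's quantizer.
    qmax = 2 ** (bits - 1) - 1
    scale = np.max(np.abs(w)) / qmax
    q = np.clip(np.round(w / scale), -qmax - 1, qmax)
    return q * scale, scale  # dequantized weights and scale factor

def softmax(z, T=1.0):
    # Temperature-scaled softmax, numerically stabilized.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Hinton-style distillation objective (an assumption here):
    # alpha * T^2 * KL(teacher || student) at temperature T
    # + (1 - alpha) * cross-entropy with the hard labels.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)),
                axis=-1).mean()
    ce = -np.log(softmax(student_logits)[np.arange(len(labels)), labels]
                 + 1e-12).mean()
    return alpha * (T ** 2) * kl + (1 - alpha) * ce
```

In a training loop, the student's weights would be quantized on the forward pass and `kd_loss` minimized so that the low-precision model mimics the teacher's softened output distribution.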
Identifiers
Primary Identifiers
Record Identifier
TN_cdi_proquest_journals_2557518892
Permalink
https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_proquest_journals_2557518892
Other Identifiers
ISSN
1742-6588
E-ISSN
1742-6596
DOI
10.1088/1742-6596/1976/1/012026