Knowledge Distillation Applied to Optical Channel Equalization: Solving the Parallelization Problem of Recurrent Connection

About this item

Full title

Knowledge Distillation Applied to Optical Channel Equalization: Solving the Parallelization Problem of Recurrent Connection

Publisher

Ithaca: Cornell University Library, arXiv.org

Journal title

arXiv.org, 2022-12

Language

English

Scope and Contents

Contents

To circumvent the non-parallelizability of recurrent neural network-based equalizers, we propose knowledge distillation to recast the RNN into a parallelizable feedforward structure. The latter shows a 38% latency decrease, while impacting the Q-factor by only 0.5 dB.
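The idea the abstract describes can be sketched in miniature: a recurrent teacher labels the received signal sequentially, and a feedforward student (here a sliding-window linear filter, fitted by least squares to the teacher's outputs, i.e. MSE distillation) reproduces those labels without any recurrent dependency. All sizes, weights, and the student architecture below are illustrative assumptions, not the configuration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical teacher: a tiny recurrent equalizer with fixed random weights.
H = 8                              # hidden size (assumed)
Wx = rng.normal(size=(H,)) * 0.5   # input weights
Wh = rng.normal(size=(H, H)) * 0.3 # recurrent weights
Wo = rng.normal(size=(H,)) * 0.5   # readout weights

def teacher_rnn(x):
    """Sequential (non-parallelizable) pass over the signal x."""
    h = np.zeros(H)
    out = np.empty_like(x)
    for t, xt in enumerate(x):
        h = np.tanh(Wx * xt + Wh @ h)   # each step depends on the previous one
        out[t] = Wo @ h
    return out

# Distillation data: the teacher labels a synthetic received signal.
x = rng.normal(size=2000)
y_soft = teacher_rnn(x)

# Student: a feedforward filter over a sliding window of K taps.
# Every output can be computed in parallel, since row t only needs x[t-K+1:t+1].
K = 15
X = np.stack([np.roll(x, k) for k in range(K)], axis=1)[K:]
y = y_soft[K:]

# Fit the student to the teacher's soft outputs (least squares = MSE distillation).
w, *_ = np.linalg.lstsq(X, y, rcond=None)
y_student = X @ w

mse = np.mean((y_student - y) ** 2)
print(f"distillation MSE: {mse:.4f}")
```

The student here is linear only to keep the sketch short; the paper's feedforward structure would be a trained neural network, but the distillation recipe (fit the student to the teacher's outputs rather than to ground-truth symbols) is the same.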

Identifiers

Primary Identifiers

Record Identifier

TN_cdi_proquest_journals_2753449155

Permalink

https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_proquest_journals_2753449155

Other Identifiers

E-ISSN

2331-8422
