FedSpaLLM: Federated Pruning of Large Language Models

About this item

Full title

FedSpaLLM: Federated Pruning of Large Language Models

Publisher

Ithaca: Cornell University Library, arXiv.org

Journal title

arXiv.org, 2024-10

Language

English

More information

Scope and Contents
Large Language Models (LLMs) achieve state-of-the-art performance but are challenging to deploy due to their high computational and storage demands. Pruning can reduce model size, yet existing methods assume public access to calibration data, which is impractical for privacy-sensitive applications. To address the challenge of pruning LLMs in privac...
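The abstract centres on pruning, i.e. zeroing out weights to shrink a model. As a point of reference only, here is a minimal sketch of plain unstructured magnitude pruning in NumPy; this is a generic baseline, not the paper's federated, calibration-data-aware method, and the function name and sparsity target are illustrative.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude entries so that at least
    `sparsity` fraction of the weights become zero."""
    k = int(weights.size * sparsity)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))       # toy weight matrix
W_pruned = magnitude_prune(W, 0.5)
print(np.mean(W_pruned == 0.0))   # fraction of zeroed weights
```

Note that such pruning needs no calibration data; the methods the abstract refers to additionally score weights using activations from a calibration set, which is exactly the data a privacy-sensitive federated setting cannot share.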

Identifiers

Primary Identifiers

Record Identifier

TN_cdi_proquest_journals_3119305516

Permalink

https://devfeature-collection.sl.nsw.gov.au/record/TN_cdi_proquest_journals_3119305516

Other Identifiers

E-ISSN

2331-8422
