Effects of Quantization on Federated Learning with Local Differential Privacy

Publication: Contribution in book/conference proceedings · Contribution in conference proceedings · Contributed · Peer-reviewed

Abstract

Federated learning (FL) enables large-scale machine learning while preserving user data privacy through its decentralized structure. However, user data can still be inferred from the shared model updates. To strengthen privacy, we consider FL with local differential privacy (LDP). One of the challenges in FL is the huge communication cost caused by the iterative transmission of model updates. Quantization has been used in the literature to relieve this cost, but few works have considered its effect on LDP and the unboundedness of the randomized model updates. We propose a communication-efficient FL algorithm with LDP that uses a Gaussian mechanism followed by quantization and Elias-gamma coding. A novel design of the algorithm guarantees LDP even after quantization. Under the proposed algorithm, we provide a theoretical trade-off analysis of privacy and communication costs: quantization reduces the communication costs but requires a larger perturbation to enable LDP. Experimental results show that accuracy is mostly affected by the noise from the LDP mechanism, and this effect is amplified when the quantization error is larger. Nonetheless, our experiments enabled LDP with a significant compression ratio at only a slight reduction in accuracy. Furthermore, the proposed algorithm outperforms a comparable algorithm with a discrete Gaussian mechanism under the same privacy budget and communication cost constraints.
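The pipeline described in the abstract (Gaussian perturbation, then quantization, then Elias-gamma coding) can be sketched roughly as follows. This is an illustrative sketch only, not the paper's algorithm: the noise scale `sigma`, the quantization step `step`, and the sign-folding map `to_positive` are placeholder choices, since Elias-gamma encodes only positive integers and the paper's privacy calibration is not reproduced here.

```python
import random


def elias_gamma_encode(n: int) -> str:
    """Elias-gamma code for a positive integer n >= 1:
    floor(log2 n) zeros followed by the binary representation of n."""
    assert n >= 1
    b = bin(n)[2:]
    return "0" * (len(b) - 1) + b


def to_positive(z: int) -> int:
    """Zigzag-style map from any integer to a positive integer,
    needed because Elias-gamma only encodes n >= 1 (sketch assumption)."""
    return 2 * z + 1 if z >= 0 else -2 * z


def privatize_and_compress(update, sigma=1.0, step=0.5):
    """Perturb each coordinate with Gaussian noise (stand-in for an LDP
    mechanism; sigma would be calibrated to the privacy budget), quantize
    uniformly, and concatenate the Elias-gamma codewords."""
    bits = []
    for x in update:
        noisy = x + random.gauss(0.0, sigma)   # Gaussian perturbation
        q = round(noisy / step)                # uniform quantizer
        bits.append(elias_gamma_encode(to_positive(q)))
    return "".join(bits)
```

For example, `elias_gamma_encode(4)` yields `"00100"`. The sketch also reflects the trade-off noted in the abstract: a coarser `step` shortens the codewords (lower communication cost) but increases the quantization error that the LDP noise calibration must account for.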

Details

Original language: English
Title: GLOBECOM 2022 - 2022 IEEE Global Communications Conference
Pages: 921-926
Number of pages: 6
ISBN (electronic): 978-1-6654-3540-6
Publication status: Published - 2022
Peer-review status: Yes

Publication series

Series: IEEE Conference on Global Communications (GLOBECOM)
ISSN: 1930-529X

Conference

Title: 2022 IEEE Global Communications Conference, GLOBECOM 2022
Duration: 4-8 December 2022
City: Virtual, Online
Country: Brazil

External IDs

ORCID /0000-0002-1702-9075/work/165878276

Keywords

  • Elias-gamma coding, federated learning, local differential privacy, quantization