Shlezinger et al., 2020 - Google Patents
Federated learning with quantization constraints (Shlezinger et al., 2020)
- Document ID
- 10895277203537508218
- Author
- Shlezinger N
- Chen M
- Eldar Y
- Poor H
- Cui S
- Publication year
- 2020
- Publication venue
- ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Snippet
Traditional deep learning models are trained on centralized servers using labeled sample data collected from edge devices. This data often includes private information, which the users may not be willing to share. Federated learning (FL) is an emerging approach to train …
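The snippet describes the setting this document targets: clients hold private data, train locally, and upload model updates to a server under communication constraints. As a rough illustration only (not the quantization scheme claimed in this document), the sketch below shows federated averaging where each client uniformly quantizes its model update to a few bits before upload; all function and parameter names are hypothetical.

```python
import numpy as np

def quantize_update(update, num_bits=4):
    """Uniform scalar quantization of a client's model update (illustrative only)."""
    levels = 2 ** num_bits - 1
    lo, hi = update.min(), update.max()
    scale = (hi - lo) / levels if hi > lo else 1.0
    codes = np.round((update - lo) / scale).astype(np.int32)  # what the client transmits
    return codes, lo, scale

def dequantize_update(codes, lo, scale):
    """Server-side reconstruction of the quantized update."""
    return codes.astype(np.float64) * scale + lo

def federated_round(global_model, client_data, lr=0.1, num_bits=4):
    """One round of federated averaging with quantized uploads (toy linear-regression clients)."""
    recovered = []
    for X, y in client_data:
        local = global_model.copy()
        grad = X.T @ (X @ local - y) / len(y)  # one local gradient step
        update = -lr * grad                    # model delta the client would upload
        codes, lo, scale = quantize_update(update, num_bits)
        recovered.append(dequantize_update(codes, lo, scale))
    return global_model + np.mean(recovered, axis=0)  # server averages recovered updates

# Toy usage: two clients sharing a 3-parameter linear model.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(32, 3)), rng.normal(size=32)) for _ in range(2)]
model = np.zeros(3)
for _ in range(5):
    model = federated_round(model, clients)
print(model)
```

The sketch uses plain uniform scalar quantization per update to keep the communication-constrained round structure visible; the actual scheme described by this line of work relies on more refined (e.g., dithered vector) quantizers.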
Classifications
- H—ELECTRICITY
  - H03—BASIC ELECTRONIC CIRCUITRY
    - H03M—CODING; DECODING; CODE CONVERSION IN GENERAL
      - H03M3/00—Conversion of analogue values to or from differential modulation
        - H03M3/30—Delta-sigma modulation
          - H03M3/39—Structural details of delta-sigma modulators, e.g. incremental delta-sigma modulators
      - H03M7/00—Conversion of a code where information is represented by a given sequence or number of digits to a code where the same information or similar information or a subset of information is represented by a different sequence or number of digits
        - H03M7/30—Compression; Expansion; Suppression of unnecessary data, e.g. redundancy reduction
          - H03M7/3002—Conversion to or from differential modulation
          - H03M7/3082—Vector coding
  - H04—ELECTRIC COMMUNICATION TECHNIQUE
    - H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
      - H04L1/00—Arrangements for detecting or preventing errors in the information received
        - H04L1/004—Arrangements for detecting or preventing errors in the information received by using forward error control
          - H04L1/0056—Systems characterized by the type of code used
            - H04L1/0057—Block codes
      - H04L9/00—Cryptographic mechanisms or cryptographic arrangements for secret or secure communication
        - H04L9/08—Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
          - H04L9/0816—Key establishment, i.e. cryptographic processes or cryptographic protocols whereby a shared secret becomes available to two or more parties, for subsequent use
      - H04L25/00—Baseband systems
        - H04L25/02—Details; Arrangements for supplying electrical power along data transmission lines
          - H04L25/03—Shaping networks in transmitter or receiver, e.g. adaptive shaping networks; Receiver end arrangements for processing baseband signals
Similar Documents
Publication | Title
---|---
Shlezinger et al. | Federated learning with quantization constraints
Shlezinger et al. | UVeQFed: Universal vector quantization for federated learning
Gafni et al. | Federated learning: A signal processing perspective
Dai et al. | Nonlinear transform source-channel coding for semantic communications
Silva et al. | A framework for control system design subject to average data-rate constraints
Oh et al. | Communication-efficient federated learning via quantized compressed sensing
Permuter et al. | Source coding with a side information "Vending Machine"
Khobahi et al. | Signal recovery from 1-bit quantized noisy samples via adaptive thresholding
Lan et al. | Communication-efficient federated learning for resource-constrained edge devices
Whang et al. | Neural distributed source coding
Kipnis et al. | Gaussian approximation of quantization error for estimation from compressed data
Zong et al. | Communication reducing quantization for federated learning with local differential privacy mechanism
Yue et al. | Communication-efficient federated learning via predictive coding
Abdi et al. | Reducing communication overhead via CEO in distributed training
Chen et al. | Communication-efficient design for quantized decentralized federated learning
Chen et al. | Joint resource management and model compression for wireless federated learning
Liang et al. | Wyner-Ziv gradient compression for federated learning
Ozyilkan et al. | Neural distributed compressor discovers binning
Phan et al. | Importance matching lemma for lossy compression with side information
Cuvelier et al. | Time-invariant prefix coding for LQG control
Shlezinger et al. | Quantized Federated Learning
Li et al. | Minimax learning for remote prediction
Chen et al. | Rate distortion optimization for adaptive gradient quantization in federated learning
Saha et al. | Efficient randomized subspace embeddings for distributed optimization under a communication budget
Zhang et al. | An adaptive distributed source coding design for distributed learning