
Federated learning with quantization constraints

Shlezinger et al., 2020

Document ID
10895277203537508218
Author
Shlezinger N
Chen M
Eldar Y
Poor H
Cui S
Publication year
2020
Publication venue
ICASSP 2020-2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)


Snippet

Traditional deep learning models are trained on centralized servers using labeled sample data collected from edge devices. This data often includes private information, which the users may not be willing to share. Federated learning (FL) is an emerging approach to train …
Continue reading at www.researchgate.net (PDF)
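
The snippet describes the setting the paper addresses: clients train locally and upload compressed model updates to a server instead of sharing raw data. As a rough illustration of how rate-constrained (quantized) updates fit into federated averaging, below is a minimal NumPy sketch. It is not the scheme analyzed in the paper: it uses plain stochastic uniform scalar quantization rather than the vector quantizers studied there, and every name in it (stochastic_uniform_quantize, local_update, num_bits, the synthetic linear-regression clients) is invented for this example.

```python
# Minimal sketch of federated averaging with quantized model updates.
# Illustrative only: simple stochastic uniform quantization of each client's
# update, not the quantization scheme proposed in the paper.
import numpy as np

rng = np.random.default_rng(0)


def stochastic_uniform_quantize(x, num_bits=4):
    """Quantize a vector to 2**num_bits levels with unbiased stochastic rounding."""
    levels = 2 ** num_bits - 1
    lo, hi = x.min(), x.max()
    scale = (hi - lo) / levels if hi > lo else 1.0
    normalized = (x - lo) / scale                 # map onto [0, levels]
    floor = np.floor(normalized)
    prob_up = normalized - floor                  # round up with this probability
    q = floor + (rng.random(x.shape) < prob_up)   # unbiased stochastic rounding
    return q * scale + lo                         # dequantized update sent to server


def local_update(global_model, data, labels, lr=0.1, epochs=5):
    """One client's local training: a few gradient steps on a linear model."""
    w = global_model.copy()
    for _ in range(epochs):
        grad = data.T @ (data @ w - labels) / len(labels)
        w -= lr * grad
    return w - global_model                       # model update (delta), to be quantized


# Synthetic federated setup: each client holds its own local dataset.
num_clients, dim, num_bits = 10, 8, 4
w_true = rng.normal(size=dim)
client_data = []
for _ in range(num_clients):
    X = rng.normal(size=(50, dim))
    y = X @ w_true + 0.1 * rng.normal(size=50)
    client_data.append((X, y))

global_model = np.zeros(dim)
for rnd in range(20):
    # Each client trains locally and uploads a quantized (rate-constrained) update.
    quantized_updates = [
        stochastic_uniform_quantize(local_update(global_model, X, y), num_bits)
        for X, y in client_data
    ]
    # The server averages the quantized updates (federated averaging).
    global_model += np.mean(quantized_updates, axis=0)

print("error vs. true model:", np.linalg.norm(global_model - w_true))
```

Because the stochastic rounding is unbiased, the per-client quantization errors tend to average out when the server aggregates many updates, which is the basic intuition for why training can tolerate low-rate uploads.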

Classifications

    • H ELECTRICITY
    • H03 BASIC ELECTRONIC CIRCUITRY
    • H03M CODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M7/00 Conversion of a code where information is represented by a given sequence or number of digits to a code where the same information or similar information or a subset of information is represented by a different sequence or number of digits
    • H03M7/30 Compression; Expansion; Suppression of unnecessary data, e.g. redundancy reduction
    • H03M7/3082 Vector coding
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 Cryptographic mechanisms or cryptographic arrangements for secret or secure communication
    • H04L9/08 Key distribution or management, e.g. generation, sharing or updating, of cryptographic keys or passwords
    • H04L9/0816 Key establishment, i.e. cryptographic processes or cryptographic protocols whereby a shared secret becomes available to two or more parties, for subsequent use
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L1/00 Arrangements for detecting or preventing errors in the information received
    • H04L1/004 Arrangements for detecting or preventing errors in the information received by using forward error control
    • H04L1/0056 Systems characterized by the type of code used
    • H04L1/0057 Block codes
    • H ELECTRICITY
    • H03 BASIC ELECTRONIC CIRCUITRY
    • H03M CODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M7/00 Conversion of a code where information is represented by a given sequence or number of digits to a code where the same information or similar information or a subset of information is represented by a different sequence or number of digits
    • H03M7/30 Compression; Expansion; Suppression of unnecessary data, e.g. redundancy reduction
    • H03M7/3002 Conversion to or from differential modulation
    • H ELECTRICITY
    • H03 BASIC ELECTRONIC CIRCUITRY
    • H03M CODING; DECODING; CODE CONVERSION IN GENERAL
    • H03M3/00 Conversion of analogue values to or from differential modulation
    • H03M3/30 Delta-sigma modulation
    • H03M3/39 Structural details of delta-sigma modulators, e.g. incremental delta-sigma modulators
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L25/00 Baseband systems
    • H04L25/02 Details; Arrangements for supplying electrical power along data transmission lines
    • H04L25/03 Shaping networks in transmitter or receiver, e.g. adaptive shaping networks; Receiver end arrangements for processing baseband signals

Similar Documents

Shlezinger et al. Federated learning with quantization constraints
Shlezinger et al. UVeQFed: Universal vector quantization for federated learning
Gafni et al. Federated learning: A signal processing perspective
Dai et al. Nonlinear transform source-channel coding for semantic communications
Silva et al. A framework for control system design subject to average data-rate constraints
Oh et al. Communication-efficient federated learning via quantized compressed sensing
Permuter et al. Source coding with a side information “Vending Machine”
Khobahi et al. Signal recovery from 1-bit quantized noisy samples via adaptive thresholding
Lan et al. Communication-efficient federated learning for resource-constrained edge devices
Whang et al. Neural distributed source coding
Kipnis et al. Gaussian approximation of quantization error for estimation from compressed data
Zong et al. Communication reducing quantization for federated learning with local differential privacy mechanism
Yue et al. Communication-efficient federated learning via predictive coding
Abdi et al. Reducing communication overhead via CEO in distributed training
Chen et al. Communication-efficient design for quantized decentralized federated learning
Chen et al. Joint resource management and model compression for wireless federated learning
Liang et al. Wyner-Ziv gradient compression for federated learning
Ozyilkan et al. Neural distributed compressor discovers binning
Phan et al. Importance matching lemma for lossy compression with side information
Cuvelier et al. Time-invariant prefix coding for LQG control
Shlezinger et al. Quantized federated learning
Li et al. Minimax learning for remote prediction
Chen et al. Rate distortion optimization for adaptive gradient quantization in federated learning
Saha et al. Efficient randomized subspace embeddings for distributed optimization under a communication budget
Zhang et al. An adaptive distributed source coding design for distributed learning