Abstract
Communication bottlenecks remain a key challenge in Federated Learning (FL), particularly in dynamic and resource-constrained environments. While compression strategies such as sparsification and quantization reduce communication overhead, they are typically agnostic to runtime variability and to the semantic relevance of updates. This paper introduces SCALP (Selective Compression via Adaptive Lightweight Protocol), a novel hybrid communication compression mechanism that jointly considers local gradient variance and uplink bandwidth to guide adaptive filtering decisions. Each worker dynamically selects a compression level mapped to a tunable filtering ratio, balancing communication reduction against update relevance. The selected level is encoded as a 2-bit signal embedded in the Explicit Congestion Notification (ECN) field of the IP header, enabling stateless, lightweight signaling without modifying transport-layer protocols. Experimental results on CNN and CNN-LSTM models over the CMAPSS dataset show that SCALP reduces transmitted data by over 25% while keeping convergence time within 2% of the baseline and achieving up to 2.15% higher final accuracy than baseline methods. A comparative analysis against Deep Gradient Compression (DGC) and bandwidth-aware filtering confirms SCALP’s ability to integrate gradient-level relevance and network conditions for robust, efficient training in bandwidth-constrained FL scenarios.
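A minimal sketch of the selection-and-signaling step summarized above, assuming a hypothetical equal-weighted score over normalized gradient variance and uplink bandwidth, quantized to a 2-bit level; the thresholds, weights, filtering ratios, and function names here are illustrative assumptions, not SCALP's published specification:

```python
# Illustrative sketch only: the weighting, normalization bounds, and the
# level-to-ratio table are assumptions, not the paper's exact protocol.

def select_compression_level(grad_variance: float,
                             uplink_bw_mbps: float,
                             max_variance: float = 1.0,
                             max_bw_mbps: float = 100.0) -> int:
    """Map local gradient variance and uplink bandwidth to a 2-bit level (0-3).

    Higher variance (more informative updates) and higher bandwidth both
    push toward lighter compression; low variance or a congested uplink
    pushes toward more aggressive filtering.
    """
    # Normalize both signals to [0, 1].
    v = min(grad_variance / max_variance, 1.0)
    b = min(uplink_bw_mbps / max_bw_mbps, 1.0)
    # Hypothetical joint score: equal weighting of relevance and capacity.
    score = 0.5 * v + 0.5 * b
    # Quantize into four levels; 3 means the lightest compression.
    return min(int(score * 4), 3)


def encode_ecn_bits(level: int) -> int:
    """Embed the 2-bit level in the ECN field (low 2 bits of the IP TOS byte)."""
    assert 0 <= level <= 3
    return level & 0b11


# Assumed mapping from level to fraction of gradient entries filtered out.
FILTER_RATIO = {0: 0.99, 1: 0.90, 2: 0.50, 3: 0.10}

level = select_compression_level(grad_variance=0.4, uplink_bw_mbps=20.0)
print(level, bin(encode_ecn_bits(level)), FILTER_RATIO[level])
```

Because the 2-bit level fits exactly in the ECN field, the signal travels with each packet at the IP layer and requires no per-worker state on the server and no changes to TCP or application-layer protocols.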
Citation
NOT PUBLISHED YET