Proceedings:
Proceedings of the AAAI Conference on Artificial Intelligence, 36
Issue:
No. 4: AAAI-22 Technical Tracks 4
Track:
AAAI Technical Track on Data Mining and Knowledge Management
Abstract:
The Top-k sparsification method is popular and powerful for reducing the communication cost in Federated Learning (FL). However, according to our experimental observation, most of the total communication cost is spent on the indices of the selected parameters (i.e., their position information), which is inefficient for FL training. To solve this problem, we propose an FL compression algorithm for convolutional neural networks (CNNs), called SmartIdx, which extends the traditional Top-k largest-variation selection strategy into convolution-kernel-based selection, reducing the proportion of indices in the overall communication cost and thus achieving a high compression ratio. The basic idea of SmartIdx is to improve the 1:1 proportion between the values and indices of the parameters to n:1 by regarding the convolution kernel as the basic unit in parameter selection, which can potentially deliver more information to the parameter server under limited network traffic. To this end, we design a set of rules for judging which kernels should be selected and propose corresponding packaging strategies to further improve the compression ratio. Experiments on mainstream CNNs and datasets show that SmartIdx achieves a 2.5×–69.2× higher compression ratio than state-of-the-art FL compression algorithms without degrading model performance.
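The n:1 packaging idea can be made concrete with a short sketch. The following is a minimal illustration, not the authors' implementation: it contrasts element-wise Top-k, which transmits one index per value, with a kernel-based selection in the spirit of SmartIdx, which transmits one index per 3×3 kernel, so a single index covers nine values. The names (delta, k_elems) and the L1 variation score are illustrative assumptions.

# Minimal sketch (assumed names, not the paper's code): compare the
# transmission payload of element-wise Top-k with a kernel-based
# selection in the spirit of SmartIdx.
import numpy as np

rng = np.random.default_rng(0)
delta = rng.normal(size=(64, 32, 3, 3))   # hypothetical conv-layer update

# Element-wise Top-k: every transmitted value needs its own index (1:1).
k_elems = 1024
flat = np.abs(delta).ravel()
top_idx = np.argpartition(flat, -k_elems)[-k_elems:]
elementwise_payload = top_idx.size * 2    # values + one index per value

# Kernel-based selection: score each 3x3 kernel by its total variation
# and transmit whole kernels, so one index covers n = 9 values (n:1).
n = delta.shape[-2] * delta.shape[-1]     # 9 values per kernel
kernels = delta.reshape(-1, n)            # (64*32, 9) kernel matrix
scores = np.abs(kernels).sum(axis=1)      # per-kernel variation score
k_kernels = k_elems // n                  # keep the same value budget
top_kernels = np.argpartition(scores, -k_kernels)[-k_kernels:]
kernel_payload = top_kernels.size * n + top_kernels.size  # values + indices

print("element-wise payload:", elementwise_payload, "numbers")  # 2048
print("kernel-based payload:", kernel_payload, "numbers")       # 1130

Under this simple accounting, the kernel-based scheme sends roughly the same number of values while cutting the index overhead by a factor of n; the paper's selection rules and packaging strategies refine this trade-off further.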
DOI:
10.1609/aaai.v36i4.20345