ASTER: Natural and Multi-language Unit Test Generation with LLMs
Rangeet Pan, Myeongsoo Kim, et al.
ICSE 2025
Cristiano Malossi, Roy Assaf, et al.
IABMAS 2024
S. Ilker Birbil, Donato Maragno, et al.
AAAI 2023
Alexander Erben, Gauri Joshi, et al.
ICML 2025
We introduce a novel approach for improving communication efficiency in Federated Learning (FL). The approach leverages sketching techniques in two complementary strategies that exploit similarities in the data transmitted during FL training: one identifies opportunities to skip the expensive communication of updated models in training iterations, and the other dynamically selects subsets of clients hosting diverse models. Our extensive experimental evaluation across models, datasets, and label distributions shows that these strategies can reduce downlink and uplink communication volumes by factors of 100× or more, with minor degradation of, or even an increase in, the accuracy of the trained model. Moreover, in contrast to baselines, these strategies can escape suboptimal descent paths and yield smooth, non-oscillatory accuracy profiles on non-IID data distributions.
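The abstract does not specify which sketch or skipping rule the paper uses, so the following is only a minimal illustrative sketch of the communication-skipping idea: it assumes a Gaussian random-projection sketch and a cosine-similarity threshold, and every name and parameter (SKETCH_DIM, SKIP_THRESHOLD, should_skip_upload) is hypothetical rather than taken from the paper.

```python
import numpy as np

# Illustrative only: assumes a Gaussian random-projection sketch and a
# cosine-similarity skipping rule; both are hypothetical stand-ins for
# whatever the paper actually uses.

SKETCH_DIM = 256       # compressed dimension (hypothetical)
SKIP_THRESHOLD = 0.99  # similarity above which an upload is skipped

rng = np.random.default_rng(0)

def make_sketcher(model_dim: int):
    """Return a function projecting a model update into SKETCH_DIM dims."""
    proj = rng.normal(size=(SKETCH_DIM, model_dim)) / np.sqrt(SKETCH_DIM)
    return lambda update: proj @ update

def should_skip_upload(sketch_now, sketch_last_sent) -> bool:
    """Skip the full upload when the sketched updates are nearly parallel."""
    if sketch_last_sent is None:
        return False
    cos = sketch_now @ sketch_last_sent / (
        np.linalg.norm(sketch_now) * np.linalg.norm(sketch_last_sent) + 1e-12
    )
    return cos > SKIP_THRESHOLD

# Toy client loop: only the small sketch is always compared locally; the
# full model update is uploaded only when it has drifted from the last
# update that was actually sent.
model_dim = 10_000
sketch = make_sketcher(model_dim)
last_sent_sketch = None
for round_idx in range(5):
    update = rng.normal(size=model_dim)  # stand-in for a local training update
    s = sketch(update)
    if should_skip_upload(s, last_sent_sketch):
        print(f"round {round_idx}: skip upload (sketches similar)")
    else:
        print(f"round {round_idx}: upload full update ({model_dim} floats)")
        last_sent_sketch = s
```

Comparing SKETCH_DIM-sized projections instead of full updates is what makes the similarity check cheap enough to run every round; the same sketches could, in principle, also drive the diversity-based client selection the abstract mentions, though that part is not sketched here.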