
Publication | Open Access

Federated Learning with Matched Averaging

Year: 2020 | Citations: 102 | References: 16

TLDR

Federated learning enables edge devices to collaboratively train a shared model while keeping data locally, thereby separating training from cloud storage. The paper proposes the Federated Matched Averaging (FedMA) algorithm for federated learning of modern neural network architectures. FedMA constructs a global model layer-wise by matching and averaging hidden elements—channels in convolution layers, hidden states in LSTMs, and neurons in fully connected layers—based on similar feature-extraction signatures. Experiments show that FedMA outperforms leading federated learning algorithms on deep CNN and LSTM models trained on real-world data while also reducing communication overhead.

Abstract

Federated learning allows edge devices to collaboratively learn a shared model while keeping the training data on device, decoupling the ability to do model training from the need to store the data in the cloud. We propose the Federated Matched Averaging (FedMA) algorithm, designed for federated learning of modern neural network architectures, e.g., convolutional neural networks (CNNs) and LSTMs. FedMA constructs the shared global model in a layer-wise manner by matching and averaging hidden elements (i.e., channels for convolution layers, hidden states for LSTMs, and neurons for fully connected layers) with similar feature extraction signatures. Our experiments indicate that FedMA not only outperforms popular state-of-the-art federated learning algorithms on deep CNN and LSTM architectures trained on real-world datasets, but also reduces the overall communication burden.
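
The matching step is what separates this approach from coordinate-wise averaging schemes such as FedAvg: hidden elements from different clients are first aligned so that only elements with similar feature-extraction signatures are averaged together. Below is a minimal sketch of that idea for a single fully connected layer across two clients. The paper derives its matching from a Bayesian nonparametric formulation; this sketch substitutes a simpler Hungarian assignment on cosine similarity of neuron weight vectors, so the function name, the similarity measure, and the two-client setting are illustrative assumptions rather than the paper's exact method.

# A minimal sketch of layer-wise matched averaging for one fully connected
# layer across two clients. The matching criterion (cosine similarity) and
# the Hungarian solver stand in for FedMA's Bayesian nonparametric matching.
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_and_average(W_a: np.ndarray, W_b: np.ndarray) -> np.ndarray:
    """Align neurons (rows) of client B's layer to client A's, then average.

    W_a, W_b: weight matrices of shape (num_neurons, fan_in), one per client.
    Returns a matched-averaged global layer of the same shape.
    """
    # Cosine similarity between every pair of neurons across the two clients.
    a = W_a / np.linalg.norm(W_a, axis=1, keepdims=True)
    b = W_b / np.linalg.norm(W_b, axis=1, keepdims=True)
    sim = a @ b.T  # shape: (num_neurons_a, num_neurons_b)

    # Hungarian algorithm: maximize total similarity (minimize its negative).
    row_idx, col_idx = linear_sum_assignment(-sim)

    # Permute client B's neurons into client A's ordering, then average.
    return 0.5 * (W_a[row_idx] + W_b[col_idx])

# Toy usage: client B's neurons are a permutation of client A's plus noise.
rng = np.random.default_rng(0)
W_a = rng.normal(size=(8, 16))
W_b = W_a[rng.permutation(8)] + 0.01 * rng.normal(size=(8, 16))
W_global = match_and_average(W_a, W_b)
print(np.abs(W_global - W_a).max())  # small: matched neurons were averaged

Because client B's neurons are permuted back into client A's ordering before averaging, neurons computing similar features are combined; naive coordinate-wise averaging would blend unrelated neurons and degrade the global model, which is the failure mode matched averaging is designed to avoid.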

