Communication-Efficient (CE) Federated Learning (FEDL)
This project compares different methods for improving the communication efficiency of federated learning. The goal is to reduce the communication overhead between clients and the server while still retaining good model performance.
The experiments use the CIFAR-10 and FEMNIST datasets with a 500k-parameter convolutional model.
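For concreteness, here is a minimal PyTorch sketch of a convolutional model in that parameter range for CIFAR-10 inputs. The exact architecture is an assumption, not necessarily the model used in this project (FEMNIST would additionally need a 1x28x28 input and 62 output classes):

```python
# Hypothetical ~500k-parameter CNN for CIFAR-10 (3x32x32 inputs).
# The actual architecture used in this project may differ.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),   # 896 params
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 32x16x16
            nn.Conv2d(32, 64, kernel_size=3, padding=1),  # 18,496 params
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 64x8x8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 120),                   # ~492k params
            nn.ReLU(),
            nn.Linear(120, num_classes),                  # ~1.2k params
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = SmallCNN()
print(sum(p.numel() for p in model.parameters()))  # ~512k parameters total
```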
The three methods being compared are:
Each method is compared against the FedAvg baseline.
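For reference, FedAvg aggregates client updates by averaging model weights, weighted by each client's local dataset size. Below is a minimal sketch of that aggregation step; the function and variable names are illustrative, not this project's actual code:

```python
# Minimal FedAvg aggregation sketch: the server averages client model
# weights, weighted by each client's local dataset size.
from typing import Dict, List
import torch

def fedavg_aggregate(
    client_states: List[Dict[str, torch.Tensor]],
    client_sizes: List[int],
) -> Dict[str, torch.Tensor]:
    total = sum(client_sizes)
    global_state: Dict[str, torch.Tensor] = {}
    for key in client_states[0]:
        # Weighted sum of this parameter tensor across all clients.
        global_state[key] = sum(
            (n / total) * state[key].float()
            for state, n in zip(client_states, client_sizes)
        )
    return global_state
```

Every communication-efficiency method here trades off against this baseline: FedAvg sends full model weights each round, so any compression or reduction scheme is measured by how much it shrinks that payload and how much accuracy it gives up.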