
CE-FEDL

Communication Efficient (CE) - Federated Learning (FEDL)

This project compares different methods for improving the communication efficiency of federated learning. The goal is to reduce communication overhead between clients and the server while still retaining good model performance.

For the experiments, the CIFAR-10 and FEMNIST datasets are used with a 500k-parameter convolutional model.
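The README states only the parameter budget, not the exact layers. The sketch below is an assumed PyTorch CNN in that size range for CIFAR-10-shaped inputs (3x32x32); all layer sizes are illustrative and not taken from this repository, and the parameter count is printed so the budget can be checked.

```python
# Illustrative sketch only: the exact architecture is not described in this README.
# An assumed ~500k-parameter CNN for 3x32x32 inputs (CIFAR-10-like).
import torch
import torch.nn as nn


class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),   # -> 32 x 32 x 32
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 32 x 16 x 16
            nn.Conv2d(32, 64, kernel_size=3, padding=1),  # -> 64 x 16 x 16
            nn.ReLU(),
            nn.MaxPool2d(2),                              # -> 64 x 8 x 8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 120),
            nn.ReLU(),
            nn.Linear(120, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


if __name__ == "__main__":
    model = SmallCNN()
    # Check the parameter budget (roughly 0.5M with these layer sizes).
    print(sum(p.numel() for p in model.parameters()))
```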

Methods

The three methods being compared are:

These methods are compared against the FedAvg baseline; a sketch of the FedAvg aggregation step follows.
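For reference, FedAvg aggregates client models by averaging their weights, weighted by the size of each client's local dataset (McMahan et al., 2017). The sketch below shows that aggregation step in PyTorch; the function name and structure are illustrative, not taken from this repository, and it assumes every state-dict entry is a floating-point tensor.

```python
# Minimal sketch of the FedAvg aggregation step used as the baseline:
# the server averages client weights, weighted by local dataset size.
# Names here are illustrative and not taken from this repository.
from collections import OrderedDict
from typing import List

import torch


def fedavg_aggregate(
    client_states: List[OrderedDict], client_sizes: List[int]
) -> OrderedDict:
    """Return a new global state dict as the size-weighted average of client states.

    Assumes all entries are floating-point tensors; integer buffers
    (e.g. BatchNorm counters) would need separate handling.
    """
    total = sum(client_sizes)
    aggregated = OrderedDict()
    for key in client_states[0]:
        aggregated[key] = sum(
            state[key].float() * (n / total)
            for state, n in zip(client_states, client_sizes)
        )
    return aggregated


if __name__ == "__main__":
    # Toy usage: three clients with differently sized local datasets.
    clients = [torch.nn.Linear(4, 2).state_dict() for _ in range(3)]
    new_global = fedavg_aggregate(clients, client_sizes=[100, 50, 25])
    print(new_global["weight"].shape)  # torch.Size([2, 4])
```

In this baseline every client uploads its full model state each round; the communication-efficient methods under comparison aim to shrink that per-round upload while keeping accuracy close to FedAvg.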
