Replies: 1 comment
@Chucksete thanks for the interest. From your comment, I understand that you first train the global model on 1/4 of your data and then do federated fine-tuning with 3 clients.
Hello,
I have created a simple federated scenario for image classification with a non-standard network. When I train the network in a non-federated setting, I get fairly good results. When I train it federated, the global weights come back more or less unmodified, and I don't know what I am doing wrong.
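One quick way to confirm that the global weights really come back (almost) unmodified is to diff the initial and final global checkpoints parameter by parameter. A minimal stdlib-only sketch with flat dicts of floats standing in for a real checkpoint (the helper name `max_weight_delta` is made up for illustration):

```python
def max_weight_delta(before, after):
    """Largest absolute per-parameter change between two flat weight dicts."""
    return max(
        max((abs(a - b) for a, b in zip(after[k], before[k])), default=0.0)
        for k in before
    )

# Toy checkpoints: a near-zero delta means the global model barely moved.
before = {"conv1": [0.1, -0.2], "fc": [0.5]}
after = {"conv1": [0.1, -0.2], "fc": [0.5001]}
delta = max_weight_delta(before, after)
```

With real checkpoints you would load the saved weights (e.g. via `torch.load`) and flatten each tensor before comparing; the idea is the same.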
Initially I split my dataset into four distinct sets. I train the global model with the first split and assign the remaining three to the clients, one distinct split per client.
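For concreteness, the four-way split described above can be sketched with Python's stdlib (the function name `split_dataset` is illustrative, not part of any framework):

```python
import random

def split_dataset(indices, n_splits=4, seed=0):
    """Shuffle sample indices and cut them into n_splits disjoint parts."""
    rng = random.Random(seed)
    shuffled = list(indices)
    rng.shuffle(shuffled)
    size = len(shuffled) // n_splits
    return [shuffled[i * size:(i + 1) * size] for i in range(n_splits)]

# First split pretrains the global model; the other three go to the clients.
pretrain_split, *client_splits = split_dataset(range(1000))
```

Because the splits are cut to equal size, each client ends up with roughly the same number of samples, which is the premise behind weighting all clients equally during aggregation.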
I initialize the global model via `source_ckpt_file_full_name` and set `MetaKey.NUM_STEPS_CURRENT_ROUND = 1`, since my data splits (for the clients) are more or less equal in size. I use the normal Scatter and Gather approach with the `InTimeAccumulateWeightedAggregator`. After training, I download the global weights and validate them on all the data. The results are bad: it looks as if the global model learned nothing, or very little. The local models themselves show a learning curve when I use them for validation.
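For reference, the weighted accumulation can be sketched in plain Python. This is a simplified stand-in for what `InTimeAccumulateWeightedAggregator` does, not NVFlare's actual code: each client contribution is weighted by its reported step count (the `NUM_STEPS_CURRENT_ROUND` meta value), so setting it to 1 everywhere yields a plain average of the client updates.

```python
def aggregate(contributions):
    """Step-count-weighted average of client weight vectors.

    contributions: list of (weights, n_steps) pairs, where weights is a
    flat list of floats and n_steps mirrors NUM_STEPS_CURRENT_ROUND.
    """
    total_steps = sum(n for _, n in contributions)
    dim = len(contributions[0][0])
    agg = [0.0] * dim
    for weights, n_steps in contributions:
        for i, w in enumerate(weights):
            agg[i] += w * n_steps / total_steps
    return agg

# Three clients reporting equal step counts -> plain mean of the updates.
global_w = aggregate([([1.0, 2.0], 1), ([3.0, 4.0], 1), ([5.0, 6.0], 1)])
```

Note that the effect of aggregation also depends on whether clients submit full weights or weight diffs relative to the received global model; if the two are mixed up, the aggregated result can look as if nothing was learned.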
Unfortunately, I cannot share code.