Hi,
I have recently been working on BNNs for federated learning and found this nice library (TyXe). Is there a way for TyXe to expose the parameters of a BNN, so that we can do some aggregation across several BNNs?
For example, if we can get a weight of BNN1 as a Gaussian G(mean1, variance1) and the corresponding weight of BNN2 as G(mean2, variance2), we could aggregate them by fusing the two Gaussians.
I would really like to use and cite this code if possible.
Thanks.
Yes, that should be possible. The easiest way of accessing the parameters is through Pyro's parameter store, i.e. just call pyro.get_param_store(). This gives you a global key-value mapping where all parameters live.
Just be aware that if you instantiate multiple BNN objects, you will need to use the name argument in the __init__ function of the BNNs to avoid name clashes (we should mention that in the docs, thanks for bringing it up!). Let me know if you encounter any issues; I haven't tested this use case before, but I would expect things to work.
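For the fusion step itself, here is a minimal sketch of precision-weighted fusion (the product of two Gaussians) for a single weight. Note this is plain Python for illustration; in practice the means and variances would be read from pyro.get_param_store(), and the key names shown in the comments are hypothetical, since the actual keys depend on the name given to each BNN and its module structure:

```python
def fuse_gaussians(mean1, var1, mean2, var2):
    """Precision-weighted fusion of two Gaussian estimates of the
    same weight: precisions add, and the fused mean is the
    precision-weighted average of the two means."""
    fused_var = 1.0 / (1.0 / var1 + 1.0 / var2)
    fused_mean = fused_var * (mean1 / var1 + mean2 / var2)
    return fused_mean, fused_var

# Hypothetical per-weight posteriors from two clients' BNNs.
# With Pyro, these would come from the parameter store, e.g.
#   store = pyro.get_param_store()
#   mean1 = store["<site name>.loc"]; var1 = store["<site name>.scale"] ** 2
# where the exact site names depend on your model.
mean, var = fuse_gaussians(0.0, 1.0, 2.0, 1.0)
print(mean, var)  # 1.0 0.5
```

With equal variances the fused mean is just the average of the two means, and the fused variance is halved, which matches the intuition that combining two independent estimates tightens the posterior.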