
plots don't reproduce on Binder (also tested locally) #19

Open
lukasheinrich opened this issue Jun 12, 2019 · 4 comments

Comments


lukasheinrich commented Jun 12, 2019

I wanted to show the Toy notebook to someone as a demo. The non-adversarial training works as expected, but it seems like there is a bug in the adversarial training or the Binder setup.

I see the same problems locally (but I'm mostly replicating the Binder setup, so I'm not sure that's an independent test).

I briefly skimmed the code and nothing obvious stands out (but I'm not too familiar with the code)

[screenshot: loss plots from the adversarial training that do not match the expected figures]

@lukasheinrich lukasheinrich changed the title plots don't reproduce plots don't reproduce on Binder (also tested locally) Jun 12, 2019

glouppe commented Jun 12, 2019

Thanks Lukas! I'll try to see what's wrong... (nothing was changed).

For the demo you wanted to show, maybe https://github.com/glouppe/notebooks/blob/master/Fair%20classifier%20with%20adversarial%20networks.ipynb would work? (although I do remember bugs with newer versions of Keras)


lukasheinrich commented Jun 12, 2019

One subtle thing I noticed with newer versions of Keras is that the argument order of binary_crossentropy was reversed. After fixing this, I got the notebook working for the non-adversarial training with recent versions of sklearn and Keras.
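The sensitivity to argument order can be illustrated with a plain-NumPy version of the loss (a sketch for illustration, not the Keras implementation itself): the two arguments play asymmetric roles, so swapping labels and predictions silently changes the value.

```python
import numpy as np

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    # Mean over samples of -[y*log(p) + (1-y)*log(1-p)];
    # y_true holds labels, y_pred holds predicted probabilities.
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return float(np.mean(-(y_true * np.log(y_pred)
                           + (1 - y_true) * np.log(1 - y_pred))))

y_true = np.array([0.0, 1.0, 1.0])
y_pred = np.array([0.1, 0.8, 0.9])

# The loss is not symmetric in its arguments, so code written for one
# argument convention silently misbehaves under the other.
print(binary_crossentropy(y_true, y_pred))  # labels first: correct
print(binary_crossentropy(y_pred, y_true))  # swapped: takes log of labels
```
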

Is it maybe some kind of sign / transposition problem? The resulting isosurfaces seem horizontal instead of vertical but maybe that’s just chance


caph1993 commented Nov 18, 2019

Hello, I am facing the same issue. Binder does not start, and when I run the notebook locally, my output differs from the expected one.

After correcting the order of the arguments in binary_crossentropy, the first plot was produced correctly, but I haven't figured out how to fix the loss plots; I get the same loss plots that Lukas showed. If it helps, I am also getting the warning 'Discrepancy between trainable weights and collected trainable'.

I ran the simpler notebook you recommended, https://github.com/glouppe/notebooks/blob/master/Fair%20classifier%20with%20adversarial%20networks.ipynb, and after switching the order of the arguments in binary_crossentropy, the loss plots were OK.

@caph1993

I managed to recreate the results.

```python
import numpy as np
from keras import backend as K

def make_loss_R(lam, n_components):
    nc = n_components
    def loss(z_true, z_pred):
        # Keras passes z_true with shape (batch, 1); flatten it so the
        # arithmetic below stays elementwise instead of broadcasting.
        z_true = K.reshape(z_true, (-1,))
        # z_pred packs the mixture parameters side by side:
        # means, standard deviations, and mixing weights.
        mu = z_pred[:, :nc]
        sigma = z_pred[:, nc:2*nc]
        pi = z_pred[:, 2*nc:]
        ss = 2 * sigma**2
        for i in range(nc):
            # Gaussian density of component i, weighted by pi_i.
            f = pi[:, i] * K.exp(-(z_true - mu[:, i])**2 / ss[:, i]) \
                / K.sqrt(np.pi * ss[:, i])
            out = f if i == 0 else out + f
        # Negative log-likelihood of the Gaussian mixture.
        return -K.mean(K.log(out))
    return loss
```
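The mixture loss above can be sanity-checked against a plain-NumPy re-implementation (a sketch under the assumption that it mirrors the Keras version: per-sample Gaussian-mixture negative log-likelihood, with the same mu/sigma/pi column layout):

```python
import numpy as np

def mixture_nll(z_true, mu, sigma, pi):
    # z_true: (batch,); mu, sigma, pi: (batch, nc) with rows of pi summing to 1.
    z = z_true.reshape(-1, 1)                       # column vector, (batch, 1)
    ss = 2.0 * sigma**2
    # Weighted Gaussian densities, one column per mixture component.
    comp = pi * np.exp(-(z - mu)**2 / ss) / np.sqrt(np.pi * ss)
    # Negative mean log-likelihood over the batch.
    return float(-np.mean(np.log(comp.sum(axis=1))))

rng = np.random.default_rng(0)
batch, nc = 8, 3
z = rng.normal(size=batch)
mu = rng.normal(size=(batch, nc))
sigma = rng.uniform(0.5, 1.5, size=(batch, nc))
pi = rng.dirichlet(np.ones(nc), size=batch)         # rows sum to 1

print(mixture_nll(z, mu, sigma, pi))
```

With a single standard-normal component and z_true at its mean, the value reduces to log(sqrt(2*pi)), which gives a quick closed-form check.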

I think this exposes a trap in TensorFlow's design: if the line that reshapes the z_true parameter is removed, the code looks identical and yields the same values when evaluated on random inputs of matching shape, yet it behaves differently during training.
