The fast_gradient_method in the JAX implementation now uses cross-entropy loss by default for crafting adversarial examples:
cleverhans/cleverhans/future/jax/attacks/fast_gradient_method.py
Line 40 in 4b5ce54
It is apparently not always correct to assume people are using cross-entropy loss.
Describe the solution you'd like
The most straightforward solution would be to pass the loss function being used as an extra parameter to both the fgsm and pgd functions. This would also be consistent with the attacks implemented in other frameworks, such as in TF:
cleverhans/cleverhans/attacks/fast_gradient_method.py
Line 58 in 4b5ce54
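A minimal sketch of the proposed signature, assuming a hypothetical `loss_fn(logits, y)` parameter (the function name and finite-difference gradient below are illustrative only; a real implementation would use `jax.grad` to differentiate the loss w.r.t. the input):

```python
import numpy as np

def fgsm(predict, x, y, eps, loss_fn):
    """FGSM sketch with a pluggable loss_fn(logits, y) -> scalar,
    instead of hard-coding cross-entropy inside the attack.

    The gradient w.r.t. x is approximated here by central finite
    differences so the sketch is self-contained; the JAX version
    would simply call jax.grad on the composed loss.
    """
    def loss_at(x_):
        return loss_fn(predict(x_), y)

    # Finite-difference gradient of the loss w.r.t. the input.
    grad = np.zeros_like(x, dtype=float)
    h = 1e-5
    for i in range(x.size):
        d = np.zeros_like(x, dtype=float)
        d.flat[i] = h
        grad.flat[i] = (loss_at(x + d) - loss_at(x - d)) / (2 * h)

    # Perturb in the direction that increases the chosen loss.
    return x + eps * np.sign(grad)
```

A caller could then pass any differentiable loss (hinge, MSE, CW-style margin loss, ...) without the attack assuming cross-entropy.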
An alternative would be, instead of passing the predict function, to pass a model object that has both the predict function and the loss function registered on it.
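The model-object alternative could look roughly like the following. The `Model` container, its field names, and the `numerical_grad` helper are all hypothetical, sketched here with NumPy so the example is self-contained:

```python
import numpy as np
from dataclasses import dataclass
from typing import Callable

def numerical_grad(f, x, h=1e-5):
    # Central finite-difference gradient; stands in for jax.grad here.
    g = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        d = np.zeros_like(x, dtype=float)
        d.flat[i] = h
        g.flat[i] = (f(x + d) - f(x - d)) / (2 * h)
    return g

@dataclass
class Model:
    # Hypothetical container bundling the forward pass with its loss.
    predict: Callable[[np.ndarray], np.ndarray]
    loss_fn: Callable[[np.ndarray, np.ndarray], float]

def fgsm(model, x, y, eps):
    # The attack reads the loss from the model object rather than
    # assuming cross-entropy, so the signature stays unchanged when
    # users switch losses.
    grad = numerical_grad(lambda x_: model.loss_fn(model.predict(x_), y), x)
    return x + eps * np.sign(grad)
```

The trade-off versus the extra-parameter approach: the attack signatures stay small, but callers must wrap their predict function in the model object even when the default loss would have been fine.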