Hi, thanks for the amazing work!
I am trying to implement a plain version of Grad-CAM by myself. However, when I used register_module_full_backward_hook instead of register_forward_hook, I got different values for many elements of the gradient tensor, although around 70% of the tensor is equal.
pytorch-grad-cam/pytorch_grad_cam/activations_and_gradients.py
Line 17 in 60cf39d
I wonder if there is something I have not figured out. Is it expected that they show different results?
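For reference, here is a minimal sketch of the two approaches I am comparing: a tensor hook registered on the layer's output from inside a forward hook (as in activations_and_gradients.py), versus register_full_backward_hook on the module. The model, target layer, and shapes below are hypothetical, just for illustration. Note that with in-place operations downstream of the hooked layer (e.g. ReLU(inplace=True)), the two mechanisms can see different points in the autograd graph, which might explain a partial mismatch; with inplace=False the gradients should agree.

```python
import torch
import torch.nn as nn

# Hypothetical toy model; the target layer is the first conv.
model = nn.Sequential(
    nn.Conv2d(3, 4, 3, padding=1),
    nn.ReLU(inplace=False),  # inplace=True can make the two hooks disagree
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(4, 2),
)
target_layer = model[0]
grads = {}

def forward_hook(module, inputs, output):
    # Tensor hook on the layer's output, registered during the forward pass
    output.register_hook(lambda g: grads.__setitem__("tensor_hook", g.detach()))

def full_backward_hook(module, grad_input, grad_output):
    # grad_output[0] is the gradient w.r.t. the module's output
    grads["module_hook"] = grad_output[0].detach()

h1 = target_layer.register_forward_hook(forward_hook)
h2 = target_layer.register_full_backward_hook(full_backward_hook)

x = torch.randn(1, 3, 8, 8)
model(x)[0, 1].backward()

# With no in-place ops after the target layer, both hooks should capture
# the same gradient tensor.
print(torch.allclose(grads["tensor_hook"], grads["module_hook"]))
h1.remove()
h2.remove()
```

Running this with inplace=False prints True for me, which is why I suspect the mismatch in my real setup comes from something else in the graph between the hooked layer and the loss.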