Great work! But I have a problem when finetuning the model. I added the following code in the main/trainer.py file, trying to inspect the gradients of the DynamiCrafter model while finetuning on my own dataset, but every parameter in the model has a None gradient. Why is that happening?
```python
class PrintGradientsCallback(pl.Callback):
    def on_after_backward(self, trainer, pl_module):
        print("Gradients after backward pass:")
        for name, param in pl_module.named_parameters():
            # Note: this only prints parameters whose gradient is not None
            if param.grad is not None:
                print(f"Gradient of {name}: {param.grad.norm()}")

trainer_kwargs["callbacks"].append(PrintGradientsCallback())
```
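For context, a parameter's `.grad` stays `None` after `backward()` in two common situations: the parameter was frozen with `requires_grad=False` (typical in finetuning setups that train only part of the model), or the parameter simply does not participate in the loss. The sketch below (plain PyTorch, no Lightning, with hypothetical module names `lin` and `frozen`) demonstrates both cases:

```python
import torch

# Two small layers: one trained normally, one frozen as in partial finetuning.
lin = torch.nn.Linear(2, 2)
frozen = torch.nn.Linear(2, 2)
for p in frozen.parameters():
    p.requires_grad_(False)  # frozen: .grad will remain None after backward

x = torch.randn(4, 2)
loss = lin(x).sum()  # 'frozen' takes no part in the loss either
loss.backward()

print(all(p.grad is not None for p in lin.parameters()))  # True
print(all(p.grad is None for p in frozen.parameters()))   # True
```

If a gradient-printing callback shows nothing for any parameter, it is worth checking which parameters the finetuning config actually unfreezes, and whether the training strategy (e.g. a sharded or mixed-precision plugin) exposes per-parameter grads at the point where the callback runs.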