
Commit d4571d1

Ir1d authored and williamFalcon committed
filter param with no grad (#579)
1 parent b5b77e4 commit d4571d1

File tree

1 file changed (+1, -1)


pytorch_lightning/trainer/training_tricks_mixin.py

(+1, -1)
@@ -13,7 +13,7 @@ def clip_gradients(self):
     def print_nan_gradients(self):
         model = self.get_model()
         for param in model.parameters():
-            if torch.isnan(param.grad.float()).any():
+            if (param.grad is not None) and torch.isnan(param.grad.float()).any():
                 logging.info(param, param.grad)
 
     def configure_accumulated_gradients(self, accumulate_grad_batches):
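
For context, a minimal sketch (not part of the repository) of why the `param.grad is not None` guard is needed: parameters that never receive a gradient, for example frozen or unused ones, keep `grad` set to `None`, so calling `.float()` on it raises an `AttributeError`. The toy model and log message below are illustrative assumptions, not Lightning code.

import logging

import torch
from torch import nn

# Hypothetical toy model: the "unused" layer never participates in the
# forward pass, so its parameters' .grad stays None after backward().
model = nn.ModuleDict({
    "used": nn.Linear(4, 1),
    "unused": nn.Linear(4, 1),
})

loss = model["used"](torch.randn(2, 4)).sum()
loss.backward()

for param in model.parameters():
    # Without the None check, param.grad.float() would raise AttributeError
    # for the unused layer's parameters.
    if (param.grad is not None) and torch.isnan(param.grad.float()).any():
        logging.info("NaN gradient detected: %s", param)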
