Commit 27bba1a

Author: Peter Izsak
Parent: 4ae31cd

Fix global_step when gradient accumulation > 1 (#832)

File tree

1 file changed: +3 -1 lines

pytorch_lightning/trainer/training_loop.py (+3, -1)
@@ -426,7 +426,9 @@ def run_training_epoch(self):
         # logs user requested information to logger
         self.log_metrics(batch_step_metrics, grad_norm_dic)

-        self.global_step += 1
+        # progress global step according to grads progress
+        if (self.batch_idx + 1) % self.accumulate_grad_batches == 0:
+            self.global_step += 1
         self.total_batch_idx += 1

         # end epoch early
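The fix above makes `global_step` count optimizer steps rather than batches: when gradients are accumulated over several batches, the optimizer only steps on accumulation boundaries, so the step counter should only advance there. A minimal, standalone sketch of that counting logic (the loop and variable names mirror the diff, but this is an illustration, not the actual Lightning training loop):

```python
# Sketch: with gradient accumulation, global_step advances only when the
# accumulated gradients are actually applied (every N-th batch), while
# total_batch_idx advances on every batch. Values here are illustrative.
accumulate_grad_batches = 4
global_step = 0
total_batch_idx = 0

for batch_idx in range(8):  # an epoch of 8 batches
    # ... forward/backward would run here, accumulating gradients ...
    # The optimizer steps only once per `accumulate_grad_batches` batches,
    # so the step counter is gated on the same boundary condition.
    if (batch_idx + 1) % accumulate_grad_batches == 0:
        global_step += 1
    total_batch_idx += 1

print(global_step, total_batch_idx)  # → 2 8
```

Before this change, `global_step += 1` ran unconditionally, so with `accumulate_grad_batches=4` the counter would report 8 steps for only 2 actual optimizer updates, skewing anything keyed to it (e.g. logging intervals).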
