Why is there no training_epoch_end? #1076
Comments
Hi! Thanks for your contribution, great first issue!
Didn't get around to it for this release, but feel free to PR it!
Do you think more people would want a list of every full batch output (i.e. the result of each training_step / training_step_end, if implemented), or the accumulated batch outputs?
I think it could work the same way the others do: as I understand it, they return a list of dicts, with each dict corresponding to the return of one step (the return of training_step, or training_end if it exists).
The only gotcha that we need to watch out for is that all of the collected outputs need to be detached so they don't keep the gradient trees in memory. I would suggest writing a method that recursively traverses the output dictionary and creates a new one with the same elements but all detached, then we can apply this to each output before adding it to the dict. Will also need some good tests to make sure that there aren't any leaks :)
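A minimal sketch of such a recursive detach, assuming the step outputs are nested dicts/lists/tuples of tensors (the `recursive_detach` name and structure here are illustrative, not Lightning's actual implementation):

```python
import torch


def recursive_detach(output):
    """Return a copy of `output` with every tensor detached from the autograd graph.

    Non-tensor leaves (floats, strings, ...) pass through unchanged, so the
    collected epoch outputs keep their structure without holding onto gradients.
    """
    if isinstance(output, torch.Tensor):
        return output.detach()
    if isinstance(output, dict):
        return {k: recursive_detach(v) for k, v in output.items()}
    if isinstance(output, (list, tuple)):
        return type(output)(recursive_detach(v) for v in output)
    return output
```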
The message here suggests using
Is
It is now available thanks to @jbschiratti! See #1357.
🚀 Feature
If I want to calculate and log average statistics for the training epoch, there seems to be no option to define a training_epoch_end in the LightningModule, as there is for validation_epoch_end and test_epoch_end.
Motivation
It seems very intuitive to have this function. I know the on_epoch_end hook exists, but the "outputs" object with the training history for that epoch is not available there.
Pitch
Same behavior as validation_epoch_end and test_epoch_end, but for training.
Sorry if there is something like this already, I just started to use PL (the master version).
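For illustration, a sketch of how the pitched hook could look, mirroring validation_epoch_end; the training_epoch_end signature and the returned 'log' dict below are assumptions based on how the validation/test hooks worked at the time, not a confirmed API:

```python
import torch
from torch import nn
import pytorch_lightning as pl


class MyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.layer(x), y)
        return {'loss': loss}

    def training_epoch_end(self, outputs):
        # `outputs` would be the list of (detached) dicts returned by
        # training_step over the whole epoch, just like the `outputs`
        # argument of validation_epoch_end / test_epoch_end.
        avg_loss = torch.stack([x['loss'] for x in outputs]).mean()
        return {'log': {'avg_train_loss': avg_loss}}

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)
```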