Add `ckpt_path` option to `LightningModule.test()` introduced robustness issues #2275
## 🐛 Bug
`Trainer.test()`, when passed no explicit model, now looks for the best model checkpoint. However, this can fail for models whose `__init__` methods take more than just `hparams`, which is common.

If I have a PL model with init (kw)args `(self, hparams: Namespace, fancy_thing: FancyThing)`, one runs into the problems outlined below.

With the new codepath, `Trainer.test()` calls down via this path, which soon calls down to
`pytorch_lightning/core/saving.py`'s `_load_model_state`.

This correctly collects `init_args_name` (here `hparams, fancy_thing`), but the subsequent logic does not check whether all the requisite params have been found. This means the final line of that snippet fails because `fancy_thing` is not passed to `cls`.

Even if it were checked, though, there is no mechanism here to recover `fancy_thing`. Lightning might need to, e.g., pickle the model init args to rehydrate them later, or provide a user hook specifying how the rehydration should occur. That said, there may be a simple workaround I do not see immediately.

## To Reproduce
Call `Trainer.test()` passing no model, where the implicit model has init args that are not just `hparams`.

## Code sample
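To make the failure mode concrete, here is a minimal self-contained sketch (hypothetical names, not Lightning's actual code) of rehydration logic that recovers only `hparams` and therefore cannot re-instantiate a model with an extra required init arg:

```python
from argparse import Namespace

class FancyThing:
    """Stand-in for a dependency the checkpoint cannot recover (hypothetical)."""

class MyModel:
    # Mimics a LightningModule whose __init__ needs more than hparams.
    def __init__(self, hparams: Namespace, fancy_thing: FancyThing):
        self.hparams = hparams
        self.fancy_thing = fancy_thing

def rehydrate(cls, checkpoint: dict):
    # Sketch of the failing logic: only hparams are recovered from the
    # checkpoint, so any extra required __init__ args are never supplied.
    hparams = Namespace(**checkpoint["hparams"])
    return cls(hparams)  # raises TypeError: missing 'fancy_thing'

try:
    rehydrate(MyModel, {"hparams": {"lr": 1e-3}})
except TypeError as exc:
    print(f"rehydration failed: {exc}")
```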
I've tried to document the code flow that triggers this clearly above, but please @ me for any more context.
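As a stopgap, the loading logic could compare the collected init arg names against what is actually available before calling `cls`, and raise a clear error instead of an opaque `TypeError`. A rough sketch of what that check could look like (my own, not Lightning's code):

```python
import inspect

def check_and_instantiate(cls, available: dict):
    # Collect the names of cls.__init__ parameters (as _load_model_state
    # does) and verify every required one can be supplied before calling cls.
    params = inspect.signature(cls.__init__).parameters
    init_args_name = [n for n in params if n != "self"]
    missing = [
        n for n in init_args_name
        if n not in available and params[n].default is inspect.Parameter.empty
    ]
    if missing:
        raise ValueError(
            f"Cannot rehydrate {cls.__name__}: checkpoint is missing init "
            f"args {missing}; pass the model explicitly instead."
        )
    return cls(**{n: available[n] for n in init_args_name})
```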
## Expected behavior
Rehydration should occur properly, or, as a stopgap, an error should be raised explaining that the model must be passed explicitly to `Trainer.test()` in this case.

## Additional context
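One shape the pickling idea mentioned above could take (entirely hypothetical, and it assumes every init arg is picklable, which a `FancyThing` may well not be):

```python
import pickle

def save_checkpoint(path: str, model_state: dict, init_args: dict):
    # Persist ALL __init__ args alongside the weights, not just hparams.
    with open(path, "wb") as f:
        pickle.dump({"state": model_state, "init_args": init_args}, f)

def load_model(path: str, cls):
    with open(path, "rb") as f:
        ckpt = pickle.load(f)
    # Every original init arg is available, so cls can be re-instantiated
    # even when its __init__ needs more than hparams.
    return cls(**ckpt["init_args"])
```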
Thanks for the great library; really appreciate the team's work!