load_from_checkpoint: TypeError: __init__() missing 1 required positional argument #2909
Comments
Did you try to call `self.save_hyperparameters()`?

@awaelchli https://pytorch-lightning.readthedocs.io/en/latest/hyperparameters.html
@siahuat0727 I can confirm this is a bug. I fixed it and reduced your example to a minimal test case, so it won't break in the future. Thanks for providing an easy-to-reproduce script!

Great job. Thanks!!
The same issue appears with version 1.0.5 (0.9.0 is fine). Can you help with it? (Also, `track_grad_norm` doesn't work in 0.9.0, so I have to switch to 1.0.5...)

Bump

bump for version 1.0.6 as well

same problem here on 1.0.4
Apparently the problem is in how the `hparams` argument is passed to the constructor.

What solved it for me is that, instead of passing a single `hparams` object, I pass the arguments individually and call `self.save_hyperparameters()` in `__init__`. Still a bug though, because the `hparams` method is not yet deprecated.
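A minimal sketch of that decoupled style (class and argument names here are illustrative, not taken from the original post):

```python
import pytorch_lightning as pl
from torch import nn


class Model(pl.LightningModule):
    # pass plain keyword arguments instead of a single hparams object
    def __init__(self, input_dim=32, lr=1e-3):
        super().__init__()
        # records every __init__ argument in the checkpoint, so that
        # load_from_checkpoint() can re-instantiate the model later
        self.save_hyperparameters()
        self.layer = nn.Linear(self.hparams.input_dim, 2)
```

With this, `Model.load_from_checkpoint(ckpt_path)` can rebuild the model without any extra constructor arguments.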
@stathius Yes, the "old" `hparams` method is not yet deprecated, but it simply has conceptual flaws in terms of typing that cannot be fixed as a "bugfix". The solution we came up with here is to simply decouple two things: the arguments the model's `__init__` takes, and the hyperparameters that get saved to (and restored from) the checkpoint via `save_hyperparameters()`.

And the code you posted is exactly doing that, and this is the recommended way today.
What can I do if I already trained my models without calling `save_hyperparameters()`?
@pietz In this case you can instantiate your model normally and load the weights manually:

```python
checkpoint = torch.load(path)
model.load_state_dict(checkpoint["state_dict"])
```
@awaelchli Ah, thank you. Looking back I should have been able to figure this one out myself :) |
But sometimes we also have to load other parameters, like the optimizer state, so we end up calling several `load_*` functions. It is not good!
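A sketch of pulling those pieces out of a checkpoint by hand, assuming a checkpoint saved by the Lightning `Trainer` ("state_dict" and "optimizer_states" are the keys Lightning uses; the model class is the illustrative one from above):

```python
import torch

checkpoint = torch.load("path/to.ckpt", map_location="cpu")

# restore the weights
model = Model(input_dim=32)
model.load_state_dict(checkpoint["state_dict"])

# restore the optimizer state (one entry per optimizer)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
optimizer.load_state_dict(checkpoint["optimizer_states"][0])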
❓ Questions and Help
What is your question?
load_from_checkpoint: TypeError: __init__() missing 1 required positional argument

I have read the related issues before, but the difference here is that my `LightningModule` inherits from my own self-defined `LightningModule`. How can I solve this problem, or what best practice better suits my needs?
Code
To reproduce the error:
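A minimal sketch of the setup the question describes, a `LightningModule` inheriting from a self-defined base (names and dimensions are illustrative, not from the original script):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl


class Base(pl.LightningModule):
    """Self-defined base module. Note: no save_hyperparameters() call,
    so 'input_dim' is never written into the checkpoint."""

    def __init__(self, input_dim):
        super().__init__()
        self.layer = nn.Linear(input_dim, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())


class Model(Base):
    """The actual model inherits from the self-defined base."""

    def __init__(self, input_dim):
        super().__init__(input_dim)


if __name__ == "__main__":
    data = DataLoader(
        TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,))),
        batch_size=8,
    )
    trainer = pl.Trainer(max_epochs=2)
    trainer.fit(Model(input_dim=32), data)
    # Reloading fails, because 'input_dim' was never stored:
    #   Model.load_from_checkpoint("lightning_logs/version_0/checkpoints/epoch=1.ckpt")
    #   TypeError: __init__() missing 1 required positional argument: 'input_dim'
```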
Error msg

TypeError: __init__() missing 1 required positional argument
How to run to get the error
$ python3 main.py
$ python3 main.py --checkpoint lightning_logs/version_0/checkpoints/epoch\=1.ckpt
What's your environment?