🐛 Bug

The optimizer fails to be configured from a dictionary without an `lr_scheduler` field. Consider an example of the Module's `configure_optimizers` method:

```python
def configure_optimizers(self):
    config = {
        'optimizer': torch.optim.SGD(params=self.parameters(), lr=1e-03)
    }
    return config
```
Then we run a simple trainer:

```python
trainer_options = dict(default_save_path=tmpdir, max_epochs=1)
trainer = Trainer(**trainer_options)
_ = trainer.fit(model)
```
And it fails with an error:

```
UnboundLocalError: local variable 'lr_schedulers' referenced before assignment
```
I believe the reason is that the `lr_schedulers` local variable is never assigned here:
https://github.com/PyTorchLightning/pytorch-lightning/blob/8dd9b80d7a192117783195a748ddce6c33d556f3/pytorch_lightning/trainer/optimizers.py#L36-L42
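For context, here is a hypothetical, minimal stand-in for the linked branch (the function and argument names are my own assumptions, not the library's actual signature), just to show the mechanics of how `lr_schedulers` ends up referenced before assignment:

```python
# Hypothetical stand-in for the linked branch; names are assumed.
def init_optimizers(optim_conf, configure_schedulers=lambda scheds: scheds):
    if isinstance(optim_conf, dict):
        optimizer = optim_conf["optimizer"]
        lr_scheduler = optim_conf.get("lr_scheduler", [])
        if lr_scheduler:
            lr_schedulers = configure_schedulers([lr_scheduler])
        # There is no `else` branch, so when the dict has no 'lr_scheduler'
        # key, `lr_schedulers` is never bound and the next line raises
        # UnboundLocalError.
        return [optimizer], lr_schedulers, []

init_optimizers({"optimizer": object()})  # raises UnboundLocalError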
I think it could be fixed like this:

```python
# single dictionary
elif isinstance(optim_conf, dict):
    optimizer = optim_conf["optimizer"]
    lr_scheduler = optim_conf.get("lr_scheduler", [])
    if lr_scheduler:
        lr_schedulers = self.configure_schedulers([lr_scheduler])
    else:
        lr_schedulers = []
    return [optimizer], lr_schedulers, []
```
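With this change, a dict containing only an `optimizer` key would yield an empty scheduler list, which should make a scheduler-less dict behave the same as returning a bare optimizer from `configure_optimizers`.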
To Reproduce

Steps to reproduce the behavior:

1. Define a Module with a `configure_optimizers` method which looks like the one above.
2. Call the Trainer's `fit` method with the model.
Code sample

https://gist.github.com/alexeykarnachev/c61a5b1ca3bf876e19b4547eeb9f42dc
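For convenience, here is a minimal self-contained script that triggers the same error (this is my own construction, not the exact contents of the gist above):

```python
import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset
import pytorch_lightning as pl

class MinimalModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def forward(self, x):
        return self.layer(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return {'loss': F.cross_entropy(self(x), y)}

    def train_dataloader(self):
        # Random toy data, just enough to let fit() start.
        x = torch.randn(64, 32)
        y = torch.randint(0, 2, (64,))
        return DataLoader(TensorDataset(x, y), batch_size=8)

    def configure_optimizers(self):
        # A dict with only an 'optimizer' key: this is what triggers
        # the UnboundLocalError during fit().
        return {'optimizer': torch.optim.SGD(self.parameters(), lr=1e-03)}

trainer = pl.Trainer(max_epochs=1)
trainer.fit(MinimalModule())  # raises UnboundLocalError in init_optimizers
```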
Expected behavior

I suppose that such a configuration, `{"optimizer": ...}` without `"lr_scheduler"`, should be valid, and this error should not occur.
Environment

- OS: Linux
- architecture: 64bit
- processor: x86_64
- python: 3.7.6
- version: #97~16.04.1-Ubuntu SMP Wed Apr 1 03:03:31 UTC 2020
- pytorch-lightning: 0.7.3rc1
yeah, agreed that dict should work without the scheduler. mind submitting a PR?