Failed to configure_optimizers from a dictionary without an lr_scheduler field present #1442

Closed · alexeykarnachev opened this issue Apr 10, 2020 · 1 comment · Fixed by #1443
Labels: bug (Something isn't working), help wanted (Open to be worked on)

alexeykarnachev (Contributor) commented Apr 10, 2020

🐛 Bug

The optimizer fails to be configured from a dictionary that has no lr_scheduler field.
Consider this example of a module's configure_optimizers method:

    def configure_optimizers(self):
        config = {
            'optimizer': torch.optim.SGD(params=self.parameters(), lr=1e-03)
        }
        return config

Then we run a simple trainer:

    trainer_options = dict(default_save_path=tmpdir, max_epochs=1)
    trainer = Trainer(**trainer_options)
    _ = trainer.fit(model)

The run then fails with this error:

UnboundLocalError: local variable 'lr_schedulers' referenced before assignment

I believe the reason is that the lr_schedulers local variable is not assigned here:
https://github.com/PyTorchLightning/pytorch-lightning/blob/8dd9b80d7a192117783195a748ddce6c33d556f3/pytorch_lightning/trainer/optimizers.py#L36-L42
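
For illustration, here is a minimal, Lightning-free sketch of the failure pattern (the function name and test values are mine, not Lightning's): lr_schedulers is only assigned on one branch, but is referenced unconditionally afterwards.

    # Sketch only: mimics the branch structure, not Lightning's actual code.
    def parse_optim_conf(optim_conf):
        if "lr_scheduler" in optim_conf:
            lr_schedulers = [optim_conf["lr_scheduler"]]
        # If the key was absent, lr_schedulers was never assigned,
        # so the next line raises UnboundLocalError.
        return [optim_conf["optimizer"]], lr_schedulers, []

    parse_optim_conf({"optimizer": "sgd"})  # raises UnboundLocalError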

I think it could be fixed like this:

        # single dictionary
        elif isinstance(optim_conf, dict):
            optimizer = optim_conf["optimizer"]
            lr_scheduler = optim_conf.get("lr_scheduler", [])
            if lr_scheduler:
                lr_schedulers = self.configure_schedulers([lr_scheduler])
            else:
                lr_schedulers = []
            return [optimizer], lr_schedulers, []
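
Until the fix lands, a possible user-side workaround (a sketch, relying on Lightning's documented support for returning a bare optimizer from configure_optimizers) is to skip the dict entirely:

    def configure_optimizers(self):
        # Workaround sketch: a bare optimizer avoids the dict-parsing branch.
        return torch.optim.SGD(params=self.parameters(), lr=1e-03)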

To Reproduce

Steps to reproduce the behavior:

  1. Create a simple module whose configure_optimizers looks like the one above.
  2. Run the trainer's fit method with the model.
  3. See the error.

Code sample

https://gist.github.com/alexeykarnachev/c61a5b1ca3bf876e19b4547eeb9f42dc
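
For convenience, here is a self-contained inline repro as well (a sketch with an illustrative toy module and dataset; the gist above is the authoritative sample):

    import torch
    from torch.nn import functional as F
    from torch.utils.data import DataLoader, TensorDataset
    import pytorch_lightning as pl

    class ReproModel(pl.LightningModule):
        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(32, 2)

        def forward(self, x):
            return self.layer(x)

        def training_step(self, batch, batch_idx):
            x, y = batch
            return {'loss': F.cross_entropy(self(x), y)}

        def train_dataloader(self):
            x = torch.randn(64, 32)
            y = torch.randint(0, 2, (64,))
            return DataLoader(TensorDataset(x, y), batch_size=8)

        def configure_optimizers(self):
            # Dict without an 'lr_scheduler' key -- triggers the bug.
            return {'optimizer': torch.optim.SGD(self.parameters(), lr=1e-03)}

    trainer = pl.Trainer(max_epochs=1)
    trainer.fit(ReproModel())  # UnboundLocalError on pytorch-lightning 0.7.3rc1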

Expected behavior

I believe that a configuration like {"optimizer": ...} without "lr_scheduler" should be valid, and this error should not occur.

Environment

OS: Linux
architecture: 64bit
processor: x86_64
python: 3.7.6
version: #97~16.04.1-Ubuntu SMP Wed Apr 1 03:03:31 UTC 2020

pytorch-lightning: 0.7.3rc1

williamFalcon (Contributor) commented:

yeah, agreed that dict should work without the scheduler. mind submitting a PR?
