🐛 Bug

Connected to #1495. When using `trainer.tune()`, only the `model` is forwarded, but not the `datamodule`; cf.: https://github.com/PyTorchLightning/pytorch-lightning/blob/eafec7d425bf691ab5bfaf2794c3e581487ecfa8/pytorch_lightning/tuner/tuning.py#L64

Expected behavior

Just like for `self.trainer.auto_scale_batch_size`, the datamodule or train/val dataloaders should be forwarded as well, i.e.:

```python
self.lr_find(model, train_dataloader=train_dataloader, val_dataloaders=val_dataloaders, datamodule=datamodule, ...)
```
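The forwarding behavior requested above can be sketched in plain Python. This is a minimal mock, not PyTorch Lightning's actual implementation: the `Tuner` class, its `received` attribute, and the string stand-ins for the model and datamodule are all made up for illustration; only the parameter names mirror the snippet above.

```python
# Minimal mock of the requested forwarding pattern: tune() should pass the
# datamodule and train/val dataloaders through to lr_find(), not just the
# model. This is NOT PyTorch Lightning code, just an illustrative sketch.

class Tuner:
    def __init__(self):
        self.received = None

    def lr_find(self, model, train_dataloader=None, val_dataloaders=None,
                datamodule=None):
        # Record what was forwarded so the behavior can be inspected.
        self.received = {
            "model": model,
            "train_dataloader": train_dataloader,
            "val_dataloaders": val_dataloaders,
            "datamodule": datamodule,
        }

    def tune(self, model, train_dataloader=None, val_dataloaders=None,
             datamodule=None):
        # Buggy behavior would be `self.lr_find(model)`, silently dropping
        # the datamodule. The expected behavior forwards everything:
        self.lr_find(
            model,
            train_dataloader=train_dataloader,
            val_dataloaders=val_dataloaders,
            datamodule=datamodule,
        )


tuner = Tuner()
tuner.tune("my_model", datamodule="my_datamodule")
print(tuner.received["datamodule"])  # my_datamodule
```

With the buggy single-argument call, `tuner.received["datamodule"]` would be `None`; forwarding all keyword arguments is the one-line fix the issue asks for.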
Fixed by #6784