DDP breaks LR finder #1831
I also face a similar issue with the TensorBoard logger whenever the logger flag is left at its default, on both GPU and TPU Colab runtimes. On the TPU runtime it throws the following exception:
Similarly, on the GPU runtime it throws an exception saying
🐛 Bug
DDP breaks LR finder
To Reproduce
At first I thought it was because `configure_optimizers` returns `[opt], [sched]`, but returning just `opt` still causes the error. Training works correctly with the same code.
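For context, a minimal sketch of the two `configure_optimizers` return forms the report contrasts. Plain-Python stubs stand in for real `torch.optim` optimizers and schedulers (the class and function names here are illustrative, not from the original issue); Lightning accepts both a single optimizer and a pair of lists `([optimizers], [schedulers])`:

```python
class Optimizer:
    """Stand-in for e.g. torch.optim.Adam."""
    pass


class Scheduler:
    """Stand-in for e.g. torch.optim.lr_scheduler.StepLR."""
    pass


def configure_optimizers_with_scheduler():
    # Form reported first: a pair of lists, ([optimizers], [schedulers])
    opt = Optimizer()
    sched = Scheduler()
    return [opt], [sched]


def configure_optimizers_plain():
    # Simpler form also tried in the report: a single optimizer
    return Optimizer()


opts, scheds = configure_optimizers_with_scheduler()
print(type(opts[0]).__name__, type(scheds[0]).__name__)
print(type(configure_optimizers_plain()).__name__)
```

Per the report, the LR-finder error occurs under DDP with either return form, so the return shape is not the cause.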