Misleading exception raised during batch scaling #1973
Conversation
Use batch_size from `model.hparams.batch_size` instead of `model.batch_size`
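For context, the pattern this PR targets can be sketched as follows (a minimal, illustrative stand-in; `MyModel` is not real Lightning code):

```python
from argparse import Namespace

class MyModel:  # stands in for a LightningModule; illustrative only
    def __init__(self):
        # batch_size is registered under hparams, not directly on the model
        self.hparams = Namespace(batch_size=32)

model = MyModel()
hasattr(model, "batch_size")           # False -> old code raised a misleading error
getattr(model.hparams, "batch_size")   # 32   -> the lookup this PR switches to
```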
@tejasvi mind updating it to reflect the new structure of arguments?
Hello @tejasvi! Thanks for updating this PR.
Comment last updated at 2020-06-08 16:17:27 UTC
What about something like:

```python
splitted_batch_arg_name = batch_arg_name.split('.')
contains_arg = model
for name in splitted_batch_arg_name[:-1]:
    contains_arg = getattr(contains_arg, name)

batch_size = getattr(contains_arg, splitted_batch_arg_name[-1])
if value:
    setattr(contains_arg, splitted_batch_arg_name[-1], value)
```

This way one could specify the name either as a plain attribute (e.g. `batch_size`) or as a dotted path (e.g. `hparams.batch_size`). The issue is that arguments are not always attached to the model, and hparams can also be stored under other names.
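A self-contained version of that suggestion might look as follows; `resolve_batch_arg` is a hypothetical name, and the `new_value` parameter stands in for the undefined `value` in the snippet above:

```python
def resolve_batch_arg(model, batch_arg_name, new_value=None):
    """Read a possibly dotted attribute such as 'hparams.batch_size'.

    Returns the current value; if `new_value` is given, overwrites it afterwards.
    """
    parts = batch_arg_name.split('.')
    holder = model
    for name in parts[:-1]:            # walk down nested objects, e.g. model.hparams
        holder = getattr(holder, name)
    batch_size = getattr(holder, parts[-1])
    if new_value is not None:
        setattr(holder, parts[-1], new_value)
    return batch_size
```

For example, `resolve_batch_arg(model, 'hparams.batch_size', new_value=64)` would return the current `model.hparams.batch_size` and then set it to 64.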
We had a similar problem when the learning rate finder was implemented, namely that some users like to store their learning rate in something like …
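A sketch of the dual lookup that situation calls for (an illustrative helper, not the actual learning rate finder code): check the model first, then fall back to `hparams`:

```python
def find_field(model, name):
    # Try the attribute directly on the model first ...
    if hasattr(model, name):
        return getattr(model, name)
    # ... then fall back to model.hparams, where many users keep such values.
    hparams = getattr(model, "hparams", None)
    if hparams is not None and hasattr(hparams, name):
        return getattr(hparams, name)
    raise AttributeError(f"`{name}` is neither a model attribute nor in `model.hparams`")
```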
```python
if hasattr(model, batch_arg_name):
    batch_size = getattr(model, batch_arg_name)
```
we are not registering it in the model anymore...
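Following that note, the checked location moves from the model itself to its hparams; a minimal sketch of the adjusted lines (not necessarily the exact merged diff):

```python
if hasattr(model.hparams, batch_arg_name):
    batch_size = getattr(model.hparams, batch_arg_name)
```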
Co-authored-by: Jirka Borovec <[email protected]>
This reverts commit f8103f9.
sorry, merged by accident. mind re-opening?
once merged, it cannot be reopened; it is already in master
@tejasvi mind opening a new PR?