SummaryWriter add_hparams should support adding new hyperparameters (#39250)
Comments
I am experiencing this exact bug. Any news on this?

The bug still persists today (tensorboard 2.5.0).

The issue still persists and is quite annoying when comparing multiple runs with different hyperparameter setups. Is there any plan to fix this, or is there a known workaround?

I believe the issue has been fixed. In the HParams dashboard, scroll down the left panel and check the appropriate boxes; it defaults to a smaller subset of columns when the parameter sets change.

This still persists for me as well.
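Until the dashboard handles key sets that grow across calls, one workaround is to pad every run's hyperparameter dict with the union of keys before logging, so the very first add_hparams call already contains every column. A minimal sketch, assuming plain dicts per run (the pad_hparams helper is hypothetical, not part of any library):

```python
def pad_hparams(runs, fill=""):
    """Fill each run's hparam dict with the union of keys across runs.

    Missing keys get `fill`, so every add_hparams call sees the same
    key set and the dashboard shows every column.
    """
    all_keys = set().union(*(r.keys() for r in runs))
    return [{k: r.get(k, fill) for k in sorted(all_keys)} for r in runs]

runs = [{"key_A": 1.0}, {"key_A": 2.0, "key_B": 3.0}]
padded = pad_hparams(runs)
print(padded[0])  # key_B is now present with the fill value
```

This requires knowing all runs (or at least all key names) up front, which is the main limitation of this approach.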
🐛 Bug
When calling SummaryWriter().add_hparams with new hyperparameters, keys that do not exist in the first call do not appear in the hyperparameter dashboard output.

To Reproduce
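The original reproduction snippet is not preserved in this page; a minimal sketch of the pattern the report describes might look like the following (the metric name and values are hypothetical; key_A and key_B are from the report):

```python
import tempfile

from torch.utils.tensorboard import SummaryWriter

with tempfile.TemporaryDirectory() as logdir:
    writer = SummaryWriter(log_dir=logdir)
    # First call: only key_A exists.
    writer.add_hparams({"key_A": 1.0}, {"metric": 0.5})
    # Second call introduces key_B, which then never shows up
    # as a column in the HParams dashboard.
    writer.add_hparams({"key_A": 2.0, "key_B": 3.0}, {"metric": 0.7})
    writer.close()
```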
When viewing the TensorBoard summary writer output at http://localhost:6006/#hparams:

Expected behavior
I would expect key_B to also appear in the output, with a blank value for the first row.

Environment
PyTorch version: 1.4.0
Is debug build: No
CUDA used to build PyTorch: 10.1
OS: Debian GNU/Linux 10 (buster)
GCC version: (Debian 8.3.0-6) 8.3.0
CMake version: Could not collect
Python version: 3.7
Is CUDA available: No
CUDA runtime version: No CUDA
GPU models and configuration: No CUDA
Nvidia driver version: No CUDA
cuDNN version: No CUDA
Versions of relevant libraries:
[pip3] numpy==1.17.4
[pip3] torch==1.4.0
[pip3] torchvision==0.5.0
[conda] Could not collect