
Commit 5651a25

amoudgl authored and tullie committed

Backward compatibility for checkpoint loading (Lightning-AI#1132)

* check if hparams_type exists in checkpoint dictionary for backward compatibility
* concisely maintain backward compatibility for hparams type
* Bug fix in checkpoint loading (Lightning-AI#1132)

1 parent c35a6bf · commit 5651a25

File tree

2 files changed: +2 −1 lines changed

CHANGELOG.md (+1)

```diff
@@ -29,6 +29,7 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 ### Fixed
 
 - Fixed bug related to type checking of `ReduceLROnPlateau` lr schedulers ([#1114](https://github.com/PyTorchLightning/pytorch-lightning/issues/1114))
+- Fixed a bug to ensure lightning checkpoints to be backward compatible ([#1132](https://github.com/PyTorchLightning/pytorch-lightning/pull/1132))
 
 ## [0.7.1] - 2020-03-07
 
```

pytorch_lightning/core/lightning.py (+1 −1)

```diff
@@ -1396,7 +1396,7 @@ def _load_model_state(cls, checkpoint: Dict[str, Any]) -> 'LightningModule':
 
         if cls_takes_hparams:
             if ckpt_hparams is not None:
-                is_namespace = checkpoint.get('hparams_type') == 'namespace'
+                is_namespace = checkpoint.get('hparams_type', 'namespace') == 'namespace'
                 hparams = Namespace(**ckpt_hparams) if is_namespace else ckpt_hparams
             else:
                 warnings.warn(
```
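The one-line change works because older Lightning checkpoints were saved before the `hparams_type` key existed, so `checkpoint.get('hparams_type')` returned `None` and old checkpoints were wrongly treated as non-namespace. Supplying `'namespace'` as the `dict.get` default restores the original behavior for those checkpoints. A minimal standalone sketch of the fixed lookup (the `load_hparams` helper and the checkpoint dicts are illustrative, not part of the Lightning API):

```python
from argparse import Namespace

def load_hparams(checkpoint):
    """Illustrative sketch of the fixed hparams lookup."""
    ckpt_hparams = checkpoint.get('hparams')
    if ckpt_hparams is None:
        return None
    # Defaulting to 'namespace' keeps pre-`hparams_type` checkpoints loadable:
    # a missing key now means "wrap in Namespace", as old versions always did.
    is_namespace = checkpoint.get('hparams_type', 'namespace') == 'namespace'
    return Namespace(**ckpt_hparams) if is_namespace else ckpt_hparams

# Hypothetical checkpoints for illustration:
old_ckpt = {'hparams': {'lr': 0.02}}                           # no 'hparams_type' key
new_ckpt = {'hparams': {'lr': 0.02}, 'hparams_type': 'dict'}   # key written by newer code

print(load_hparams(old_ckpt))   # Namespace(lr=0.02)
print(load_hparams(new_ckpt))   # {'lr': 0.02}
```

With the previous code (`checkpoint.get('hparams_type') == 'namespace'`), `old_ckpt` would evaluate to `False` and its hparams dict would be passed through unwrapped, breaking models that expect attribute access (`hparams.lr`).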

0 comments