Commit 3ee3c42

carmocca authored and Borda committed

Prepare 1.1.3 release (#5365)

* Prepare 1.1.3 release
* Fix flake8 error
* suppress
* Remove 1.1.4 section
* Add missing commits to CHANGELOG
* Update PR template
* Add missing commit
* fix
* Update CHANGELOG.md
* Apply suggestions from code review

Co-authored-by: Jirka Borovec <[email protected]>

(cherry picked from commit 4d9db86)

1 parent 21fd56e · commit 3ee3c42

File tree: 5 files changed, +31 −30 lines

.github/PULL_REQUEST_TEMPLATE.md (+8 −8)

```diff
@@ -16,26 +16,26 @@ If we didn't discuss your PR in Github issues there's a high chance it will not
 Fixes # (issue) <- this [links related issue to this PR](https://docs.github.com/en/free-pro-team@latest/github/managing-your-work-on-github/linking-a-pull-request-to-an-issue#linking-a-pull-request-to-an-issue-using-a-keyword)
 
 ## Before submitting
-- [ ] Was this discussed/approved via a Github issue? (no need for typos and docs improvements)
-- [ ] Did you read the [contributor guideline](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/.github/CONTRIBUTING.md), Pull Request section?
+- [ ] Was this discussed/approved via a GitHub issue? (not for typos and docs)
+- [ ] Did you read the [contributor guideline](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/.github/CONTRIBUTING.md), **Pull Request** section?
 - [ ] Did you make sure your PR does only one thing, instead of bundling different changes together?
-- [ ] Did you make sure to update the documentation with your changes [if needed]?
-- [ ] Did you write any new necessary tests [no need for typos, docs]?
+- [ ] Did you make sure to update the documentation with your changes? (if necessary)
+- [ ] Did you write any new necessary tests? (not for typos and docs)
 - [ ] Did you verify new and existing tests pass locally with your changes?
-- [ ] If you made a notable change (that affects users), did you update the [CHANGELOG](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/CHANGELOG.md)?
+- [ ] Did you update the [CHANGELOG](https://github.com/PyTorchLightning/pytorch-lightning/blob/master/CHANGELOG.md)? (not for typos, docs, test updates, or internal minor changes/refactorings)
 
 <!-- For CHANGELOG separate each item in the unreleased section by a blank line to reduce collisions -->
 
 ## PR review
 Anyone in the community is free to review the PR once the tests have passed.
-Before you start reviewing make sure you have read [Review guidelines](https://github.com/PyTorchLightning/pytorch-lightning/wiki/Review-guidelines). In short, see the following bullet-list:
+Before you start reviewing make sure you have read [Review guidelines](https://github.com/PyTorchLightning/pytorch-lightning/wiki/Review-guidelines). In short, see the following bullet-list:
 
 - [ ] Is this pull request ready for review? (if not, please submit in draft mode)
 - [ ] Check that all items from **Before submitting** are resolved
 - [ ] Make sure the title is self-explanatory and the description concisely explains the PR
 - [ ] Add labels and milestones (and optionally projects) to the PR so it can be classified
-- [ ] **Check that target branch and milestone are aligned!**
-
+- [ ] **Check that target branch and milestone match!**
+
 
 ## Did you have fun?
 Make sure you had fun coding 🙃
```

CHANGELOG.md (+13 −16)

```diff
@@ -60,28 +60,25 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 
 ### Added
 
-- Added a check for optimizer attached to lr_scheduler ([#5338](https://github.com/PyTorchLightning/pytorch-lightning/pull/5338))
-
-- Added `resume_from_checkpoint` accept non-existing file path ([#4402](https://github.com/PyTorchLightning/pytorch-lightning/pull/4402))
-
+- Added a check for optimizer attached to `lr_scheduler` ([#5338](https://github.com/PyTorchLightning/pytorch-lightning/pull/5338))
+- Added support for passing non-existing filepaths to `resume_from_checkpoint` ([#4402](https://github.com/PyTorchLightning/pytorch-lightning/pull/4402))
 
 ### Changed
 
-
-### Deprecated
-
-
-### Removed
-
-
-### Fixed
-
-
-- Skip restore from `resume_from_checkpoint` in while `testing` ([#5161](https://github.com/PyTorchLightning/pytorch-lightning/pull/5161))
-
+- Skip restore from `resume_from_checkpoint` while `testing` ([#5161](https://github.com/PyTorchLightning/pytorch-lightning/pull/5161))
 - Allowed `log_momentum` for adaptive optimizers in `LearningRateMonitor` ([#5333](https://github.com/PyTorchLightning/pytorch-lightning/pull/5333))
+- Disabled checkpointing, earlystopping and logging with `fast_dev_run` ([#5277](https://github.com/PyTorchLightning/pytorch-lightning/pull/5277))
+- Distributed group defaults to `WORLD` if `None` ([#5125](https://github.com/PyTorchLightning/pytorch-lightning/pull/5125))
 
-- Disabled checkpointing, earlystopping and logger with `fast_dev_run` ([#5277](https://github.com/PyTorchLightning/pytorch-lightning/pull/5277))
+### Fixed
 
+- Fixed `trainer.test` returning non-test metrics ([#5214](https://github.com/PyTorchLightning/pytorch-lightning/pull/5214))
+- Fixed metric state reset ([#5273](https://github.com/PyTorchLightning/pytorch-lightning/pull/5273))
+- Fixed `--num-nodes` on `DDPSequentialPlugin` ([#5327](https://github.com/PyTorchLightning/pytorch-lightning/pull/5327))
+- Fixed invalid value for `weights_summary` ([#5296](https://github.com/PyTorchLightning/pytorch-lightning/pull/5296))
+- Fixed `Trainer.test` not using the latest `best_model_path` ([#5161](https://github.com/PyTorchLightning/pytorch-lightning/pull/5161))
+- Fixed existence check for hparams not using underlying filesystem ([#5250](https://github.com/PyTorchLightning/pytorch-lightning/pull/5250))
+- Fixed `LightningOptimizer` AMP bug ([#5191](https://github.com/PyTorchLightning/pytorch-lightning/pull/5191))
 - Fixed casted key to string in `_flatten_dict` ([#5354](https://github.com/PyTorchLightning/pytorch-lightning/pull/5354))
```

pl_examples/basic_examples/mnist_datamodule.py (+4 −1)

```diff
@@ -11,7 +11,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-
+import platform
 from typing import Optional
 
 from torch.utils.data import DataLoader, random_split
@@ -55,6 +55,9 @@ def __init__(
             normalize: If true applies image normalize
         """
         super().__init__(*args, **kwargs)
+        if platform.system() == "Windows":
+            # see: https://stackoverflow.com/a/59680818/4521646
+            num_workers = 0
 
         self.dims = (1, 28, 28)
         self.data_dir = data_dir
```
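The Windows guard added above exists because `DataLoader` worker processes are spawned rather than forked on Windows, which can fail when pickling locally-defined datasets; forcing `num_workers = 0` keeps data loading in the main process. The same idea as a standalone sketch (the helper name is ours, not part of the diff):

```python
import platform


def safe_num_workers(requested: int) -> int:
    """Return a DataLoader worker count that is safe on this OS.

    On Windows, worker processes are spawned (not forked), which can fail
    to pickle locally-defined datasets, so fall back to 0 (load data in
    the main process). See https://stackoverflow.com/a/59680818/4521646
    """
    if platform.system() == "Windows":
        return 0
    return requested
```

A `DataLoader` would then be built with `num_workers=safe_num_workers(4)` instead of a hard-coded count.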

pytorch_lightning/plugins/rpc_plugin.py (+3 −2)

```diff
@@ -12,18 +12,19 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 import os
+from contextlib import suppress
 from typing import Optional
 
 import torch
 
 from pytorch_lightning.core.lightning import LightningModule
 from pytorch_lightning.plugins.ddp_plugin import DDPPlugin
-from pytorch_lightning.utilities import _RPC_AVAILABLE, _module_available
+from pytorch_lightning.utilities import _RPC_AVAILABLE
 
 DEFAULT_RPC_TIMEOUT_SEC = 60.
 if _RPC_AVAILABLE:
     from torch.distributed import rpc
-    if _module_available("torch.distributed.rpc.constants") and hasattr(torch.distributed.rpc.constants, "DEFAULT_RPC_TIMEOUT_SEC"):
+    with suppress(ModuleNotFoundError, ImportError):
         from torch.distributed.rpc.constants import DEFAULT_RPC_TIMEOUT_SEC
```
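The replacement relies on `contextlib.suppress`, which silently swallows only the listed exception types raised inside the `with` block — a concise way to attempt an optional import while keeping a fallback value. A minimal sketch of the pattern, using a deliberately non-existent module name to show the fallback path:

```python
from contextlib import suppress

# Fallback value, kept if the optional import below fails.
DEFAULT_RPC_TIMEOUT_SEC = 60.0

# ModuleNotFoundError is a subclass of ImportError; the diff lists both
# explicitly. Any other exception type raised in the block would still
# propagate normally.
with suppress(ModuleNotFoundError, ImportError):
    # "nonexistent_optional_module" is a placeholder: the import fails,
    # suppress() swallows it, and the fallback value above survives.
    from nonexistent_optional_module.constants import DEFAULT_RPC_TIMEOUT_SEC
```

Compared with the old `_module_available(...) and hasattr(...)` check, this avoids probing module attributes and lets the import statement itself decide.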

tests/checkpointing/test_model_checkpoint.py (+3 −3)

```diff
@@ -11,20 +11,20 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from argparse import Namespace
 import os
-from pathlib import Path
 import pickle
 import platform
 import re
+from argparse import Namespace
+from pathlib import Path
 from unittest import mock
 from unittest.mock import Mock
 
 import cloudpickle
-from omegaconf import Container, OmegaConf
 import pytest
 import torch
 import yaml
+from omegaconf import Container, OmegaConf
 
 import pytorch_lightning as pl
 import tests.base.develop_utils as tutils
```
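The reshuffling above follows the common PEP 8 / isort convention: plain `import x` statements first, then `from x import y` statements, alphabetized within each group, with standard-library, third-party, and local imports separated by blank lines. A stdlib-only sketch of the layout (the third-party and local groups are indicated in comments):

```python
# Group 1: standard library — "import x" lines first, then "from x import y",
# each sub-block alphabetized.
import os
import pickle
import platform
import re
from argparse import Namespace
from pathlib import Path
from unittest import mock

# Group 2 (after a blank line): third-party packages, e.g. cloudpickle,
# pytest, torch, yaml, omegaconf in the diff above.
# Group 3: local/project imports, e.g. pytorch_lightning.
```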
