Commit b4a0413

williamFalcon and Borda authored and akarnachev committed
added warnings to unimplemented methods (Lightning-AI#1317)
* added warnings and removed default optimizer
* opt
* Apply suggestions from code review

Co-authored-by: Jirka Borovec <[email protected]>
1 parent 321d9af commit b4a0413

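The change follows a simple pattern: the required LightningModule hooks keep their docstrings, but instead of silently doing nothing (or silently creating a default Adam optimizer), they now emit a warning when the user has not overridden them. A minimal sketch of that pattern, using an illustrative base class rather than the real Lightning code:

    import warnings


    class BaseModule:
        """Illustrative stand-in for a framework base class with required hooks."""

        def training_step(self, batch, batch_idx):
            # Stub: the subclass is expected to override this and return a loss.
            warnings.warn('`training_step` must be implemented to be used with the Trainer')

        def configure_optimizers(self):
            # Stub: no default optimizer is created on the user's behalf any more.
            warnings.warn('`configure_optimizers` must be implemented to be used with the Trainer')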
File tree

3 files changed: +17 -2 lines changed


CHANGELOG.md (+3 -1)
@@ -39,8 +39,10 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 - Added model configuration checking ([#1199](https://github.com/PyTorchLightning/pytorch-lightning/pull/1199))
 - On DP and DDP2 unsqueeze is automated now ([#1319](https://github.com/PyTorchLightning/pytorch-lightning/pull/1319))
 - Does not interfere with a default sampler ([#1318](https://github.com/PyTorchLightning/pytorch-lightning/pull/1318))
+- Remove default Adam optimizer ([#1317](https://github.com/PyTorchLightning/pytorch-lightning/pull/1317))
+- Give warnings for unimplemented required lightning methods ([#1317](https://github.com/PyTorchLightning/pytorch-lightning/pull/1317))
 - Enhanced load_from_checkpoint to also forward params to the model ([#1307](https://github.com/PyTorchLightning/pytorch-lightning/pull/1307))
-- Made `evalaute` method private >> `Trainer._evaluate(...)`. ([#1260](https://github.com/PyTorchLightning/pytorch-lightning/pull/1260))
+- Made `evaluate` method private >> `Trainer._evaluate(...)`. ([#1260](https://github.com/PyTorchLightning/pytorch-lightning/pull/1260))

 ### Deprecated

docs/source/introduction_guide.rst (+11 -1)
@@ -269,7 +269,6 @@ In PyTorch we do it as follows:


 In Lightning we do the same but organize it under the configure_optimizers method.
-If you don't define this, Lightning will automatically use `Adam(self.parameters(), lr=1e-3)`.

 .. code-block:: python

@@ -278,6 +277,17 @@ If you don't define this, Lightning will automatically use `Adam(self.parameters
       def configure_optimizers(self):
         return Adam(self.parameters(), lr=1e-3)

+.. note:: The LightningModule itself has the parameters, so pass in self.parameters()
+
+However, if you have multiple optimizers use the matching parameters
+
+.. code-block:: python
+
+    class LitMNIST(pl.LightningModule):
+
+      def configure_optimizers(self):
+        return Adam(self.generator(), lr=1e-3), Adam(self.discriminator(), lr=1e-3)
+
 Training step
 ^^^^^^^^^^^^^

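The docs snippet above returns two optimizers. In Lightning releases from this period, returning more than one optimizer from configure_optimizers means training_step is called once per optimizer and receives an extra optimizer_idx argument; the sketch below assumes that behaviour, and the generator/discriminator sub-modules and loss helpers are hypothetical placeholders, not code from this commit:

    from torch.optim import Adam
    import pytorch_lightning as pl


    class LitGAN(pl.LightningModule):

        def configure_optimizers(self):
            # One optimizer per sub-module, each over the matching parameters.
            return (Adam(self.generator.parameters(), lr=1e-3),
                    Adam(self.discriminator.parameters(), lr=1e-3))

        def training_step(self, batch, batch_idx, optimizer_idx):
            # Called once per optimizer; optimizer_idx says which one is active.
            if optimizer_idx == 0:
                return {'loss': self.generator_loss(batch)}  # hypothetical helper
            return {'loss': self.discriminator_loss(batch)}  # hypothetical helper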
pytorch_lightning/core/lightning.py (+3)
@@ -224,6 +224,7 @@ def training_step(self, batch, batch_idx, hiddens):
         The presented loss value in progress bar is smooth (average) over last values,
         so it differs from values set in train/validation step.
         """
+        warnings.warn('`training_step` must be implemented to be used with the Lightning Trainer')

     def training_end(self, *args, **kwargs):
         """
@@ -1079,6 +1080,7 @@ def configure_optimizers(self):
                 }

         """
+        warnings.warn('`configure_optimizers` must be implemented to be used with the Lightning Trainer')

     def optimizer_step(
             self,
@@ -1280,6 +1282,7 @@ def train_dataloader(self):
                 return loader

         """
+        warnings.warn('`train_dataloader` must be implemented to be used with the Lightning Trainer')

     def tng_dataloader(self):  # todo: remove in v1.0.0
         """Implement a PyTorch DataLoader.

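Taken together, the three new warnings point at the hooks a LightningModule now has to provide itself: training_step, configure_optimizers, and train_dataloader. Below is a minimal sketch that implements all three so none of the warnings fire; the tiny linear model, dataset path, learning rate, and batch size are illustrative choices, not values from the commit:

    import torch
    from torch.nn import functional as F
    from torch.optim import Adam
    from torch.utils.data import DataLoader
    from torchvision import datasets, transforms
    import pytorch_lightning as pl


    class LitMNIST(pl.LightningModule):

        def __init__(self):
            super().__init__()
            self.layer = torch.nn.Linear(28 * 28, 10)

        def forward(self, x):
            # Flatten the image and apply a single linear layer.
            return self.layer(x.view(x.size(0), -1))

        def training_step(self, batch, batch_idx):
            x, y = batch
            loss = F.cross_entropy(self(x), y)
            return {'loss': loss}

        def configure_optimizers(self):
            # Explicit now: Lightning no longer falls back to a default Adam.
            return Adam(self.parameters(), lr=1e-3)

        def train_dataloader(self):
            dataset = datasets.MNIST('./data', train=True, download=True,
                                     transform=transforms.ToTensor())
            return DataLoader(dataset, batch_size=32)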