Commit a425e63

SiddhantRanade and mergify[bot] authored and atee committed
Fix enforce_datamodule_dataloader_override() for iterable datasets (Lightning-AI#2957)
This function contains the statement `if (train_dataloader or val_dataloaders) and datamodule:`. The issue is the same as in Lightning-AI#1560: `if dl` translates to `if bool(dl)`, and since DataLoader defines no `__bool__`, `bool()` falls back to `len(dl) > 0`. But `DataLoader.__len__` in turn relies on the dataset's `__len__`, which is undefined for an `IterableDataset`. The fix is also the same: replace `if dl` with `if dl is not None`.

Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>
1 parent d0e04dc commit a425e63
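
A minimal standalone sketch of the failure mode described in the commit message (not part of the commit; `Stream` is a hypothetical dataset used only for illustration):

from torch.utils.data import DataLoader, IterableDataset

class Stream(IterableDataset):
    """Hypothetical iterable dataset with no __len__ (e.g. an unbounded stream)."""
    def __iter__(self):
        yield from range(3)

dl = DataLoader(Stream(), batch_size=1)

# `if dl:` means `if bool(dl):`. DataLoader defines no __bool__, so Python falls
# back to __len__, and DataLoader.__len__ asks the underlying dataset for its
# length, which Stream does not define, so a TypeError is raised.
try:
    if dl:
        print("dataloader is truthy")
except TypeError as err:
    print(f"truthiness check failed: {err}")

# Comparing against None never touches __len__, so it is safe for any dataset.
if dl is not None:
    print("a dataloader was passed")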

File tree

1 file changed (+1 -1)

pytorch_lightning/trainer/configuration_validator.py

+1 -1
@@ -10,7 +10,7 @@ def __init__(self, trainer):
 
     def enforce_datamodule_dataloader_override(self, train_dataloader, val_dataloaders, datamodule):
         # If you supply a datamodule you can't supply train_dataloader or val_dataloaders
-        if (train_dataloader or val_dataloaders) and datamodule:
+        if (train_dataloader is not None or val_dataloaders is not None) and datamodule is not None:
             raise MisconfigurationException(
                 'You cannot pass train_dataloader or val_dataloaders to trainer.fit if you supply a datamodule'
             )
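
For context, a self-contained sketch of how the corrected guard behaves (not the Lightning source; `MisconfigurationException` is stubbed here and `Stream` is again a hypothetical dataset):

from torch.utils.data import DataLoader, IterableDataset

class MisconfigurationException(Exception):
    """Stub for pytorch_lightning.utilities.exceptions.MisconfigurationException."""

class Stream(IterableDataset):
    """Hypothetical length-less dataset."""
    def __iter__(self):
        yield from range(3)

def enforce_datamodule_dataloader_override(train_dataloader, val_dataloaders, datamodule):
    # The None comparisons never call DataLoader.__len__, so the guard works even
    # when train_dataloader wraps an IterableDataset that has no __len__.
    if (train_dataloader is not None or val_dataloaders is not None) and datamodule is not None:
        raise MisconfigurationException(
            'You cannot pass train_dataloader or val_dataloaders to trainer.fit if you supply a datamodule'
        )

try:
    enforce_datamodule_dataloader_override(DataLoader(Stream()), None, object())
except MisconfigurationException as err:
    print(f"raised as intended: {err}")

With the old truthiness check, the call above would have failed with a TypeError from DataLoader.__len__ instead of raising the intended error.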

0 commit comments