Commit 87f5e56

add docs
1 parent 35a3fd2 commit 87f5e56

1 file changed (+7 -2 lines)

pytorch_lightning/core/hooks.py

@@ -309,8 +309,13 @@ def transfer_batch_to_device(self, batch, device)
         Note:
             This hook should only transfer the data and not modify it, nor should it move the data to
             any other device than the one passed in as argument (unless you know what you are doing).
-            The :class:`~pytorch_lightning.trainer.trainer.Trainer` already takes care of splitting the
-            batch and determines the target devices.
+
+        Note:
+            This hook only runs on single GPU training (no data-parallel). If you need multi-GPU support
+            for your custom batch objects, you need to define your custom
+            :class:`~torch.nn.parallel.DistributedDataParallel` or
+            :class:`~pytorch_lightning.overrides.data_parallel.LightningDistributedDataParallel` and
+            override :meth:`~pytorch_lightning.core.lightning.LightningModule.configure_ddp`.

         See Also:
             - :func:`~pytorch_lightning.utilities.apply_func.move_data_to_device`
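For context, a minimal sketch, not part of this commit, of how the documented hook might be overridden for a custom batch object. `CustomBatch`, its field names, and `MyModule` are illustrative assumptions; the fallback call relies on the base-class implementation this docstring belongs to.

    import pytorch_lightning as pl


    class CustomBatch:
        # A non-tensor batch container that the Trainer cannot move on its own.
        def __init__(self, samples, targets):
            self.samples = samples
            self.targets = targets


    class MyModule(pl.LightningModule):
        # forward/training_step/configure_optimizers omitted for brevity.

        def transfer_batch_to_device(self, batch, device):
            # Per the docstring: only transfer the data, do not modify it,
            # and do not move it to any device other than the one passed in.
            if isinstance(batch, CustomBatch):
                batch.samples = batch.samples.to(device)
                batch.targets = batch.targets.to(device)
                return batch
            # Fall back to the default handling for ordinary batches.
            return super().transfer_batch_to_device(batch, device)

And, following the new note, multi-GPU support for such batches would go through a custom wrapper plus an overridden `configure_ddp`. The `configure_ddp(self, model, device_ids)` signature below is an assumption based on the Lightning API of this era:

    from pytorch_lightning.overrides.data_parallel import LightningDistributedDataParallel


    class MyDDPModule(MyModule):
        def configure_ddp(self, model, device_ids):
            # A custom subclass of LightningDistributedDataParallel that knows
            # how to scatter CustomBatch would be returned here; the plain
            # wrapper is shown only to illustrate the hook's shape.
            return LightningDistributedDataParallel(model, device_ids=device_ids)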
