1 parent f7d06b6 commit 12d7a22
intermediate_source/torchrec_intro_tutorial.py
@@ -744,7 +744,7 @@ def _wait_impl(self) -> torch.Tensor:
 # ``EmbeddingBagCollection`` to generate a
 # ``ShardedEmbeddingBagCollection`` module. This workflow is fine, but
 # typically when implementing model parallel,
-# `DistributedModelParallel <https://pytorch.org/torchrec/torchrec.distributed.html#torchrec.distributed.model_parallel.DistributedModelParallel>`__
+# `DistributedModelParallel <https://pytorch.org/torchrec/model-parallel-api-reference.html#torchrec.distributed.model_parallel.DistributedModelParallel>`__
 # (DMP) is used as the standard interface. When wrapping your model (in
 # our case ``ebc``), with DMP, the following will occur:
 #