Commit 12d7a22

docs(torchrec_intro_tutorial): current links
1 parent f7d06b6 commit 12d7a22

File tree: 1 file changed (+1, −1 lines changed)


intermediate_source/torchrec_intro_tutorial.py (+1, −1)
@@ -744,7 +744,7 @@ def _wait_impl(self) -> torch.Tensor:
 # ``EmbeddingBagCollection`` to generate a
 # ``ShardedEmbeddingBagCollection`` module. This workflow is fine, but
 # typically when implementing model parallel,
-# `DistributedModelParallel <https://pytorch.org/torchrec/torchrec.distributed.html#torchrec.distributed.model_parallel.DistributedModelParallel>`__
+# `DistributedModelParallel <https://pytorch.org/torchrec/model-parallel-api-reference.html#torchrec.distributed.model_parallel.DistributedModelParallel>`__
 # (DMP) is used as the standard interface. When wrapping your model (in
 # our case ``ebc``), with DMP, the following will occur:
 #

0 commit comments