Conversation
Thanks for contributing this @jordannad. Two general questions regarding the model as I looked into PyTorch implementations:
As part of ensuring the training test is reliable, I have matched the initialization of the reference implementation for the embedding layers. Additionally, the DLRM model is susceptible to a "bad initialization" that fails to perfectly memorize the single test minibatch. Although this is infrequent (roughly 1 out of 50 test runs), I have modified the test to randomly re-initialize up to 5 times, which reduces the probability of a flaky failure to approximately 0.02^5 ≈ 3.2e-9 while still preserving what the test covers (e.g. random initialization). Finally, instead of checking that the loss drops below a particular threshold, the test checks that the accuracy is 100%. This gives an earlier stopping condition, so the convergence test often runs in under 300 ms on a laptop.
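The re-initialization strategy can be sketched as follows; `train(seed:)` is a hypothetical stub standing in for the real test, which trains DLRM on a single minibatch and returns the resulting accuracy:

```swift
// Sketch of the re-initialization strategy described above. train(seed:)
// is a hypothetical stub; the real test trains DLRM on one minibatch
// and returns the resulting accuracy.
func train(seed: Int) -> Double {
    // Pretend seed 0 hits a "bad initialization" and fails to memorize.
    return seed == 0 ? 0.75 : 1.0
}

let maxAttempts = 5
var converged = false
for attempt in 0..<maxAttempts {
    // Check for 100% accuracy rather than a loss threshold: it is an
    // earlier, unambiguous stopping condition.
    if train(seed: attempt) == 1.0 {
        converged = true
        break
    }
}
// If each attempt independently fails with probability ~0.02 (1 in 50),
// all 5 attempts failing has probability 0.02^5 = 3.2e-9.
print(converged)  // true
```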
Please take a look! I believe I've addressed the comments. Thank you!
Models/Recommendation/DLRM.swift
    sparseInput: [Tensor<Int32>]
) -> Tensor<Float> {
    precondition(denseInput.shape.last! == nDense)
    assert(sparseInput.count == latentFactors.count)
Use `precondition` here as well?
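For context on this suggestion: `assert` is checked only in debug builds and is compiled out under `-O`, while `precondition` stays active in release builds, so shape checks guarding a layer's inputs usually use `precondition`. A minimal illustration (the function below is hypothetical, not from DLRM.swift):

```swift
// precondition stays active in release (-O) builds; assert is debug-only
// and is stripped when compiled with optimizations.
func scale(_ values: [Double], by factors: [Double]) -> [Double] {
    // Always enforced: callers get a clear trap on a count mismatch.
    precondition(values.count == factors.count, "count mismatch")
    // Debug-only sanity check; disappears under -O.
    assert(values.allSatisfy { $0.isFinite }, "non-finite input")
    return zip(values, factors).map { $0.0 * $0.1 }
}

print(scale([1.0, 2.0], by: [10.0, 0.5]))  // [10.0, 1.0]
```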
One minor comment. Rest LGTM!