LearnedObjective and PairwiseGP dtype fixes and cleanup #2006
Conversation
@esantorella has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.
Codecov Report
|          | main    | #2006   | +/- |
|----------|---------|---------|-----|
| Coverage | 100.00% | 100.00% |     |
| Files    | 179     | 179     |     |
| Lines    | 15771   | 15782   | +11 |
| Hits     | 15771   | 15782   | +11 |
Many thanks!
botorch/acquisition/objective.py
Outdated
def _get_learned_objective_pref_model_mixed_dtype_warn() -> str:
    return (
        "pref_model has double-precision data, but single-precision data "
        "was passed to the LearnedObjective. Upcasting to double."
    )
Why not make this a constant?
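The suggestion amounts to something like the module-level constant below; the constant name is illustrative and not necessarily what was merged:

```python
# Sketch of the reviewer's suggestion: a module-level constant instead of a
# helper function. The constant name is hypothetical.
LEARNED_OBJECTIVE_PREF_MODEL_MIXED_DTYPE_WARN = (
    "pref_model has double-precision data, but single-precision data "
    "was passed to the LearnedObjective. Upcasting to double."
)
```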
datapoints: Optional[Tensor],
comparisons: Optional[Tensor],
Should we update the docstring with what happens when these are `None`?
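For illustration, a docstring note along these lines could document the `None` case; both the wording and the described behavior (a prior-only model when no data is passed) are assumptions, not the text that was actually merged:

```python
# Illustrative sketch only; PairwiseGPDocSketch is a stand-in, not the real
# botorch.models.pairwise_gp.PairwiseGP.
from typing import Optional

from torch import Tensor


class PairwiseGPDocSketch:
    def __init__(
        self,
        datapoints: Optional[Tensor],
        comparisons: Optional[Tensor],
        **kwargs,
    ) -> None:
        r"""
        Args:
            datapoints: Either `None` or an `n x d` tensor of training features.
                If `None`, the model is constructed without training data
                (assumed: a prior-only model).
            comparisons: Either `None` or an `m x 2` tensor of training
                comparisons; ignored when `datapoints` is `None`.
        """
        self.datapoints = datapoints
        self.comparisons = comparisons
```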
Co-authored-by: Max Balandat <[email protected]>
@esantorella merged this pull request in 626eb41.
Motivation
When `LearnedObjective.forward` is passed single-precision data, but it has a `pref_model` with double-precision data and an input transform, the coefficients of the input transform can be downcast from double-precision to single-precision. To prevent precision loss, and to ensure that the `LearnedObjective` gives the same result when called repeatedly, this PR up-casts the input data to double-precision in this situation (a sketch of the scenario follows the change list below).

Changes:
- dtype fixes in `PairwiseGP`
- cleaned up arguments to `PairwiseGP.__init__` so that they are passed explicitly rather than as part of `kwargs`
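For illustration, here is a minimal reproduction of the scenario described in the Motivation, assuming the fix emits a warning and upcasts the single-precision samples before they reach the double-precision `pref_model`; the model setup, shapes, and values are made up and not taken from the PR:

```python
# Hypothetical repro sketch: a double-precision preference model with an input
# transform, queried through LearnedObjective with single-precision samples.
import torch

from botorch.acquisition.objective import LearnedObjective
from botorch.models.pairwise_gp import PairwiseGP
from botorch.models.transforms.input import Normalize

# Preference model trained in double precision.
datapoints = torch.rand(6, 2, dtype=torch.float64)
comparisons = torch.tensor([[0, 1], [2, 3], [4, 5]], dtype=torch.long)
pref_model = PairwiseGP(datapoints, comparisons, input_transform=Normalize(d=2))

objective = LearnedObjective(pref_model=pref_model)

# Single-precision samples, e.g. coming from an outcome model's posterior.
samples = torch.rand(4, 3, 2, dtype=torch.float32)

# With this PR, the dtype mismatch triggers a warning and the samples are
# upcast to double, so the Normalize coefficients are not silently downcast
# and repeated calls return the same utilities.
utility = objective(samples)
```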
Have you read the Contributing Guidelines on pull requests?
Yes
Test Plan
Added unit tests:
- `LearnedObjective` gives the same result when called multiple times (a sketch of such a test appears below)
- non-functional test cleanup (`assertTrue(x is y)` -> `assertIs(x, y)`)
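A hypothetical sketch of a unit test for the repeated-call behavior (not the test actually added in this PR; it assumes the default sampler reuses its base samples across calls, so two calls on the same input are comparable):

```python
import unittest

import torch

from botorch.acquisition.objective import LearnedObjective
from botorch.models.pairwise_gp import PairwiseGP


class TestLearnedObjectiveRepeatedCalls(unittest.TestCase):
    def test_same_result_when_called_twice(self) -> None:
        # Double-precision preference model.
        datapoints = torch.rand(6, 2, dtype=torch.float64)
        comparisons = torch.tensor([[0, 1], [2, 3], [4, 5]], dtype=torch.long)
        objective = LearnedObjective(
            pref_model=PairwiseGP(datapoints, comparisons)
        )

        # Single-precision samples exercise the upcasting path from this PR.
        samples = torch.rand(4, 3, 2, dtype=torch.float32)
        first = objective(samples)
        second = objective(samples)

        # After the fix, repeated calls should return identical values.
        self.assertTrue(torch.equal(first, second))
```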