LearnedObjective and PairwiseGP dtype fixes and cleanup #2006

Closed
wants to merge 7 commits

Conversation

esantorella (Member) commented:

Motivation

When LearnedObjective.forward is passed single-precision data but its pref_model has double-precision data and an input transform, the coefficients of the input transform can be downcast from double to single precision. To prevent this precision loss, and to ensure that the LearnedObjective gives the same result when called repeatedly, this PR upcasts the input data to double precision in that situation.
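
Below is a minimal, hypothetical sketch of the scenario described above; the shapes, data, and the Normalize input transform are illustrative assumptions rather than code from this PR.

import torch
from botorch.acquisition.objective import LearnedObjective
from botorch.models.pairwise_gp import PairwiseGP
from botorch.models.transforms.input import Normalize

# Preference model with double-precision training data and an input transform,
# so the transform's coefficients are stored in double precision.
Y = torch.rand(6, 2, dtype=torch.float64)       # outcomes shown to raters
comps = torch.tensor([[0, 1], [2, 3], [4, 5]])  # comparisons[i, 0] preferred over comparisons[i, 1]
pref_model = PairwiseGP(Y, comps, input_transform=Normalize(d=2))

objective = LearnedObjective(pref_model=pref_model)

# Single-precision outcome samples, e.g. from a single-precision outcome model.
# Previously, pushing these through the pref_model's input transform could
# downcast the transform's double-precision coefficients; with this PR the
# samples are instead upcast to double precision (and a warning is raised).
samples = torch.rand(4, 3, 2, dtype=torch.float32)
utility = objective(samples)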

Changes:

  • Cast single-precision data to double in the situation described above
  • Added a single-precision warning for PairwiseGP
  • Cleaned up the arguments to PairwiseGP.__init__ so that they are passed explicitly rather than as part of kwargs

Have you read the Contributing Guidelines on pull requests?

Yes

Test Plan

Added unit tests:

  • Tests for the new warnings (and silenced warnings where single-precision data is intentionally passed for testing)
  • A test that LearnedObjective gives the same result when called multiple times (see the sketch after this list)
  • Minor clean-up in unit tests (e.g. assertTrue(x is y) -> assertIs(x, y))
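
A minimal sketch of what such a test might look like (not the PR's actual test code; the warning category InputDataWarning and the determinism of the default sampler are assumptions here):

import unittest
import warnings

import torch
from botorch.acquisition.objective import LearnedObjective
from botorch.exceptions.warnings import InputDataWarning
from botorch.models.pairwise_gp import PairwiseGP


class TestLearnedObjectiveConsistency(unittest.TestCase):
    def test_repeated_calls_give_same_result(self) -> None:
        Y = torch.rand(6, 2, dtype=torch.float64)
        comps = torch.tensor([[0, 1], [2, 3], [4, 5]])
        objective = LearnedObjective(pref_model=PairwiseGP(Y, comps))
        samples = torch.rand(4, 3, 2, dtype=torch.float32)
        with warnings.catch_warnings():
            # Single-precision data is passed on purpose; silence the dtype warning.
            warnings.simplefilter("ignore", category=InputDataWarning)
            first = objective(samples)
            second = objective(samples)
        self.assertTrue(torch.allclose(first, second))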

@facebook-github-bot added the CLA Signed label on Sep 4, 2023
@facebook-github-bot (Contributor) commented:

@esantorella has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

codecov bot commented Sep 4, 2023

Codecov Report

Merging #2006 (1211a3e) into main (07eda3d) will not change coverage.
The diff coverage is 100.00%.

❗ Current head 1211a3e differs from pull request most recent head 7eab772. Consider uploading reports for the commit 7eab772 to get more accurate results

@@            Coverage Diff            @@
##              main     #2006   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files          179       179           
  Lines        15771     15782   +11     
=========================================
+ Hits         15771     15782   +11     
Files Changed                        Coverage Δ
botorch/models/utils/assorted.py     100.00% <ø> (ø)
botorch/acquisition/objective.py     100.00% <100.00%> (ø)
botorch/exceptions/warnings.py       100.00% <100.00%> (ø)
botorch/models/gpytorch.py           100.00% <100.00%> (ø)
botorch/models/model.py              100.00% <100.00%> (ø)
botorch/models/pairwise_gp.py        100.00% <100.00%> (ø)


@Balandat (Contributor) left a comment:

Many thanks!

Comment on lines 526 to 530
def _get_learned_objective_pref_model_mixed_dtype_warn() -> str:
return (
"pref_model has double-precision data, but single-precision data "
"was passed to the LearnedObjective. Upcasting to double."
)

Why not make this a constant?
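
For reference, the suggestion would look roughly like replacing the helper function with a module-level constant (the name below is illustrative, not from the PR):

# Module-level constant instead of a helper function (name is illustrative).
LEARNED_OBJECTIVE_PREF_MODEL_MIXED_DTYPE_WARN = (
    "pref_model has double-precision data, but single-precision data "
    "was passed to the LearnedObjective. Upcasting to double."
)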

Comment on lines +172 to +173
datapoints: Optional[Tensor],
comparisons: Optional[Tensor],

Should we update the docstring with what happens when these are None?
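
A possible wording, assuming (as the Optional type hints suggest) that None means the model is constructed without training data; this is a sketch, not the docstring actually added in the PR:

# datapoints: Either `None` or a `batch_shape x n x d` tensor of training features.
#     If `None`, the model is instantiated without training data.
# comparisons: Either `None` or a `batch_shape x m x 2` tensor of training comparisons;
#     comparisons[i] is a noisy indicator that the utility of datapoints[comparisons[i, 0]]
#     exceeds that of datapoints[comparisons[i, 1]]. Should be `None` if and only if
#     `datapoints` is `None`.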

@facebook-github-bot (Contributor) commented:

@esantorella merged this pull request in 626eb41.
