Commit da6ee3a

Balandat authored and facebook-github-bot committed
Detach tensor in gen_candidates_scipy to avoid test failure due to new warning
Summary: pytorch/pytorch#143261 added a new warning to PyTorch that is raised when a tensor with gradients is converted to a scalar without first detaching it. This caused some test failures in botorch (which are likely a bit overzealous, but we can fix that separately). This change updates `gen_candidates_scipy` to fix the failures. There are most likely other occurrences of this in botorch that we should fix, but here I'm quickly addressing the test failures first.

Reviewed By: saitcakmak

Differential Revision: D72286925

fbshipit-source-id: f1ddf2dc841d94c8b69589b0051cfc87f384be29
1 parent a44b2aa commit da6ee3a

File tree

1 file changed: +2 −2 lines changed

botorch/generation/gen.py (+2 −2)

@@ -205,7 +205,7 @@ def f_np_wrapper(x: npt.NDArray, f: Callable):
             if initial_conditions.dtype != torch.double:
                 msg += " Consider using `dtype=torch.double`."
             raise OptimizationGradientError(msg, current_x=x)
-        fval = loss.item()
+        fval = loss.detach().item()
         return fval, gradf

     else:
@@ -215,7 +215,7 @@ def f_np_wrapper(x: npt.NDArray, f: Callable):
             with torch.no_grad():
                 X_fix = fix_features(X=X, fixed_features=fixed_features)
                 loss = f(X_fix).sum()
-        fval = loss.item()
+        fval = loss.detach().item()
         return fval

     if nonlinear_inequality_constraints:
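The change above can be illustrated in isolation. The following is a minimal sketch (not botorch code) of why `detach()` is called before `.item()`: the loss tensor still carries autograd history, and newer PyTorch versions warn when such a tensor is converted to a Python scalar directly.

```python
import torch

# A scalar loss that still carries autograd history.
x = torch.tensor([1.0, 2.0], requires_grad=True)
loss = (x ** 2).sum()
loss.backward()

# Detaching first severs the autograd graph, so the scalar
# conversion no longer triggers the warning added in
# pytorch/pytorch#143261.
fval = loss.detach().item()
print(fval)  # 5.0
```

The gradient computation is unaffected: `detach()` returns a view that shares storage with `loss`, so `fval` holds the same value as before the change.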
