[GENERAL SUPPORT]: Implementation of Evolution-guided BO #3198
Comments
If all you need access to is the training data (Xs, Ys), you should be able to get this from Surrogate.training_data. You could override Acquisition.__init__ in an Acquisition similar to SEBO and then store the training data on the Acquisition object, so that you can access it during optimize.
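A minimal sketch of that suggestion, assuming Ax's modular BoTorch layout (ax.models.torch.botorch_modular). The exact Acquisition constructor signature varies across Ax versions, so the keyword passthrough and the EvolutionGuidedAcquisition name below are illustrative, not a drop-in implementation:

```python
import torch
from ax.models.torch.botorch_modular.acquisition import Acquisition


class EvolutionGuidedAcquisition(Acquisition):
    """Hypothetical subclass that keeps the training data around for optimize()."""

    def __init__(self, surrogate, **kwargs):
        super().__init__(surrogate=surrogate, **kwargs)
        # Surrogate.training_data is a list of BoTorch SupervisedDataset
        # objects (typically one per outcome). In the common case all
        # outcomes share the same inputs, so take X from the first dataset
        # and stack the per-outcome Ys column-wise.
        datasets = surrogate.training_data
        self._X_observed = datasets[0].X
        self._Y_observed = torch.cat([ds.Y for ds in datasets], dim=-1)

    def optimize(self, *args, **kwargs):
        # self._X_observed / self._Y_observed are now available here, e.g.
        # to seed an evolutionary population (NSGA-II offspring) alongside
        # the usual acquisition optimization.
        return super().optimize(*args, **kwargs)
```

Such a class can then presumably be passed to BoTorchModel via its acquisition_class argument, the same way SEBOAcquisition is wired up, but double-check against your Ax version.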
Thank you, I'll test this!
I'm reopening this issue for a while just to share what I implemented, in case it is of interest to anyone using Ax. I tried to mimic the Ax format used in SEBO, so I hope it is readable enough for people.
Question
I was thinking of implementing Evolution-guided BO (EGBO) as described in this paper, and I thought it would make sense to write something with a structure similar to the SEBOAcquisition class and with an .optimize similar to the EGBO implemented in this repo. To do so, I need to pull back the untransformed X_observed and the corresponding metrics.
I went down the rabbit hole and explored the attributes of many of the objects passed to SEBOAcquisition, but could not find what I needed. Any ideas? I know I could use the original code here, but I was hoping to write something that is more 'Ax-like'.
For context, I reproduce below the code from the repo implementing EGBO.
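(The repo's code is not reproduced here; as a rough stand-in, below is a generic sketch of the evolutionary candidate-generation step that EGBO-style approaches use, built on pymoo's NSGA-II. The function name, the bounds convention, and the objective callable f are assumptions, not the repo's implementation.)

```python
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.core.problem import Problem
from pymoo.optimize import minimize


class _SurrogateProblem(Problem):
    """Wraps a vectorized objective callable f(X) -> (n, n_obj) for pymoo.

    pymoo minimizes by convention, so objectives that should be maximized
    must already be negated inside f.
    """

    def __init__(self, f, n_var, n_obj, xl, xu):
        super().__init__(n_var=n_var, n_obj=n_obj, xl=xl, xu=xu)
        self._f = f

    def _evaluate(self, x, out, *args, **kwargs):
        out["F"] = self._f(x)


def evolve_candidates(f, X_observed, bounds, n_obj, pop_size=50, n_gen=20):
    """Run NSGA-II seeded at the observed designs; return the final front.

    f          : callable mapping an (n, d) array to an (n, n_obj) array of
                 objective values to minimize (e.g. surrogate posterior means).
    X_observed : (n, d) array of observed designs, used as the initial
                 population so the search starts from known-good regions.
    bounds     : (2, d) array of [lower; upper] box bounds on the designs.
    """
    problem = _SurrogateProblem(
        f, n_var=X_observed.shape[-1], n_obj=n_obj, xl=bounds[0], xu=bounds[1]
    )
    # Passing an ndarray as `sampling` makes pymoo use those rows as the
    # initial population.
    algorithm = NSGA2(pop_size=pop_size, sampling=np.asarray(X_observed))
    res = minimize(problem, algorithm, ("n_gen", n_gen), verbose=False)
    # res.X / res.F hold the non-dominated designs and objective values;
    # these offspring can be mixed with acquisition-optimized candidates.
    return res.X, res.F
```

Inside an Acquisition.optimize override, f would typically be the (negated) posterior mean of the fitted surrogate, and the returned offspring would be appended to the candidate set before batch selection.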
Please provide any relevant code snippet if applicable.
No response