[Feature] Variational Bayesian last layer models as surrogate models #2754
Conversation
Edit: Max also responded. As he mentions, dependencies are a pain. Since this isn't much code, it makes sense to copy it over, and we can always reconsider if there is substantial new functionality and more extensive unit tests.
Thanks for this contribution, Paul! This was a very intriguing paper and it's great to see it in BoTorch.
We typically don't take on new external dependencies, but if it's pure PyTorch that should be OK, as long as it's possible for your package to have some sort of unit or integration test that ensures any future changes to your package do not cause problems for the BoTorch implementation and the functionality demonstrated in the notebook.
Thanks for putting this up, @brunzema. The notebook looks great, and I plan to review this PR in more detail over the next day or two.
Regarding the dependency on vbll: right now it looks like the code only uses ~120 lines of pure torch code from the vbll repo. My preference would be to move the relevant pieces of the vbll code into a helper module (and clearly attribute the source there, of course) so we can avoid the dependency for now. If we do end up expanding the functionality and using additional features from vbll, then I'd be happy to reconsider (provided the vbll repo adds proper unit tests and a CI setup).
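To make the helper-module suggestion concrete, here is a rough sketch of what a vendored file could look like. The module path, class name, and the simplified diagonal-covariance math below are all illustrative assumptions, not the actual vbll code (which uses richer covariance parameterizations); the point is only that the relevant layer logic is plain PyTorch and small enough to copy over with attribution.

# Hypothetical file: botorch_community/models/vbll_helper.py
# An attribution notice would go here, e.g.:
# "Portions adapted from the vbll package, https://github.com/VectorInstitute/vbll."
import math
import torch
import torch.nn as nn

class ToyVariationalLastLayer(nn.Module):
    """Toy diagonal-covariance variational last layer for regression (illustration only)."""

    def __init__(self, in_features: int, prior_var: float = 1.0, noise_var: float = 0.1):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(in_features))       # posterior mean of the weights
        self.log_var = nn.Parameter(torch.zeros(in_features))  # log of the diagonal posterior covariance
        self.prior_var = prior_var
        self.noise_var = noise_var

    def predictive(self, phi: torch.Tensor):
        # p(y | x) ~= N(phi @ mu, phi^T S phi + noise_var) with diagonal S.
        mean = phi @ self.mu
        var = (phi.pow(2) * self.log_var.exp()).sum(-1) + self.noise_var
        return mean, var

    def neg_elbo(self, phi: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # Expected Gaussian negative log-likelihood under q(w), plus KL(q(w) || N(0, prior_var * I)).
        mean = phi @ self.mu
        var_w = self.log_var.exp()
        exp_nll = 0.5 * (
            math.log(2 * math.pi * self.noise_var)
            + ((y - mean).pow(2) + (phi.pow(2) * var_w).sum(-1)) / self.noise_var
        ).mean()
        kl = 0.5 * (var_w / self.prior_var + self.mu.pow(2) / self.prior_var
                    - 1.0 - (var_w / self.prior_var).log()).sum()
        return exp_nll + kl / y.shape[0]

# Minimal usage: the loss is differentiable and can be trained jointly with a backbone.
phi = torch.randn(8, 4)
y = torch.randn(8)
layer = ToyVariationalLastLayer(4)
loss = layer.neg_elbo(phi, y)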
Motivation
This PR adds variational Bayesian last layers (VBLLs) [1], which demonstrated very promising results in the context of BO in our recent paper [2], to BoTorch. The goal is to provide a BoTorch-compatible implementation of VBLL surrogates for standard use cases (single-output models), making them accessible to the community as quickly as possible. This PR does not yet contain all the features discussed in [2], such as continual learning. If there is interest in adding continual learning as well, I am happy to add it down the line!
The VBLLs can be used with standard acquisition functions such as (log)EI, but they are especially nice for Thompson sampling: a Thompson sample of a Bayesian last layer model is a differentiable, standard feed-forward neural network, which is useful for (almost) global optimization of the sample to determine the next query location.
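As a concrete illustration of why this is convenient, here is a minimal, self-contained sketch (it uses nothing from this PR; the backbone, the Gaussian weight posterior, and all numbers are made up): once a single last-layer weight vector is drawn from its posterior, the Thompson sample is just a fixed, differentiable network that can be maximized over the input with multi-start gradient ascent.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy feature extractor phi(x); stands in for the trained backbone of a BLL model.
backbone = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 16), nn.Tanh())
feature_dim = 16

# Assumed Gaussian posterior over the last-layer weights (mean and covariance are made up).
w_mean = torch.zeros(feature_dim)
w_cov = 0.1 * torch.eye(feature_dim)

# Draw ONE weight sample: the Thompson sample f(x) = w_sample^T phi(x) is now a
# fixed, fully differentiable feed-forward network.
w_sample = torch.distributions.MultivariateNormal(w_mean, w_cov).sample()

def thompson_sample(x: torch.Tensor) -> torch.Tensor:
    return backbone(x) @ w_sample

# (Almost) global maximization of the sample via multi-start gradient ascent on x in [-1, 1].
x = -1.0 + 2.0 * torch.rand(64, 1)   # 64 random restarts
x.requires_grad_(True)
opt = torch.optim.Adam([x], lr=0.05)
for _ in range(200):
    opt.zero_grad()
    (-thompson_sample(x).sum()).backward()   # maximize by minimizing the negative
    opt.step()
    with torch.no_grad():
        x.clamp_(-1.0, 1.0)                  # keep all restarts inside the box

with torch.no_grad():
    x_next = x[thompson_sample(x).argmax()]  # candidate for the next query location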
Implementation details
This PR adds the implementation to the community folders. Here too, if there is large interest in the model, I am happy to help merge it into the main part of the repo. The added files of this PR are the following:
botorch_community
|-- acquisition
|   |-- bll_thompson_sampling.py   # TS for Bayesian last layer models
|-- models
|   |-- vblls.py                   # BoTorch wrapper for VBLLs
|-- posteriors
|   |-- bll_posterior.py           # Posterior class for Bayesian last layer models
notebooks_community
|-- vbll_thompson_sampling.ipynb   # Tutorial on how to use the VBLL model
test_community
|-- models
|   |-- test_vblls.py              # Tests for VBLL model functionality (backbone freezing for feature reuse, etc.)
The current implementation builds directly on the VBLL repo (https://github.com/VectorInstitute/vbll), which is actively maintained and depends only on PyTorch. Using this repo allows improvements, e.g., better variational posterior initialization, to directly benefit BO.
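For context, here is a sketch of how such a surrogate would slot into the usual BoTorch loop with a log-EI acquisition function. Since this conversation does not show the wrapper's constructor, a SingleTaskGP is used as a stand-in below; the VBLL model from botorch_community/models/vblls.py would take its place, since the PR's goal is a BoTorch-compatible Model with its own posterior.

import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from gpytorch.mlls import ExactMarginalLogLikelihood
from botorch.acquisition.logei import qLogExpectedImprovement
from botorch.optim import optimize_acqf

train_X = torch.rand(20, 2, dtype=torch.double)
train_Y = (train_X * 6.0).sin().sum(dim=-1, keepdim=True)

# Stand-in surrogate; the VBLL wrapper from this PR would be constructed and fit here instead.
model = SingleTaskGP(train_X, train_Y)
fit_gpytorch_mll(ExactMarginalLogLikelihood(model.likelihood, model))

# Standard acquisition optimization; this part is model-agnostic.
acqf = qLogExpectedImprovement(model=model, best_f=train_Y.max())
bounds = torch.tensor([[0.0, 0.0], [1.0, 1.0]], dtype=torch.double)
candidate, acq_value = optimize_acqf(
    acq_function=acqf, bounds=bounds, q=1, num_restarts=10, raw_samples=256
)
print(candidate)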
Have you read the Contributing Guidelines on pull requests?
Yes.
Test Plan
The PR does not change any functionality of the current code base. The core functionality of the VBLLs should be covered by test_vblls.py. Let me know if further tests are required.
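For illustration, here is the kind of invariant a backbone-freezing test could assert. The actual API and tests in test_vblls.py may differ, so the freeze_backbone helper below is a locally defined stand-in rather than the wrapper's method.

import torch
import torch.nn as nn

def freeze_backbone(backbone: nn.Module) -> None:
    # Stand-in for the wrapper's backbone freezing (feature reuse).
    for p in backbone.parameters():
        p.requires_grad_(False)

def test_backbone_freezing_keeps_features_fixed():
    backbone = nn.Sequential(nn.Linear(2, 8), nn.ReLU(), nn.Linear(8, 4))
    head = nn.Linear(4, 1)
    freeze_backbone(backbone)

    before = [p.detach().clone() for p in backbone.parameters()]
    trainable = [p for p in list(backbone.parameters()) + list(head.parameters()) if p.requires_grad]
    opt = torch.optim.Adam(trainable, lr=0.1)

    X, y = torch.randn(16, 2), torch.randn(16, 1)
    for _ in range(5):
        opt.zero_grad()
        loss = nn.functional.mse_loss(head(backbone(X)), y)
        loss.backward()
        opt.step()

    # Backbone weights must be untouched; only the (Bayesian) head adapts.
    for p, p0 in zip(backbone.parameters(), before):
        assert torch.equal(p, p0)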
Related PRs
This PR does not change existing functionality, and I did not see any PRs regarding last layer models in BoTorch. Maybe this implementation can also be useful for other BLLs.
References
[1] J. Harrison, J. Willes, J. Snoek. Variational Bayesian Last Layers (https://arxiv.org/abs/2404.11599). International Conference on Learning Representations (ICLR), 2024.
[2] P. Brunzema, M. Jordahn, J. Willes, S. Trimpe, J. Snoek, J. Harrison. Bayesian Optimization via Continual Variational Last Layer Training (https://arxiv.org/abs/2412.09477). International Conference on Learning Representations (ICLR), 2025.