Regression tutorial #2
@ucl-jhr I am working on this and I was wondering:
Great, thanks a lot for your efforts. Regarding your questions: let me know if you have a draft that you want me to look at; I think most of this is somewhat context-dependent.
Yes, that would be very helpful. I just made a pull request #7 with a draft of the tutorial written in a notebook. I essentially summarized parts of the paper and parts of the documentation, and reorganized the example. I added a very short introduction to Bayesian neural networks, but we can remove it if you think it is not necessary. Please let me know if this is what you had in mind; if not, I would be happy to change it to whatever you see fit. Additionally, I was not completely sure how we are planning on integrating the tutorials, so I just put them in the examples, but that is easily fixable. You can also view a rendered version on my fork.
Thanks, this looks like a really good start already.

> I added a very short introduction to Bayesian neural networks, but we can remove it if you think it is not necessary.

I think it's nice to have this, but perhaps @karalets disagrees (and surely has further feedback). I would just rework the points on the disadvantages of deterministic networks/motivation for BNNs a little and keep them a bit simpler, in particular not mention overfitting (since that is a fairly complicated discussion in the context of NNs) and instead just link a couple of papers, e.g. on overconfidence (https://arxiv.org/abs/1706.04599) and forgetting (https://arxiv.org/abs/1312.6211).

> I was also not completely sure how we are planning on integrating them

My plan was to have the tutorials in docs/source, so ideally you would move the contents of the notebook into a file called docs/source/tutorials.regression.rst so that it appears on the readthedocs page (https://tyxe.readthedocs.io/en/latest/tutorials.html). That would also make it easier to discuss smaller points around phrasing, typos etc. in the PR. If I set everything up correctly, you should be able (after installing the dependencies) to build the docs (`make html` from inside `docs`) and view them (`python -m http.server`) locally.

I hope this won't be too much of a headache; let me know if you run into any issues. One option would also be to remove all boilerplate code from the .rst tutorial and replace the regression notebook in examples/ with a fully runnable version of the tutorial, although that would of course be a little more work.

A couple of smaller points:

- I think it might be better not to position MCMC as one of two equally reasonable options for inference, but to be clear that the focus at this time lies on SVI. As far as I know (although this might be outdated), Pyro doesn't implement any SGMCMC methods (and neither does TyXe; this would obviously be a great addition if it is something that interests you), so it really is an option only for small datasets and networks. It would therefore be best to introduce MCMC only towards the end of the tutorial, as a more accurate inference method for small problems.
- I wouldn't make the call to `.fit` inside a `local_reparameterization` context straight away, but only introduce it as a variance reduction method after having shown `predict` for the first time. If you plot the ELBOs both with and without local reparameterization, the difference is quite noticeable if I remember correctly.
- When creating the optimizer, it would be good to stress that it has to be a Pyro optimizer, not a standard PyTorch one.

Thanks again for your work on the tutorials, it's much appreciated :-)
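The concern that plain MCMC only suits small problems comes down to cost per step: every proposal needs a likelihood pass over the full dataset, whereas SVI works on minibatches. A toy stdlib-only sketch (plain Python, not TyXe/Pyro code) of random-walk Metropolis for the mean of a Gaussian makes the full-data pass per step explicit:

```python
# Toy illustration (plain Python, not Pyro's MCMC): random-walk Metropolis for the
# mean of a Gaussian. Every proposal requires a likelihood pass over the full
# dataset, which is what makes plain MCMC impractical for large data/networks.
import math
import random

random.seed(1)

data = [random.gauss(2.0, 1.0) for _ in range(200)]   # small synthetic dataset

def log_posterior(mu):
    log_prior = -mu * mu / (2 * 10.0 ** 2)             # N(0, 10^2) prior on mu
    log_lik = -0.5 * sum((y - mu) ** 2 for y in data)  # full-data pass per call
    return log_prior + log_lik

mu, current_lp = 0.0, log_posterior(0.0)
samples = []
for _ in range(5000):
    proposal = mu + random.gauss(0.0, 0.3)
    proposal_lp = log_posterior(proposal)
    if math.log(random.random()) < proposal_lp - current_lp:
        mu, current_lp = proposal, proposal_lp
    samples.append(mu)

posterior_mean = sum(samples[1000:]) / len(samples[1000:])
print(posterior_mean)  # close to the true mean of 2
```

With 200 data points this is instant, but the same loop over a large dataset and a full network's weights would pay the full-data cost at every single proposal, which is the scaling argument for focusing the tutorial on SVI.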
I need to block some time to have a look, will do so before Sunday!
Thank you for all the movement here, meanwhile.
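The variance-reduction argument for local reparameterization can be demonstrated without any Pyro/TyXe machinery: for a Gaussian weight posterior, sampling the pre-activations per example instead of sharing one weight sample across the minibatch decorrelates the per-example outputs and shrinks the variance of the minibatch-averaged estimate. A stdlib-only toy sketch (all names here are illustrative, not TyXe API):

```python
# Toy illustration (plain Python, not TyXe/Pyro code): compare the variance of a
# minibatch-averaged linear output when one weight sample is shared across the
# batch versus sampling each pre-activation independently (local reparameterization).
import random
import statistics

random.seed(0)

B = 32                                             # minibatch size
x = [random.uniform(0.5, 1.5) for _ in range(B)]   # positive scalar inputs
sigma = 1.0                                        # weight posterior std, mean 0

def batch_mean_shared_weight():
    w = random.gauss(0.0, sigma)                   # one weight draw for the whole batch
    return sum(xi * w for xi in x) / B

def batch_mean_local_reparam():
    # b_i ~ N(x_i * mu, x_i^2 * sigma^2) with mu = 0, drawn independently per example
    return sum(random.gauss(0.0, xi * sigma) for xi in x) / B

trials = 20000
var_shared = statistics.pvariance([batch_mean_shared_weight() for _ in range(trials)])
var_local = statistics.pvariance([batch_mean_local_reparam() for _ in range(trials)])
print(var_shared, var_local)  # the locally reparameterized estimator has far lower variance
```

This is the same effect behind the noticeably smoother ELBO curve mentioned in the review comments: the shared-weight estimator's variance grows with the correlation across the batch, while the local one averages it away.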
@ucl-jhr Thank you very much for your comments. I will switch the document to .rst and put it under the docs. I will also try to do both versions (one without boilerplate in the docs, and the other fully runnable one in examples/). With regards to local reparameterization, I'll introduce it as a variance reduction method after the first call to predict, as you suggested.
I added commits reflecting the changes we talked about. Let me know what you think.
Anything related to the basic regression tutorial.
@pgarimidi @nb2838