Incorrect results when sampling from the prior #90
Hi @rlouf, I've looked into this a bit more and identified two issues.

Please find examples of the two issues in this notebook: https://gist.github.com/elanmart/9ab0ba21f282f6b24d972cbfb76b4578. Hope this is helpful!
Hi @elanmart, thank you for taking the time to share this with me! Regarding what you identified:
Thanks for the answer! I was wondering how I can inspect the models. The following model

```python
@mcx.model
def example_2_mcx_v1():
    offset <~ dist.Uniform(0, 5)
    low = 12 - offset
    outcome <~ dist.Uniform(low, 12)
    return outcome
```

is transformed into

```python
def example_2_mcx_v1_sample_forward(rng_key):
    offset = dist.Uniform(0, 5).sample(rng_key)
    low = offset - 12
    outcome = dist.Uniform(low, 12).sample(rng_key)
    forward_samples = {'offset': offset, 'outcome': outcome}
    return forward_samples
```

Notice how `low = 12 - offset` became `low = offset - 12`.
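To see why this swap matters, here is a minimal stand-in for the two forward samplers, using the stdlib `random` module instead of MCX/JAX (the function names are illustrative, not part of any MCX API):

```python
import random

def sample_as_written(rng):
    # The model as written: low = 12 - offset, so low lies in [7, 12].
    offset = rng.uniform(0, 5)
    return rng.uniform(12 - offset, 12)

def sample_as_compiled(rng):
    # What the compiled function computes: low = offset - 12,
    # so low lies in [-12, -7] and the prior support changes completely.
    offset = rng.uniform(0, 5)
    return rng.uniform(offset - 12, 12)

rng = random.Random(0)
written = [sample_as_written(rng) for _ in range(10_000)]
compiled = [sample_as_compiled(rng) for _ in range(10_000)]

print(min(written) >= 7)   # True: every sample stays in [7, 12]
print(min(compiled) < 0)   # True: samples leak far below the intended range
```

So a single flipped subtraction silently changes the support of the prior, which matches the out-of-range samples reported below.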
EDIT: The issue is not limited to constants. The arguments of a subtraction are swapped to match the order in which the variables were defined, so `μ = B - A` becomes `μ = A - B`, and the model

```python
@mcx.model
def example():
    A <~ dist.Normal(0, 1)
    B <~ dist.Normal(0, 2)
    μ = B - A
    Y <~ dist.Normal(μ, 1)
    return Y
```

compiles to a forward sampler that computes `μ = A - B`.
Ah, and also regarding point 1 (the same rng_key used many times):
That's strange regarding ... Unfortunately no workaround for the ... Now I see how convenient compiling to a Python function is for debugging 😄 Thank you for dealing with the teething problems here, it is really helpful for us.
OK, so my ... Do you want me to close this ticket and open a clean one for ...?
No worries, I would love to understand the compiler a bit better to be able to debug similar issues myself.
No worries, you're really helpful :)
Yes please! Leave this one open until we solve the issue completely.
So that would be the quick and dirty solution. I think that I might instead generate as many keys as needed at the beginning of the function.
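For illustration, here is the same-key problem and the split-keys fix, sketched with the stdlib `random` module standing in for `jax.random` (the JAX analogue of the fix would be something like `keys = jax.random.split(rng_key, num=2)`; none of the names below are MCX API):

```python
import random

def sample_with_one_key(seed):
    # Reusing the same key for every .sample() call, as the compiled
    # function above does: both draws come out identical.
    a = random.Random(seed).uniform(0, 1)
    b = random.Random(seed).uniform(0, 1)
    return a, b

def sample_with_split_keys(seed):
    # Generate as many keys as needed at the beginning of the function,
    # one per random variable, then use each key exactly once.
    master = random.Random(seed)
    keys = [master.randrange(2**32) for _ in range(2)]
    a = random.Random(keys[0]).uniform(0, 1)
    b = random.Random(keys[1]).uniform(0, 1)
    return a, b

a1, b1 = sample_with_one_key(42)
a2, b2 = sample_with_split_keys(42)
print(a1 == b1)  # True: same key, same draw
print(a2 == b2)  # False: distinct keys give independent draws
```

Splitting once up front keeps the generated function flat: each `.sample()` call just indexes into the pre-generated keys.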
Well, now you know that you can at least print the code generated by the compiler. It's a good starting point.

Actually the general principle will stay exactly the same.
Thank you @tblazina! This looks extremely useful, I will go through it over the weekend!
Just to let you know, I'll make some time to work on this and the other issue once my NUTS PR is merged in BlackJAX (which means MCX will support NUTS). How is the implementation of SR going?
Thanks for the update! Looking forward to the NUTS sampler as well. I've decided to first go through the theory, and then make a second pass implementing the examples. I've just finished the book, so I'm going back to the code, which hopefully should go faster now. There were a few places in the book where some advanced Stan features were used.
Great! If you remember which ones, don't hesitate to open issues now.
While going through Statistical Rethinking I wanted to execute a prior-predictive simulation, but the results did not match the textbook example, see below.
What's more, I played with some other synthetic examples and they also give unintuitive results, see further down.
Examples
Example from Statistical Rethinking
Code
Result
Expected
Synthetic example 1
In this example I sample an `offset` from `Uniform(0, 1)`. Then I sample from `Uniform(12 - offset, 12 + offset)`, so I expect my samples to be distributed in the range `[11, 13]`. But I get samples in the range `[-15, 15]`.
Code
Result
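As a sanity check, here is the intended generative process for this example, sketched with the stdlib `random` module in place of MCX (this is not the MCX code from the issue): every sample must land in `[11, 13]` by construction.

```python
import random

def intended_example_1(rng):
    # offset ~ Uniform(0, 1), then outcome ~ Uniform(12 - offset, 12 + offset),
    # so the lower bound is at least 11 and the upper bound is at most 13.
    offset = rng.uniform(0, 1)
    return rng.uniform(12 - offset, 12 + offset)

rng = random.Random(0)
samples = [intended_example_1(rng) for _ in range(10_000)]
print(min(samples) >= 11 and max(samples) <= 13)  # True
```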
Synthetic example 2
This is the same example as above, but the `center` variable is passed as an argument rather than hardcoded, and the results are different (although still not in the range `[11, 13]`).
Code
Result
Expectation
For examples 1 and 2, here's what I'd expect to get:

Environment