Added ability to hunt for replay, as well as some more minor changes. #15
base: callback_and_bugfix
Conversation
… and a new example
Looks good overall. My main suggestion is to replace config = Dict(:key1 => value) with a struct GibbsSamplerParameters (or something like that). We can type the struct and provide default values for less commonly used arguments; this way we catch mismatched types early to prevent weird bugs / undefined behavior.
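A minimal sketch of that idea follows. The struct name, field names, and default values are placeholders guessed from the argument list in this diff, not the actual API:

```julia
# Typed parameter struct with defaults for less commonly used options.
# Field names and defaults below are illustrative placeholders.
Base.@kwdef struct GibbsSamplerParameters
    num_anneals::Int64 = 10
    samples_per_anneal::Int64 = 100
    max_temperature::Float64 = 40.0
    extra_split_merge_moves::Int64 = 0
    split_merge_window::Float64 = 1.0
    save_every::Int64 = 1
end

# A mismatched type now fails at construction time instead of deep inside
# the sampler, e.g.:
# GibbsSamplerParameters(max_temperature = [40.0])  # throws a MethodError
```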
    extra_split_merge_moves::Int64,
    split_merge_window::Float64,
    save_every::Int64;
    config::Dict;
Any reason why the arguments were replaced by config?
Two alternatives that might be safer:
- Use keyword arguments, e.g., num_anneals = 10.
- Create a struct GibbsParameters(num_anneals, samples_per_anneal, <etc>) and pass this as an argument.
Option 2 is essentially what you have here, but it's typed. Typing helps catch bugs early / avoid unexpected behavior when, for example, someone passes an array instead of a scalar for max_temperature.
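For option 1, a rough sketch of what the signature could look like. The function name, positional arguments, and defaults are guesses for illustration, not the real signature in this branch:

```julia
# Option 1: keyword arguments with defaults on the sampler entry point.
# Names and defaults here are illustrative only.
function easy_sample!(
    model, spikes;
    num_anneals::Int64 = 10,
    samples_per_anneal::Int64 = 100,
    extra_split_merge_moves::Int64 = 0,
    split_merge_window::Float64 = 1.0,
    save_every::Int64 = 1,
)
    # ... sampler body ...
end

# Call sites only name what they override:
# easy_sample!(model, spikes; num_anneals = 20)
```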
Aha yeah, this does seem non-optimal. Basically I added an extra variable that I needed to pass deep inside some nesting of functions and got annoyed having to repeatedly change the argument list for every function on the way down, so I shoved everything into the config. Very open to the idea that this is not the ideal way to do things.
More than happy to have a go at changing it.
Differs from easy_sample.jl in that it uses masks and masked gibbs sampling.
Used for doing cross-validation on speckled hold-out data.
"""
function easy_sample_masked!(
Looks great.
@@ -31,7 +32,7 @@ function masked_gibbs!(

     # Create inverted masks to compute train log likelihood.
     inv_masks = compute_complementary_masks(
-        masks, num_neurons(model), model.max_time)
+        masks, num_neurons(model), model.max_time+0.000000001)
What is the purpose of this? Can we define a keyword argument max_time_tolerance = 1e-8 or something?
I think there was some strange behaviour where it would get slightly confused about a spike sitting right on the edge of the max time and the whole thing would explode. This fixed it (though it is admittedly janky).
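A sketch of the reviewer's suggestion, threading the tolerance through as a named keyword argument instead of a hard-coded literal. The signature below is an assumption for illustration, not the actual masked_gibbs! signature:

```julia
# Illustrative only: expose the edge-of-window fudge factor as a documented
# keyword argument rather than a magic number at the call site.
function masked_gibbs!(
    model, spikes, masks, num_samples;
    max_time_tolerance::Float64 = 1e-8,
)
    # Pad max_time so a spike exactly on the boundary still falls inside
    # the final mask interval.
    inv_masks = compute_complementary_masks(
        masks, num_neurons(model), model.max_time + max_time_tolerance)
    # ... rest of the masked Gibbs sweep ...
end
```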
In this version of the code the user can train the model on some neural data, extract the resulting sequence parameters, fix a subset of those, and then use a model with fixed sequence parameters to go looking for replay of those specific sequences in new data.
This is demonstrated in the jupyter notebook 'songbird-sacred.ipynb'. Sacred is the name we've been using for fixed sequences. To do this there is an additional function - sanctify_model - that takes the parameters describing a set of sequences and uses them to fix those sequences in a particular model.
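As a rough sketch of that workflow: sanctify_model is the function added in this PR, but extract_sequence_parameters and the exact call signatures below are hypothetical stand-ins, not the actual API:

```julia
# Illustrative replay-hunting workflow; signatures are assumptions.

# 1. Train on the original data and pull out the learned sequence parameters.
train_results = easy_sample!(train_model, train_spikes, config)
seq_params = extract_sequence_parameters(train_model)  # hypothetical helper

# 2. "Sanctify" a fresh model: fix a chosen subset of sequences so the
#    sampler can no longer change them.
replay_model = sanctify_model(replay_model, seq_params[1:3])

# 3. Sample on the new data; the fixed sequences act as templates, so spikes
#    assigned to them indicate putative replay events.
replay_results = easy_sample!(replay_model, new_spikes, config)
```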
Additionally there are some smaller changes: