[Feature request]: EMCEE integration with inference_job.py #365
Labels: batch (relating to batch processing), high priority, inference (concerns the parameter inference framework)
Comments
Including the capability to run subpop-specific configs would also be tied to this, but we can table that for now if it's outside the scope of this issue.
TimothyWillard added the r-inference (relating to the R inference package), batch, and high priority labels on Oct 28, 2024.
TimothyWillard added a commit that referenced this issue on Nov 4, 2024.
Just rebased `dev` into `GH-365/emcee-submit`, which now contains Black formatting for Python.
TimothyWillard added the inference label and removed the r-inference label on Nov 5, 2024.
TimothyWillard added a commit that referenced this issue on Nov 8, 2024.
TimothyWillard added commits that referenced this issue on Nov 14 (×5), Nov 15 (×2), and Nov 25, 2024 (×6).
Label: batch, inference, meta/workflow
Priority Label: high priority
Is your feature request related to a problem? Please describe.
One needs to write one's own batch script (see examples in Flu_USA) to run EMCEE.
Is your feature request related to a new application, scenario round, pathogen? Please describe.
No response
Describe the solution you'd like
We run jobs like this (see the "Submitting A Batch Inference Job To Slurm" heading in https://iddynamics.gitbook.io/flepimop/how-to-run/advanced-run-guides/running-on-a-hpc-with-slurm) using inference_job.py, which is very convenient. This script, which we could pull into gempyor so that it has access to memory-footprint information and test runs, lets us run local, Slurm, or AWS jobs. When this script detects `method: emcee` in the inference config (see https://iddynamics.gitbook.io/flepimop/model-inference/inference-with-emcee), it should build and run a Slurm file like this one:
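As a rough illustration only, a minimal sketch of such a Slurm file is below. The module names, the `flepimop-env` environment, the `flepimop-calibrate` entry point, and its flags are assumptions for illustration, not the actual generated script:

```bash
#!/bin/bash
#SBATCH --job-name=emcee_inference
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=64     # e.g. one core per EMCEE walker
#SBATCH --mem=128G
#SBATCH --time=12:00:00

# Assumed environment setup; cluster module and env names are placeholders.
module purge
module load anaconda
conda activate flepimop-env

# Hypothetical calibration call; the command and flags are illustrative.
flepimop-calibrate \
    --config config_emcee.yml \
    --nwalkers 64 \
    --niterations 500 \
    --nsamples 100
```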
with rules like this:
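Again a sketch only, assuming the rules map EMCEE settings onto Slurm resource requests; the function name, config keys, and scaling heuristics below are invented for illustration and are not flepiMoP's actual logic:

```python
# Hypothetical resource-sizing rules for an EMCEE run inside
# inference_job.py; names, keys, and heuristics are assumptions.
def emcee_slurm_resources(inference_cfg: dict) -> dict:
    """Map an EMCEE inference config onto Slurm resource requests."""
    nwalkers = inference_cfg.get("nwalkers", 64)
    return {
        "cpus-per-task": nwalkers,            # one core per walker
        "mem": f"{max(8, nwalkers // 4)}G",   # crude memory heuristic
        "time": "12:00:00",
    }

# Example: a 128-walker config would request 128 CPUs and 32G of memory.
print(emcee_slurm_resources({"nwalkers": 128}))
```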