
[Feature request]: EMCEE integration with inference_job.py #365

Open
jcblemai opened this issue Oct 28, 2024 · 1 comment
Labels
batch Relating to batch processing. high priority High priority. inference Concerns the parameter inference framework.

@jcblemai (Collaborator)

Label

batch, inference, meta/workflow

Priority Label

high priority

Is your feature request related to a problem? Please describe.

Currently, one needs to write one's own batch script (see examples in Flu_USA) to run EMCEE.

Is your feature request related to a new application, scenario round, pathogen? Please describe.

No response

Describe the solution you'd like

We run jobs like this (see the "Submitting A Batch Inference Job To Slurm" heading in https://iddynamics.gitbook.io/flepimop/how-to-run/advanced-run-guides/running-on-a-hpc-with-slurm) using inference_job.py, which is very convenient. This script, which we could pull into gempyor so it has access to memory footprints and test runs, allows running local, Slurm, or AWS jobs. When this script detects method: emcee in the inference config (see https://iddynamics.gitbook.io/flepimop/model-inference/inference-with-emcee), it should build and run a Slurm file like this one:

#!/bin/bash
#SBATCH -N 1
#SBATCH -n 1
#SBATCH -p general
#SBATCH --mem=100g
#SBATCH -c 48
#SBATCH -t 00-20:00:00
flepimop-calibrate -c config_rsvnet_2024_1_emcee.yml --nwalkers 100 --jobs 48 --niterations 500 --nsamples 100 > out_fit_rsv_emcee_1.out 2>&1

with rules like this:

@saraloo (Contributor) commented Oct 28, 2024

Including the capability to run subpop-specific configs would also be tied to this, but we can table that for now if it's outside the scope of this issue.
https://github.com/HopkinsIDD/RSV_USA/blob/main/SLURM_emcee_job_small_per_subpop.batch
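For context, the per-subpop case could be handled by emitting one job script per subpopulation, along the lines of the linked batch file. A rough sketch, with entirely hypothetical names (`build_subpop_jobs`, the config filename template) rather than the actual RSV_USA setup:

```python
def build_subpop_jobs(subpops: list[str],
                      config_template: str = "config_{sp}_emcee.yml") -> dict[str, str]:
    """Return one minimal job script per subpopulation (illustrative only)."""
    scripts = {}
    for sp in subpops:
        cfg = config_template.format(sp=sp)
        scripts[sp] = (
            "#!/bin/bash\n"
            f"flepimop-calibrate -c {cfg} --nwalkers 100 --jobs 48 "
            f"--niterations 500 --nsamples 100 > out_{sp}.out 2>&1\n"
        )
    return scripts


if __name__ == "__main__":
    for sp, script in build_subpop_jobs(["CA", "NY"]).items():
        print(f"--- job for {sp} ---")
        print(script)
```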

@TimothyWillard TimothyWillard added r-inference Relating to the R inference package. batch Relating to batch processing. high priority High priority. labels Oct 28, 2024
TimothyWillard added a commit that referenced this issue Nov 4, 2024
Just rebased `dev` into `GH-365/emcee-submit` which now contains black
formatting for python.
@TimothyWillard TimothyWillard added inference Concerns the parameter inference framework. and removed r-inference Relating to the R inference package. labels Nov 5, 2024
TimothyWillard added a commit that referenced this issue Nov 8, 2024
Just rebased `dev` into `GH-365/emcee-submit` which now contains black
formatting for python.
@TimothyWillard TimothyWillard removed this from the Inference Rebuild milestone Nov 25, 2024