Error occurred during initialization of VM java/lang/NoClassDefFoundError: java/lang/Object #164
Labels: bug
I've used the verbose flag.
Relevant issue: #191
Snakemake version
5.8.1
NOTE: I will go over why I'm not using the latest version in the Additional context section.
Wrapper version
Describe the bug
I'm getting the following error trying to use the fastqc wrapper:
Logs
Minimal example
The config.yaml
The Snakefile
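The actual Snakefile from the report is not reproduced above. For orientation only, a rule that uses the fastqc wrapper typically looks like the sketch below; the sample/path names and the wrapper version are placeholders, not the reporter's values:

```python
rule fastqc:
    input:
        "reads/{sample}.fastq"                  # placeholder input path
    output:
        html="qc/fastqc/{sample}.html",
        zip="qc/fastqc/{sample}_fastqc.zip"     # the wrapper produces an html report and a zip archive
    params: ""                                  # extra command-line options passed to FastQC
    log:
        "logs/fastqc/{sample}.log"
    threads: 1
    wrapper:
        "0.66.0/bio/fastqc"                     # wrapper version here is a placeholder
```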
Additional context
It was mentioned in the preamble to this issue that I should try the newest version of Snakemake. I downloaded the newest version via:
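The exact download command is not preserved in the report. A typical conda-based install of the latest Snakemake, using the channels recommended in the Snakemake documentation, looks something like this (the environment name is a placeholder):

```bash
# Create a fresh environment with the latest Snakemake from conda-forge/bioconda
conda create -n snakemake -c conda-forge -c bioconda snakemake
conda activate snakemake
snakemake --version   # confirm which version actually got installed
```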
But now when I try a dry-run I get a Segmentation fault.
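The dry-run invocation itself is not quoted above; in Snakemake a dry run is just the normal command with the -n/--dry-run flag added, along these lines (profile and flags as quoted later in this report):

```bash
# Build the DAG and print the planned jobs without executing anything
snakemake -n --profile slurm --use-conda
```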
Using the older version of Snakemake (for comparison)
Newest version of Snakemake
I guess other pertinent information is that I'm on an academic HPC with a SLURM scheduler.
The issue I see with using the newest Snakemake version is the following: if I were to run an interactive job (to get more memory) via salloc --time=1:0:0 --mem=1000 and then start the pipeline (which consists of many wrappers) via bash -c "nohup snakemake --profile slurm --use-conda --jobs 500 &", it would only run jobs for as long as the interactive job was set for. As I understand it, Snakemake needs to be run from the head node, since it submits jobs to the SLURM scheduler.
Is it possible that Snakemake version 5.23.0 is more memory intensive than 5.8.1? And if so, does this preclude me from using it?
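To make the lifetime problem concrete, here is the sequence described above as a commented sketch (commands quoted from this report, with the consequence spelled out as the reporter understands it):

```bash
# Request a 1-hour interactive allocation with 1000 MB of memory
salloc --time=1:0:0 --mem=1000

# From inside that allocation, start the workflow controller in the background
bash -c "nohup snakemake --profile slurm --use-conda --jobs 500 &"

# The snakemake controller process lives inside the salloc allocation, so when the
# hour is up SLURM tears the allocation down and snakemake stops submitting new
# jobs, even though the pipeline is not finished.
```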