#137 sadly never fixed the issue for us.
If you try running the sbatch command with 2 (or more) CPUs specified (assuming your default is 1 CPU), both thread counts should be 2.
We have been setting both cpus_per_task and threads in the slurm config file (matching the totals) as a workaround for the moment. It's a bit of duplication, but we can't find a reliable way to link the totals together. As long as cpus_per_task is equal to or greater than threads, the threads directive works for the correct total. You need to specify both because snakemake rules run on threads. The dry-run totals are also incorrect, so you have to check the snakemake or slurm logs for the correct threads/totals.
If you run from the head node, the threads work correctly, but we don't like doing this and prefer using sbatch.
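A sketch of the duplication described above, as it might look in a slurm profile config.yaml. The rule name and counts are hypothetical, not taken from this thread; the point is only that the two totals are kept in sync by hand:

```yaml
# Illustrative profile fragment; "my_rule" and the value 8 are assumptions.
# The thread count and the slurm CPU allocation are duplicated manually so
# that slurm reserves at least as many CPUs as snakemake schedules threads.
set-threads:
  my_rule: 8
set-resources:
  my_rule:
    cpus_per_task: 8  # keep equal to (or greater than) the rule's threads
```

With this shape, whenever the thread count changes, both entries have to be updated together, which is the duplication the workaround accepts.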
@blaiseli I think your --profile may be interfering with the plugin. Did you try running without it, or at least removing everything above the max-jobs-per-second: "10" line?
Software Versions
Describe the bug
In a rule having threads set to 2, a shell command built to display {threads} reports only 1 thread when snakemake is run through slurm using sbatch.
Minimal example
Here is a short example meant to compare the above with what happens when setting cpus_per_task to 2 in resources.
I run it through sbatch using the following script:
Running it:
Looking at the output:
If I run the workflow without sbatch and slurm, both output files contain "2".
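The original code blocks did not survive here. A minimal reproduction along the lines described above might look like the following Snakefile sketch; the rule and output names are hypothetical, not the reporter's actual files:

```
# Hypothetical minimal Snakefile; rule and file names are illustrative.
rule via_threads:
    output: "threads.txt"
    threads: 2
    shell: "echo {threads} > {output}"

rule via_cpus_per_task:
    output: "cpus.txt"
    resources:
        cpus_per_task=2
    shell: "echo {threads} > {output}"
```

Per the report, running such a workflow locally writes "2" to both files, while submitting it through sbatch makes the threads-based rule report only 1.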
Additional context
This looks like something similar to what is described here: #113 (comment)
However, if I understand correctly, this aspect of #113 is supposed to be solved by #137, which is included in 0.10.0.
In case this is relevant, here is the config.yaml of the slurm profile given to --profile:
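The actual profile contents did not survive extraction. For orientation only, a slurm profile of the general shape discussed in this thread (including the max-jobs-per-second line mentioned above) might look like the following; every value here is an assumption, not the reporter's configuration:

```yaml
# Hypothetical slurm profile config.yaml; all values are assumptions.
executor: slurm
jobs: 100
max-jobs-per-second: "10"
default-resources:
  slurm_partition: "normal"  # assumed partition name
  runtime: 60                # minutes
```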