docs feedback
cathalobrien committed Jan 21, 2025
1 parent d6a77ff commit b8be926
Showing 1 changed file with 4 additions and 3 deletions.
7 changes: 4 additions & 3 deletions docs/parallel.rst
@@ -3,12 +3,13 @@
 ####################
 
 If the memory requirements of your model are too large to fit within a
-single GPU, you run Anemoi-Inference in parallel across multiple GPUs.
+single GPU, you can run Anemoi-Inference in parallel across multiple
+GPUs.
 
 Parallel inference requires SLURM to launch the parallel processes and
 to determine information about your network environment. If SLURM is not
 available to you, please create an issue on the Anemoi-Inference github
-page.
+page `here <https://github.com/ecmwf/anemoi-inference/issues>`_.
 
 ***************
   Configuration
@@ -42,7 +43,7 @@ job across 4 GPUs.
 #SBATCH --gpus-per-node=4
 #SBATCH --cpus-per-task=8
 #SBATCH --time=0:05:00
-#SBATCH --output=outputs/paralell_inf.%j.out
+#SBATCH --output=outputs/parallel_inf.%j.out
 source /path/to/venv/bin/activate
 srun anemoi-inference run parallel.yaml
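The doc text in the diff says SLURM is needed "to determine information about your network environment". As a hypothetical illustration only (this is not anemoi-inference's actual code), the sketch below shows the standard SLURM variables that `srun` sets for each task, from which a program can derive its place in a parallel job:

```python
import os

def slurm_dist_info(env=os.environ):
    """Derive a task's distributed role from standard SLURM variables.

    Illustrative sketch: SLURM exports these variables to every task
    launched by ``srun``; defaults fall back to a single-task run.
    """
    return {
        "rank": int(env.get("SLURM_PROCID", "0")),         # global task index
        "world_size": int(env.get("SLURM_NTASKS", "1")),   # total number of tasks
        "local_rank": int(env.get("SLURM_LOCALID", "0")),  # task index on this node
    }

# Example: the third task of the 4-GPU job configured in the sbatch script above
info = slurm_dist_info({"SLURM_PROCID": "2", "SLURM_NTASKS": "4", "SLURM_LOCALID": "2"})
print(info)  # {'rank': 2, 'world_size': 4, 'local_rank': 2}
```

Without SLURM these variables are absent, which is why the docs ask users in that situation to open an issue instead.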
