From 8500afd1767a9ac83976af6aef944cfa94fde2c8 Mon Sep 17 00:00:00 2001
From: Dave McKay
Date: Tue, 21 May 2024 14:51:37 +0100
Subject: [PATCH] note on CUDA containers

---
 docs/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/README.md b/docs/README.md
index 540a74b..f0027de 100644
--- a/docs/README.md
+++ b/docs/README.md
@@ -50,7 +50,7 @@ Instructions to recreate our implementation are linked below.
 4. [> SiMLInt ML Training Implementation](./training_implementation.md)
 5. [> SiMLInt Simulation](./inference.md)
 
-SiMLInt Docker images have been built for [CPU](https://github.com/orgs/EPCCed/packages/container/package/simlint) and [GPU](https://github.com/orgs/EPCCed/packages/container/package/simlint-gpu). The CPU version can perform run BOUT++ Hasegawa-Wakatani simulations, generate ground-truth data, or run SiMLInt simulations with inference, while the GPU version is intended for use in training ML models. For Docker container usage instructions, click [here](docker-images.md).
+SiMLInt Docker images have been built for [CPU](https://github.com/orgs/EPCCed/packages/container/package/simlint) and [GPU](https://github.com/orgs/EPCCed/packages/container/package/simlint-gpu). The CPU version can run BOUT++ Hasegawa-Wakatani simulations, generate ground-truth data, or run SiMLInt simulations with inference, while the GPU version is intended for use in training ML models (note: the [NVIDIA Container Toolkit](https://github.com/NVIDIA/nvidia-container-toolkit) is required to run CUDA images). For Docker container usage instructions, see [docker-images.md](docker-images.md).
 
 ## References
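As a minimal sketch of what the added note implies: once the NVIDIA Container Toolkit is installed on the host, the GPU image could be launched roughly as below. The `ghcr.io/epcced/simlint-gpu:latest` image path and tag are assumptions inferred from the linked GitHub package page, not taken from the patch; consult docker-images.md for the authoritative usage instructions.

```sh
# Assumed image name/tag inferred from the GitHub Container Registry package
# linked in the README; the actual reference may differ.
docker pull ghcr.io/epcced/simlint-gpu:latest

# --gpus all exposes the host GPUs to the container; this only works when the
# NVIDIA Container Toolkit is installed and configured on the host.
# nvidia-smi is used here only as a quick check that the GPU is visible.
docker run --rm --gpus all ghcr.io/epcced/simlint-gpu:latest nvidia-smi
```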