Launch experiments locally or on a SLURM cluster from a single file.
The `experiment_launcher` package provides a way to run multiple experiments using SLURM or Joblib with minimum effort: you just have to set the `local` parameter to `True` for Joblib, and to `False` for SLURM.
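For example, the same launch file can target either backend just by flipping that flag (a minimal sketch; apart from `local` and `n_exps`, the argument names below are illustrative assumptions):

```python
from experiment_launcher import Launcher

# local=True runs the experiments locally with Joblib,
# local=False submits them to SLURM instead.
launcher = Launcher(exp_name='test', exp_file='test', n_exps=3, local=True)
```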
You can do a minimal installation of `experiment_launcher` with:

```
pip3 install -e .
```
- Create in your own project two files: `test.py` and `launch_test.py`.
- Create an experiment file as in `test.py` (a minimal sketch follows this list):
  - This file consists of two base functions, `experiment` and `parse_args`, and the `if __name__ == '__main__'` block.
  - The function `experiment` is the core of your experiment:
    - It takes as arguments your experiment settings (e.g., the number of layers in a neural network, the learning rate, ...).
    - The arguments need to be assigned a default value in the function definition.
    - The arguments `seed` and `results_dir` must always be included.
    - By default, `results_dir` is the `/path_to_your_sub_experiment` directory.
  - The function `parse_args` includes a CLI `ArgumentParser`:
    - In this function you should define the command line arguments.
    - These arguments must be the same as the ones defined in the function `experiment`.
    - You don't need to define the arguments `seed` and `results_dir` - they are defined in `add_launcher_base_args`.
  - In `if __name__ == '__main__'` simply include:
    ```python
    if __name__ == '__main__':
        args = parse_args()
        run_experiment(experiment, args)
    ```
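A minimal sketch of such an experiment file, assuming `run_experiment` and `add_launcher_base_args` are importable from the `experiment_launcher` package as described above; the exact import paths and the example settings (`n_layers`, `lr`) are illustrative assumptions:

```python
import argparse
import os

# Import paths are an assumption; check the package's examples for the exact ones.
from experiment_launcher import run_experiment
from experiment_launcher.utils import add_launcher_base_args


def experiment(n_layers: int = 2,             # example setting with a default value
               lr: float = 1e-3,              # another example setting
               seed: int = 0,                 # must always be included
               results_dir: str = './logs'):  # must always be included
    # Core of the experiment: run your training/evaluation and save results.
    os.makedirs(results_dir, exist_ok=True)
    with open(os.path.join(results_dir, 'result.txt'), 'w') as f:
        f.write(f'n_layers={n_layers}, lr={lr}, seed={seed}\n')


def parse_args():
    parser = argparse.ArgumentParser()
    # The command line arguments must mirror the arguments of experiment() ...
    parser.add_argument('--n-layers', type=int, default=2)
    parser.add_argument('--lr', type=float, default=1e-3)
    # ... except seed and results_dir, which are added by add_launcher_base_args.
    parser = add_launcher_base_args(parser)
    # Returning the parsed arguments as a dict is an assumption of this sketch.
    return vars(parser.parse_args())


if __name__ == '__main__':
    args = parse_args()
    run_experiment(experiment, args)
```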
- Create a launcher file as in `launch_test.py` (a minimal sketch follows this list):
  - Specify the running configurations by calling the `Launcher` constructor:
    - `n_exps` is the number of random seeds for each single experiment configuration.
    - If `joblib_n_jobs > 0`, then each node will run `joblib_n_jobs` experiments, possibly in parallel. E.g., if `joblib_n_jobs` is `3`, then `3` jobs will run in parallel, even if `n_cores` is `1`. For better performance, one should specify `n_cores >= joblib_n_jobs`.
  - Create a single experiment configuration:
    - Use `launcher.add_default_params` to add parameters shared across configurations (e.g., the dataset).
    - Use `launcher.add_experiment` to create a particular configuration (e.g., different learning rates).
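A minimal sketch of such a launcher file, assuming the `Launcher` constructor accepts the parameters discussed above (`local`, `n_exps`, `joblib_n_jobs`, `n_cores`); the remaining argument names (`exp_name`, `exp_file`) and the final `launcher.run(...)` call are assumptions modeled on the package's examples:

```python
from experiment_launcher import Launcher

LOCAL = True  # True: run with Joblib; False: submit to SLURM

launcher = Launcher(
    exp_name='test_launcher',
    exp_file='test',       # the experiment file created above (test.py)
    n_exps=3,              # number of random seeds per configuration
    joblib_n_jobs=3,       # experiments run (possibly in parallel) on each node
    n_cores=3,             # ideally n_cores >= joblib_n_jobs
    local=LOCAL,
)

# Parameters shared across all configurations (e.g., the dataset)
launcher.add_default_params(dataset='mnist')

# One configuration per learning rate
for lr in (1e-3, 1e-4):
    launcher.add_experiment(lr=lr)

# The exact signature of run() may differ; see the package's examples.
launcher.run(LOCAL)
```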
- To run the experiment, call:
  ```
  cd examples
  python launch_test.py
  ```
- Log files will be placed in `./logs` if running locally, or in `/work/scratch/USERNAME` if running on SLURM (the default for the Lichtenberg-Hochleistungsrechner of the TU Darmstadt).
- The seeds are created sequentially from `0` to `n_exps`.