Unverified Commit 45840b51 authored by Shaoshuai Shi's avatar Shaoshuai Shi Committed by GitHub

Support distributed testing with torch.distributed.launch (#114)

* add dist_test.sh script for multi-gpu testing

* update README.md
parent 34076a44
@@ -19,6 +19,12 @@ python test.py --cfg_file ${CONFIG_FILE} --batch_size ${BATCH_SIZE} --eval_all
```shell script
sh scripts/slurm_test_mgpu.sh ${PARTITION} ${NUM_GPUS} \
--cfg_file ${CONFIG_FILE} --batch_size ${BATCH_SIZE}
# or
sh scripts/dist_test.sh ${NUM_GPUS} \
--cfg_file ${CONFIG_FILE} --batch_size ${BATCH_SIZE}
```
#!/usr/bin/env bash
set -x

# First argument: number of GPUs; all remaining arguments are forwarded to test.py
NGPUS=$1
PY_ARGS=${@:2}

# Launch one test.py process per GPU with the PyTorch distributed launcher
python -m torch.distributed.launch --nproc_per_node=${NGPUS} test.py --launcher pytorch ${PY_ARGS}
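The `${@:2}` expansion is what lets the script forward every argument after the GPU count straight to `test.py`. A minimal sketch of that slicing, using a hypothetical `demo` function in place of the real launcher call:

```shell
#!/usr/bin/env bash
# demo is a stand-in for the torch.distributed.launch invocation;
# it only shows how "$1" and "${@:2}" split the argument list.
demo() {
  NGPUS=$1          # first argument: GPU count
  PY_ARGS="${@:2}"  # everything from the second argument onward
  echo "gpus=${NGPUS} args=${PY_ARGS}"
}

demo 4 --cfg_file cfg.yaml --batch_size 16
```

So a call like `sh scripts/dist_test.sh 4 --cfg_file ${CONFIG_FILE} --batch_size ${BATCH_SIZE}` runs with `NGPUS=4` and hands the `--cfg_file`/`--batch_size` flags through to `test.py` unchanged.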