Commit 5cb0c807 authored by Raul Puri's avatar Raul Puri

more documentation for batch sampler

parent 26d6aff4
@@ -82,7 +82,10 @@ class DistributedBatchSampler(data.sampler.BatchSampler):
     (sequential, random, WeightedRandomSampler, etc.) with this batch
     sampler.
-    The `interleave` argument specifies how to distribute a batch.
+    The `interleave` argument specifies how to distribute a batch. A value
+    of True combined with the above random sampler is equivalent to pytorch's
+    torch.utils.data.distributed.DistributedSampler.
     For the following batch [0,1,2,3,4,5,6,7] and data parallelism of 2
     specifying True will result in the following samples for each gpu:
         GPU0: [0,2,4,6] GPU1: [1,3,5,7]
...
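A minimal sketch of the interleaved distribution the docstring describes (this is an illustration, not the actual DistributedBatchSampler implementation; `split_batch` is a hypothetical helper): with interleave=True each rank takes every world_size-th sample of the batch, while interleave=False would hand each rank a contiguous chunk.

```python
def split_batch(batch, rank, world_size, interleave):
    # Hypothetical helper illustrating the docstring's example, not the
    # sampler's real code path.
    if interleave:
        # Strided slice: rank r takes elements r, r+world_size, r+2*world_size, ...
        return batch[rank::world_size]
    # Contiguous split: rank r takes its chunk of len(batch)//world_size samples.
    chunk = len(batch) // world_size
    return batch[rank * chunk:(rank + 1) * chunk]

batch = [0, 1, 2, 3, 4, 5, 6, 7]
print(split_batch(batch, 0, 2, True))   # GPU0: [0, 2, 4, 6]
print(split_batch(batch, 1, 2, True))   # GPU1: [1, 3, 5, 7]
```

With interleave=False the same batch would instead split as GPU0: [0,1,2,3], GPU1: [4,5,6,7].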