Unverified Commit 8ab5e470 authored by mcarilli, committed by GitHub

Update README.md

parent a730f38f
@@ -3,7 +3,7 @@
[torch.nn.parallel.DistributedDataParallel](https://pytorch.org/docs/stable/nn.html#distributeddataparallel)
and the PyTorch multiprocess launcher script,
[torch.distributed.launch](https://pytorch.org/docs/master/distributed.html#launch-utility).
The use of `Amp` with DistributedDataParallel does not need to change from ordinary
single-process use. The only gotcha is that wrapping your model with `DistributedDataParallel` must
come after the call to `amp.initialize`. Test via
```bash
...
```
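A minimal sketch of that ordering, assuming the `apex.amp` API and a GPU environment launched via `torch.distributed.launch` (`MyModel` and `local_rank` are illustrative placeholders, not names from this README):

```python
import torch
from apex import amp

# Build the model and optimizer first, on the target device.
model = MyModel().to("cuda")  # MyModel is a placeholder for your network
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)

# amp.initialize must come BEFORE wrapping with DistributedDataParallel.
model, optimizer = amp.initialize(model, optimizer, opt_level="O1")

# Only after amp.initialize is the model wrapped for distributed training.
# local_rank would typically be passed in by torch.distributed.launch.
model = torch.nn.parallel.DistributedDataParallel(model, device_ids=[local_rank])
```

Reversing the last two steps (wrapping before `amp.initialize`) is the gotcha the text warns about: `Amp` needs to patch the unwrapped model and optimizer.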