Commit f8557569 authored by Michael Carilli

Docstring for multiple losses

parent f29b3f8d
@@ -94,7 +94,7 @@ receive gradients.
 If, for a given backward pass, there's only one optimizer whose params are about to receive gradients,
 you may pass that optimizer directly to ``amp.scale_loss``. Otherwise, you must pass the
-list of optimizers whose params are about to receive gradients::
+list of optimizers whose params are about to receive gradients. Example with 3 losses and 2 optimizers::
     # loss0 accumulates gradients only into params owned by optim0:
     with amp.scale_loss(loss0, optim0) as scaled_loss:
...
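The diff truncates here. For context, a minimal sketch of how the full three-loss, two-optimizer example might read, assuming the list form of ``amp.scale_loss`` described in the prose above; the names ``loss1``, ``loss2``, and ``optim1`` beyond the diff's ``loss0``/``optim0`` are hypothetical::

    # Assumes amp has been imported (from apex import amp) and the model,
    # losses, and optimizers were set up via amp.initialize.

    # loss0 accumulates gradients only into params owned by optim0:
    with amp.scale_loss(loss0, optim0) as scaled_loss:
        scaled_loss.backward()

    # loss1 accumulates gradients only into params owned by optim1:
    with amp.scale_loss(loss1, optim1) as scaled_loss:
        scaled_loss.backward()

    # loss2 accumulates gradients into params owned by both optimizers,
    # so both must be passed as a list:
    with amp.scale_loss(loss2, [optim0, optim1]) as scaled_loss:
        scaled_loss.backward()

Passing every optimizer whose params are about to receive gradients lets ``amp`` unscale those gradients consistently for each backward pass.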