Unverified Commit cd5565be authored by Stas Bekman's avatar Stas Bekman Committed by GitHub

fix the backward for deepspeed (#9705)

parent 538245b0
@@ -1282,8 +1282,7 @@ class Trainer:
             with amp.scale_loss(loss, self.optimizer) as scaled_loss:
                 scaled_loss.backward()
         elif self.deepspeed:
-            # calling on DS engine (model_wrapped == DDP(Deepspeed(PretrainedModule)))
-            self.model_wrapped.module.backward(loss)
+            self.deepspeed.backward(loss)
         else:
             loss.backward()
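The change above routes the backward pass through the DeepSpeed engine object itself rather than the inner wrapped module. A minimal, self-contained sketch of this dispatch pattern is below; `FakeLoss`, `FakeDeepSpeedEngine`, and `run_backward` are hypothetical stand-ins for illustration and are not the actual transformers or DeepSpeed APIs.

```python
class FakeLoss:
    """Stand-in for a loss tensor; records whether backward() ran."""
    def __init__(self):
        self.backward_called = False

    def backward(self):
        self.backward_called = True


class FakeDeepSpeedEngine:
    """Stand-in for a DeepSpeed engine: the engine, not the raw loss,
    owns backward() so it can apply its own loss scaling first."""
    def __init__(self):
        self.saw_loss = None

    def backward(self, loss):
        # A real engine would scale the loss and manage gradients here;
        # this sketch just records the call and delegates.
        self.saw_loss = loss
        loss.backward()


def run_backward(loss, deepspeed=None):
    """Mirror of the fixed Trainer branch: when a DeepSpeed engine is
    active, call its backward(); otherwise fall back to loss.backward()."""
    if deepspeed is not None:
        deepspeed.backward(loss)  # engine-managed backward (the fix)
    else:
        loss.backward()           # vanilla path
    return loss.backward_called
```

The point of the fix is ownership: DeepSpeed needs to intercept the backward call to handle its own loss scaling and gradient bookkeeping, so bypassing the engine and calling backward on the unwrapped module skips that machinery.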