Unverified Commit 6d49b9dc authored by Sam Denton, committed by GitHub

Fix eval accumulation when `accelerate` > 0.20.3 (#26060)

As mentioned in: https://github.com/huggingface/transformers/issues/25641

With `accelerate` > 0.20.3, `self.accelerator.sync_gradients` stays `False` during evaluation, so eval accumulation never triggers and accumulated tensors are never flushed to the CPU. This change ignores the `sync_gradients` check when the installed `accelerate` version is greater than 0.20.3.
parent d7bd325b
@@ -3254,7 +3254,7 @@ class Trainer:
             if (
                 args.eval_accumulation_steps is not None
                 and (step + 1) % args.eval_accumulation_steps == 0
-                and self.accelerator.sync_gradients
+                and (self.accelerator.sync_gradients or version.parse(accelerate_version) > version.parse("0.20.3"))
             ):
                 if losses_host is not None:
                     losses = nested_numpify(losses_host)
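For illustration, here is a minimal, self-contained sketch of the gating logic after this change. The helper name `should_flush_eval_tensors` is hypothetical; `version` comes from `packaging`, and the `accelerate_version` import is assumed to mirror the one used by `Trainer`:

from packaging import version
from accelerate import __version__ as accelerate_version

def should_flush_eval_tensors(step, eval_accumulation_steps, sync_gradients):
    # Decide whether accumulated eval tensors should be moved to the CPU
    # at this step (mirrors the condition in the diff above).
    return (
        eval_accumulation_steps is not None
        and (step + 1) % eval_accumulation_steps == 0
        # With accelerate > 0.20.3, sync_gradients stays False during
        # evaluation, so the version check keeps the flush from being
        # skipped entirely on newer releases.
        and (sync_gradients or version.parse(accelerate_version) > version.parse("0.20.3"))
    )

With the extra version check, the CPU offload runs every `eval_accumulation_steps` steps on newer `accelerate` releases even though `sync_gradients` remains `False` throughout evaluation.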