"...git@developer.sourcefind.cn:chenpangpang/transformers.git" did not exist on "48c22691e3512eaf0bbab761cda67ce09b6348ea"
Unverified Commit 6d49b9dc authored by Sam Denton's avatar Sam Denton Committed by GitHub

Fix eval accumulation when `accelerate` > 0.20.3 (#26060)

As mentioned in: https://github.com/huggingface/transformers/issues/25641

With `accelerate > 0.20.3`, eval accumulation never triggers because `sync_gradients` is not set during evaluation. This change makes the `sync_gradients` check a no-op when the installed `accelerate` version is newer than 0.20.3.
parent d7bd325b
@@ -3254,7 +3254,7 @@ class Trainer:
                 if (
                     args.eval_accumulation_steps is not None
                     and (step + 1) % args.eval_accumulation_steps == 0
-                    and self.accelerator.sync_gradients
+                    and (self.accelerator.sync_gradients or version.parse(accelerate_version) > version.parse("0.20.3"))
                 ):
                     if losses_host is not None:
                         losses = nested_numpify(losses_host)
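The patched condition can be isolated in a small sketch. The helper below is hypothetical (not part of `Trainer`); it assumes the `packaging` library for version comparison, mirroring how the diff gates the `sync_gradients` check on the installed `accelerate` version:

```python
from packaging import version


def should_accumulate(step, eval_accumulation_steps, sync_gradients, accelerate_version):
    """Hypothetical helper mirroring the patched condition.

    With accelerate newer than 0.20.3, the sync_gradients flag is ignored,
    so eval accumulation still fires at every eval_accumulation_steps boundary.
    """
    return (
        eval_accumulation_steps is not None
        and (step + 1) % eval_accumulation_steps == 0
        and (sync_gradients or version.parse(accelerate_version) > version.parse("0.20.3"))
    )


# On accelerate 0.20.3, accumulation still requires sync_gradients;
# on newer versions the version check alone is enough.
print(should_accumulate(3, 4, False, "0.20.3"))  # False
print(should_accumulate(3, 4, False, "0.21.0"))  # True
```

Before the fix, the third clause was `sync_gradients` alone, so on newer `accelerate` releases (where the flag stays unset during evaluation) the accumulation branch was unreachable and host tensors were never flushed to CPU.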