Unverified commit 1530384e, authored by joerenner, committed by GitHub

changing find_batch_size to work with tokenizer outputs (#11890)



* changing find_batch_size to work with tokenizer outputs

trainer_pt_utils.find_batch_size does not recognize the batch size of BatchEncoding objects, because BatchEncoding is not a dict subclass and therefore fails the isinstance(tensors, dict) check. This can cause an error when a trainer relies on find_batch_size to report the number of observed examples in the evaluation loop.

* Trigger CI
Co-authored-by: jrenner <joseph.renner@inria.fr>
parent d5a72b6e
```diff
@@ -112,7 +112,7 @@ def find_batch_size(tensors):
             result = find_batch_size(t)
             if result is not None:
                 return result
-    elif isinstance(tensors, dict):
+    elif isinstance(tensors, (dict, BatchEncoding)):
         for key, value in tensors.items():
             result = find_batch_size(value)
             if result is not None:
```
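To see why the original check missed BatchEncoding, here is a minimal, self-contained sketch of the recursive batch-size lookup. It is not the transformers implementation: `FakeBatchEncoding` and `FakeTensor` are hypothetical stand-ins, and the mapping check uses `collections.UserDict` (which transformers' BatchEncoding builds on, and which does not subclass `dict`) instead of importing BatchEncoding itself, as the real patch does.

```python
from collections import UserDict


class FakeTensor:
    """Hypothetical stand-in for a torch.Tensor: only a .shape is needed here."""

    def __init__(self, shape):
        self.shape = shape


class FakeBatchEncoding(UserDict):
    """Stand-in for transformers.BatchEncoding. Because UserDict does not
    subclass dict, isinstance(obj, dict) is False for these objects, which
    is exactly why the original isinstance(tensors, dict) check missed them."""


def find_batch_size(tensors):
    """Recurse through nested containers and return the first dimension of
    the first sized leaf found, or None if there is none."""
    if isinstance(tensors, (list, tuple)):
        for t in tensors:
            result = find_batch_size(t)
            if result is not None:
                return result
    elif isinstance(tensors, (dict, UserDict)):  # covers BatchEncoding-like mappings
        for key, value in tensors.items():
            result = find_batch_size(value)
            if result is not None:
                return result
    elif hasattr(tensors, "shape") and len(tensors.shape) >= 1:
        return tensors.shape[0]
    return None
```

With only `isinstance(tensors, dict)`, a `FakeBatchEncoding({"input_ids": FakeTensor((8, 16))})` would fall through every branch and return None; with the mapping branch widened as above, it correctly reports a batch size of 8.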