"vscode:/vscode.git/clone" did not exist on "5da3db3fd5c070107df717a13382ccf1fe1efbe4"
Unverified Commit 1530384e authored by joerenner, committed by GitHub

changing find_batch_size to work with tokenizer outputs (#11890)



* changing find_batch_size to work with tokenizer outputs

trainer_pt_utils.find_batch_size does not recognize the batch size of BatchEncoding objects. This can cause an error when a trainer relies on find_batch_size to report the number of observed examples in the evaluation loop.

* Trigger CI
Co-authored-by: jrenner <joseph.renner@inria.fr>
parent d5a72b6e
@@ -112,7 +112,7 @@ def find_batch_size(tensors):
             result = find_batch_size(t)
             if result is not None:
                 return result
-    elif isinstance(tensors, dict):
+    elif isinstance(tensors, (dict, BatchEncoding)):
         for key, value in tensors.items():
             result = find_batch_size(value)
             if result is not None:
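Below is a minimal sketch (not part of the commit) of what this one-line change enables. It assumes the `bert-base-uncased` tokenizer is available; `find_batch_size` is imported from `transformers.trainer_pt_utils`, and `BatchEncoding` from the top-level `transformers` package.

```python
# Illustrative sketch (not from the commit): a tokenizer returns a BatchEncoding,
# which subclasses UserDict rather than dict, so the old isinstance(tensors, dict)
# check never matched it. Assumes "bert-base-uncased" is downloadable/cached.
from transformers import AutoTokenizer, BatchEncoding
from transformers.trainer_pt_utils import find_batch_size

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
batch = tokenizer(
    ["a first sentence", "a second, slightly longer sentence"],
    padding=True,
    return_tensors="pt",
)

print(isinstance(batch, BatchEncoding))  # True
print(isinstance(batch, dict))           # False: BatchEncoding is a UserDict
print(find_batch_size(batch))            # 2 with this patch; None before it
```

With the patch, `find_batch_size` recurses into the `BatchEncoding` exactly as it does for a plain dict, finds the `input_ids` tensor, and reports its first dimension as the batch size.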