"...git@developer.sourcefind.cn:chenpangpang/transformers.git" did not exist on "b2724d7b4ce07b615bfe6c7b6ce23df04249f9c3"
Fix Padded Batch Error 12282 (#12487)
This fixes the padded batch [issue](https://github.com/huggingface/transformers/issues/12282). The error was caused by the maximum sequence length of the attention mask not matching the padded sequence length of the hidden_states. This change fixes that mismatch, and `np.allclose` now passes with an absolute tolerance of 1e-2.
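For illustration, here is a minimal sketch of the kind of mismatch and fix; the `pad_attention_mask` helper, names, and shapes are hypothetical and not the actual transformers code:

```python
import torch

def pad_attention_mask(attention_mask: torch.Tensor, target_len: int) -> torch.Tensor:
    """Right-pad a (batch, seq) attention mask with zeros so its sequence
    length matches the padded sequence length of the hidden states."""
    pad = target_len - attention_mask.shape[1]
    if pad > 0:
        # Pad the last (sequence) dimension on the right with 0 = "masked".
        attention_mask = torch.nn.functional.pad(attention_mask, (0, pad), value=0)
    return attention_mask

# hidden_states were padded to seq_len = 10, but the mask was built for 7.
hidden_states = torch.randn(2, 10, 8)
attention_mask = torch.ones(2, 7, dtype=torch.long)

attention_mask = pad_attention_mask(attention_mask, hidden_states.shape[1])
assert attention_mask.shape[1] == hidden_states.shape[1]
```

With the lengths aligned, padded and unpadded batches can be compared with `np.allclose(padded_out, unpadded_out, atol=1e-2)` as described above.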