"...git@developer.sourcefind.cn:chenpangpang/transformers.git" did not exist on "94306352f489c7c2a8dc18af89e2efe0a76a5159"
[LED] fix global_attention_mask not being passed for generation and docs clarification about grad checkpointing (#17112)

* [LED] fixed global_attention_mask not passed for generation + docs clarification for gradient checkpointing
* [LED] docs clarification
* [LED] gradient_checkpointing=True should be passed to TrainingArguments
* [LED] docs: remove wrong word
* [LED] docs: fix typo

Co-authored-by: Patrick von Platen <patrick.v.platen@gmail.com>
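For context, a minimal sketch of the usage this change enables (checkpoint name, input text, and generation settings are illustrative, not taken from the commit): with the fix, a `global_attention_mask` passed to `generate()` is forwarded to the LED model at every decoding step instead of being silently dropped.

```python
import torch
from transformers import LEDForConditionalGeneration, LEDTokenizer

# Illustrative checkpoint; any LED seq2seq checkpoint works the same way.
tokenizer = LEDTokenizer.from_pretrained("allenai/led-base-16384")
model = LEDForConditionalGeneration.from_pretrained("allenai/led-base-16384")

inputs = tokenizer("A very long document to summarize ...", return_tensors="pt")

# Put global attention on the first token (<s>), as commonly done for LED summarization.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

# Before this fix the kwarg was not propagated during generation; now it reaches
# the encoder on every call.
summary_ids = model.generate(
    inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    global_attention_mask=global_attention_mask,
    max_length=128,
)
print(tokenizer.batch_decode(summary_ids, skip_special_tokens=True))
```

The docs clarification amounts to enabling gradient checkpointing through the training arguments rather than toggling the model config by hand; a sketch (output directory and batch size are illustrative):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./led-finetune",        # illustrative path
    gradient_checkpointing=True,        # trade extra compute for lower memory on long inputs
    per_device_train_batch_size=1,
)
```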