"git@developer.sourcefind.cn:chenpangpang/transformers.git" did not exist on "0d97ba8a9846b4856f8920bf0c955dbce91e8b4f"
Unverified Commit 4334095c authored by Mishig Davaadorj, committed by GitHub

Fix typo (#14044)

parent 37c5759c
@@ -623,7 +623,7 @@ SPEECH_TO_TEXT_INPUTS_DOCSTRING = r"""
             :obj:`past_key_values`).
         decoder_attention_mask (:obj:`torch.LongTensor` of shape :obj:`(batch_size, target_sequence_length)`, `optional`):
             Default behavior: generate a tensor that ignores pad tokens in :obj:`decoder_input_ids`. Causal mask will
-            also be used by default. <<<<<<< HEAD
+            also be used by default.

             If you want to change padding behavior, you should read
             :func:`modeling_speech_to_text._prepare_decoder_inputs` and modify to your needs. See diagram 1 in `the
...
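The typo fixed here is a leftover merge-conflict marker (`<<<<<<< HEAD`) that survived into a docstring. As a side note, a minimal sketch of how such stray markers can be caught mechanically (the file path is hypothetical, used only for illustration):

```shell
# Hedged sketch, not part of the original commit: scan a file for the three
# standard conflict-marker styles at the start of a line.
printf 'ok line\n<<<<<<< HEAD\n' > /tmp/sample.txt
# -n prints the line number; -E enables the alternation in the pattern.
grep -nE '^(<<<<<<<|=======|>>>>>>>)' /tmp/sample.txt
# prints: 2:<<<<<<< HEAD
```

In a git checkout, `git diff --check` performs a similar scan on pending changes and flags leftover conflict markers before they are committed.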