Unverified Commit 71963a66 authored by Daniele Sartiano, committed by GitHub

fix typo in modeling_encoder_decoder.py (#9297)



* Update modeling_encoder_decoder.py

Fixed typo.

* typo
Co-authored-by: Suraj Patil <surajp815@gmail.com>
parent f3a3b91d
@@ -30,7 +30,7 @@ logger = logging.get_logger(__name__)
 _CONFIG_FOR_DOC = "EncoderDecoderConfig"
 
 ENCODER_DECODER_START_DOCSTRING = r"""
-    This class can be used to initialize a sequence-tsequencece model with any pretrained autoencoding model as the
+    This class can be used to initialize a sequence-to-sequence model with any pretrained autoencoding model as the
     encoder and any pretrained autoregressive model as the decoder. The encoder is loaded via
     :meth:`~transformers.AutoModel.from_pretrained` function and the decoder is loaded via
     :meth:`~transformers.AutoModelForCausalLM.from_pretrained` function. Cross-attention layers are automatically added
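
For context on the docstring being fixed: below is a minimal usage sketch of the initialization pattern it describes, assuming the public transformers EncoderDecoderModel API. The checkpoint name "bert-base-uncased" and the teacher-forced decoder inputs are illustrative assumptions, not part of this commit.

# Minimal sketch (not part of this commit) of the sequence-to-sequence
# initialization the docstring above describes.
from transformers import BertTokenizer, EncoderDecoderModel

# Under the hood, the encoder is loaded via AutoModel.from_pretrained and the
# decoder via AutoModelForCausalLM.from_pretrained; cross-attention layers are
# added to the decoder automatically.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased",  # pretrained autoencoding model used as the encoder
    "bert-base-uncased",  # same checkpoint reused as a causal-LM decoder (illustrative)
)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
inputs = tokenizer("A short input sentence.", return_tensors="pt")

# Teacher-forced forward pass; in real training, decoder_input_ids/labels would
# come from the target sequence rather than echoing the input.
outputs = model(
    input_ids=inputs.input_ids,
    attention_mask=inputs.attention_mask,
    decoder_input_ids=inputs.input_ids,
)
print(outputs.logits.shape)  # (batch_size, target_sequence_length, vocab_size)

Reusing an autoencoding checkpoint as the decoder works because from_encoder_decoder_pretrained configures it as a causal decoder and adds randomly initialized cross-attention weights, which are then fine-tuned on the downstream sequence-to-sequence task.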