>>> from transformers import AutoModelForSeq2SeqLM
>>> model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
```
</pt>
<tf>
Load T5 with [`TFAutoModelForSeq2SeqLM`]:
```py
>>> from transformers import TFAutoModelForSeq2SeqLM
>>> model = TFAutoModelForSeq2SeqLM.from_pretrained("t5-small")
```
</tf>
</frameworkcontent>
Use [`DataCollatorForSeq2Seq`] to create a batch of examples. It will also *dynamically pad* your text and labels to the length of the longest element in its batch, so they are a uniform length. While it is possible to pad your text in the `tokenizer` function by setting `padding=True`, dynamic padding is more efficient.
<frameworkcontent>
<pt>
```py
>>> from transformers import DataCollatorForSeq2Seq

>>> data_collator = DataCollatorForSeq2Seq(tokenizer=tokenizer, model=model)
```
</pt>
<tf>
```py
>>> from transformers import DataCollatorForSeq2Seq

>>> data_collator = DataCollatorForSeq2Seq(tokenizer=tokenizer, model=model, return_tensors="tf")
```
</tf>
</frameworkcontent>
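To make the dynamic padding concrete, here is a minimal sketch that is not part of the original guide. It assumes the `tokenizer` loaded earlier is still in scope and reuses the `data_collator` from whichever framework tab you followed; the input strings and targets are arbitrary placeholders:

```py
>>> # Two examples of different token lengths (arbitrary illustration strings).
>>> features = [
...     {**tokenizer("a short input"), "labels": tokenizer("a short target").input_ids},
...     {**tokenizer("a much longer input sentence that yields several more tokens"),
...      "labels": tokenizer("a somewhat longer target sequence").input_ids},
... ]

>>> # Inputs and labels are padded to the longest sequence in *this* batch,
>>> # not to the model's maximum length.
>>> batch = data_collator(features)
>>> print(batch["input_ids"].shape, batch["labels"].shape)
```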
<frameworkcontent>
<pt>
Load T5 with [`AutoModelForSeq2SeqLM`]:
```py
>>> from transformers import AutoModelForSeq2SeqLM, Seq2SeqTrainingArguments, Seq2SeqTrainer
>>> model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
```
<Tip>

If you aren't familiar with fine-tuning a model with the [`Trainer`], take a look at the basic tutorial [here](../training)!

</Tip>

At this point, only three steps remain:

1. Define your training hyperparameters in [`Seq2SeqTrainingArguments`].
2. Pass the training arguments to [`Seq2SeqTrainer`] along with the model, dataset, tokenizer, and data collator.
3. Call [`~Trainer.train`] to fine-tune your model.
```py
>>> from transformers import Seq2SeqTrainingArguments, Seq2SeqTrainer
>>> training_args = Seq2SeqTrainingArguments(
...     output_dir="./results",
...     evaluation_strategy="epoch",
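...     # NOTE: the guide's remaining hyperparameters are not shown in this excerpt;
...     # the values below are illustrative assumptions, not the guide's exact settings.
...     learning_rate=2e-5,
...     per_device_train_batch_size=16,
...     per_device_eval_batch_size=16,
...     weight_decay=0.01,
...     num_train_epochs=1,
... )

>>> # `tokenized_dataset` is a hypothetical name standing in for the dataset tokenized
>>> # earlier in the guide; substitute the variable name used there.
>>> trainer = Seq2SeqTrainer(
...     model=model,
...     args=training_args,
...     train_dataset=tokenized_dataset["train"],
...     eval_dataset=tokenized_dataset["test"],
...     tokenizer=tokenizer,
...     data_collator=data_collator,
... )

>>> trainer.train()
```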