"vscode:/vscode.git/clone" did not exist on "a5a06a851e1da79138e53978aa079a093f243dde"
Unverified commit 0a757176 authored by Steven Liu, committed by GitHub

Fix task guide formatting (#21409)

fix formatting
parent a6d8a149
@@ -193,6 +193,7 @@ Your `compute_metrics` function is ready to go now, and you'll return to it when
If you aren't familiar with finetuning a model with the [`Trainer`], take a look at the basic tutorial [here](../training#train-with-pytorch-trainer)!
</Tip>
You're ready to start training your model now! Load Wav2Vec2 with [`AutoModelForAudioClassification`] along with the number of expected labels, and the label mappings:
```py
...
```
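The code for this step is collapsed in the diff, but the label mappings it mentions are plain dictionaries in both directions. A minimal sketch, with placeholder class names (the guide derives the real ones from the dataset), and the model-loading call shown commented out since it needs the `transformers` library and a network connection:

```python
# Placeholder label names for illustration only; the guide builds these
# from the dataset's actual class labels.
labels = ["class_a", "class_b", "class_c"]

# Map ids to label names and back; string keys match the config format.
id2label = {str(i): label for i, label in enumerate(labels)}
label2id = {label: str(i) for i, label in enumerate(labels)}

# Sketch of the loading call (checkpoint name is an assumption):
# from transformers import AutoModelForAudioClassification
# model = AutoModelForAudioClassification.from_pretrained(
#     "facebook/wav2vec2-base",
#     num_labels=len(labels),
#     label2id=label2id,
#     id2label=id2label,
# )
```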
@@ -209,6 +209,7 @@ Use the end-of-sequence token as the padding token and set `mlm=False`. This wil
If you aren't familiar with finetuning a model with the [`Trainer`], take a look at the [basic tutorial](../training#train-with-pytorch-trainer)!
</Tip>
You're ready to start training your model now! Load DistilGPT2 with [`AutoModelForCausalLM`]:
```py
...
```
@@ -203,6 +203,7 @@ Use the end-of-sequence token as the padding token and specify `mlm_probability`
If you aren't familiar with finetuning a model with the [`Trainer`], take a look at the basic tutorial [here](../training#train-with-pytorch-trainer)!
</Tip>
You're ready to start training your model now! Load DistilRoBERTa with [`AutoModelForMaskedLM`]:
```py
...
```
@@ -241,6 +241,7 @@ Your `compute_metrics` function is ready to go now, and you'll return to it when
If you aren't familiar with finetuning a model with the [`Trainer`], take a look at the basic tutorial [here](../training#train-with-pytorch-trainer)!
</Tip>
You're ready to start training your model now! Load BERT with [`AutoModelForMultipleChoice`]:
```py
...
```
@@ -196,6 +196,7 @@ Now create a batch of examples using [`DefaultDataCollator`]. Unlike other data
If you aren't familiar with finetuning a model with the [`Trainer`], take a look at the basic tutorial [here](../training#train-with-pytorch-trainer)!
</Tip>
You're ready to start training your model now! Load DistilBERT with [`AutoModelForQuestionAnswering`]:
```py
...
```
@@ -155,6 +155,7 @@ Before you start training your model, create a map of the expected ids to their
If you aren't familiar with finetuning a model with the [`Trainer`], take a look at the basic tutorial [here](../training#train-with-pytorch-trainer)!
</Tip>
You're ready to start training your model now! Load DistilBERT with [`AutoModelForSequenceClassification`] along with the number of expected labels, and the label mappings:
```py
...
```
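The collapsed code here follows the same pattern: two small dictionaries mapping ids to label names and back, passed to the model at load time. A minimal sketch for a two-class sentiment task (the label names are illustrative; the loading call is commented out because it needs `transformers` and a network connection):

```python
# Illustrative two-class mapping; the guide's dataset defines the real labels.
id2label = {0: "NEGATIVE", 1: "POSITIVE"}
label2id = {"NEGATIVE": 0, "POSITIVE": 1}

# Sketch of the loading call (checkpoint name is an assumption):
# from transformers import AutoModelForSequenceClassification
# model = AutoModelForSequenceClassification.from_pretrained(
#     "distilbert-base-uncased",
#     num_labels=2,
#     id2label=id2label,
#     label2id=label2id,
# )
```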
@@ -176,6 +176,7 @@ Your `compute_metrics` function is ready to go now, and you'll return to it when
If you aren't familiar with finetuning a model with the [`Trainer`], take a look at the basic tutorial [here](../training#train-with-pytorch-trainer)!
</Tip>
You're ready to start training your model now! Load T5 with [`AutoModelForSeq2SeqLM`]:
```py
...
```
@@ -261,6 +261,7 @@ Before you start training your model, create a map of the expected ids to their
If you aren't familiar with finetuning a model with the [`Trainer`], take a look at the basic tutorial [here](../training#train-with-pytorch-trainer)!
</Tip>
You're ready to start training your model now! Load DistilBERT with [`AutoModelForTokenClassification`] along with the number of expected labels, and the label mappings:
```py
...
```
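For token classification the mapping covers one entry per tag rather than per sequence label. A minimal sketch with an illustrative IOB-style tag set (the guide derives the real tags from its dataset; the loading call is commented out since it needs `transformers` and a network connection):

```python
# Illustrative IOB-style tag set; not the guide's actual labels.
labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC"]

# Map tag ids to tag names and back.
id2label = {i: tag for i, tag in enumerate(labels)}
label2id = {tag: i for i, tag in enumerate(labels)}

# Sketch of the loading call (checkpoint name is an assumption):
# from transformers import AutoModelForTokenClassification
# model = AutoModelForTokenClassification.from_pretrained(
#     "distilbert-base-uncased",
#     num_labels=len(labels),
#     id2label=id2label,
#     label2id=label2id,
# )
```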
@@ -185,6 +185,7 @@ Your `compute_metrics` function is ready to go now, and you'll return to it when
If you aren't familiar with finetuning a model with the [`Trainer`], take a look at the basic tutorial [here](../training#train-with-pytorch-trainer)!
</Tip>
You're ready to start training your model now! Load T5 with [`AutoModelForSeq2SeqLM`]:
```py
...
```