Unverified Commit 8ac29fe0 authored by amyeroberts, committed by GitHub

Fix doc links (#22274)

parent da005253
@@ -403,9 +403,9 @@ Configure the model for training with `compile()`:
>>> model.compile(optimizer=optimizer, loss=loss)
```
-To compute the accuracy from the predictions and push your model to the 🤗 Hub, use [Keras callbacks](./main_classes/keras_callbacks).
-Pass your `compute_metrics` function to [KerasMetricCallback](./main_classes/keras_callbacks#transformers.KerasMetricCallback),
-and use the [PushToHubCallback](./main_classes/keras_callbacks#transformers.PushToHubCallback) to upload the model:
+To compute the accuracy from the predictions and push your model to the 🤗 Hub, use [Keras callbacks](../main_classes/keras_callbacks).
+Pass your `compute_metrics` function to [KerasMetricCallback](../main_classes/keras_callbacks#transformers.KerasMetricCallback),
+and use the [PushToHubCallback](../main_classes/keras_callbacks#transformers.PushToHubCallback) to upload the model:
```py
>>> from transformers.keras_callbacks import KerasMetricCallback, PushToHubCallback
...
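For orientation, the callback setup described in the hunk above looks roughly like the following sketch. It is illustrative only, not part of this diff: `compute_metrics`, `tokenizer`, `model`, and the `tf_train_set`/`tf_validation_set` datasets are assumed to have been defined earlier in the guide being patched, and the output directory name is a placeholder.

```py
from transformers.keras_callbacks import KerasMetricCallback, PushToHubCallback

# Run compute_metrics on the validation set at the end of each epoch
metric_callback = KerasMetricCallback(metric_fn=compute_metrics, eval_dataset=tf_validation_set)

# Upload the model and tokenizer to the 🤗 Hub as training progresses
# (output_dir is a placeholder repository name)
push_to_hub_callback = PushToHubCallback(output_dir="my_awesome_model", tokenizer=tokenizer)

model.fit(
    x=tf_train_set,
    validation_data=tf_validation_set,
    epochs=3,
    callbacks=[metric_callback, push_to_hub_callback],
)
```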
@@ -341,7 +341,7 @@ Configure the model for training with [`compile`](https://keras.io/api/models/mo
>>> model.compile(optimizer=optimizer)
```
-The last two things to setup before you start training is to compute the accuracy from the predictions, and provide a way to push your model to the Hub. Both are done by using [Keras callbacks](./main_classes/keras_callbacks).
+The last two things to setup before you start training is to compute the accuracy from the predictions, and provide a way to push your model to the Hub. Both are done by using [Keras callbacks](../main_classes/keras_callbacks).
Pass your `compute_metrics` function to [`~transformers.KerasMetricCallback`]:
...
@@ -412,7 +412,7 @@ Convert your datasets to the `tf.data.Dataset` format using the [`~datasets.Data
... )
```
-To compute the accuracy from the predictions and push your model to the 🤗 Hub, use [Keras callbacks](./main_classes/keras_callbacks).
+To compute the accuracy from the predictions and push your model to the 🤗 Hub, use [Keras callbacks](../main_classes/keras_callbacks).
Pass your `compute_metrics` function to [`KerasMetricCallback`],
and use the [`PushToHubCallback`] to upload the model:
...
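The `compute_metrics` function referenced in these hunks is a small function that `KerasMetricCallback` calls with `(predictions, labels)` and that returns a dict of named metric values. A minimal sketch for the accuracy case, using the 🤗 Evaluate library (an assumption; the underlying guide may define it differently):

```py
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred is (predictions, labels); return a dict of metric name -> value
    predictions, labels = eval_pred
    predictions = np.argmax(predictions, axis=1)
    return accuracy.compute(predictions=predictions, references=labels)
```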
@@ -267,7 +267,7 @@ Configure the model for training with [`compile`](https://keras.io/api/models/mo
>>> model.compile(optimizer=optimizer)
```
-The last two things to setup before you start training is to compute the accuracy from the predictions, and provide a way to push your model to the Hub. Both are done by using [Keras callbacks](./main_classes/keras_callbacks).
+The last two things to setup before you start training is to compute the accuracy from the predictions, and provide a way to push your model to the Hub. Both are done by using [Keras callbacks](../main_classes/keras_callbacks).
Pass your `compute_metrics` function to [`~transformers.KerasMetricCallback`]:
...
@@ -275,7 +275,7 @@ Configure the model for training with [`compile`](https://keras.io/api/models/mo
>>> model.compile(optimizer=optimizer)
```
-The last two things to setup before you start training is to compute the ROUGE score from the predictions, and provide a way to push your model to the Hub. Both are done by using [Keras callbacks](./main_classes/keras_callbacks).
+The last two things to setup before you start training is to compute the ROUGE score from the predictions, and provide a way to push your model to the Hub. Both are done by using [Keras callbacks](../main_classes/keras_callbacks).
Pass your `compute_metrics` function to [`~transformers.KerasMetricCallback`]:
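For the ROUGE case in the hunk above, `compute_metrics` also has to decode token ids back to text before scoring. A hedged sketch, assuming the callback hands the metric function generated token ids rather than logits (e.g. via `KerasMetricCallback`'s `predict_with_generate` option) and that `tokenizer` is the summarization tokenizer from the guide:

```py
import numpy as np
import evaluate

rouge = evaluate.load("rouge")

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    decoded_preds = tokenizer.batch_decode(predictions, skip_special_tokens=True)
    # Replace -100 (ignored label positions) with the pad token id so they can be decoded
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)
    return rouge.compute(predictions=decoded_preds, references=decoded_labels, use_stemmer=True)
```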
@@ -354,7 +354,7 @@ Tokenize the text and return the `input_ids` as PyTorch tensors:
>>> inputs = tokenizer(text, return_tensors="pt").input_ids
```
-Use the [`~transformers.generation_utils.GenerationMixin.generate`] method to create the summarization. For more details about the different text generation strategies and parameters for controlling generation, check out the [Text Generation](./main_classes/text_generation) API.
+Use the [`~transformers.generation_utils.GenerationMixin.generate`] method to create the summarization. For more details about the different text generation strategies and parameters for controlling generation, check out the [Text Generation](../main_classes/text_generation) API.
```py
>>> from transformers import AutoModelForSeq2SeqLM
@@ -380,7 +380,7 @@ Tokenize the text and return the `input_ids` as TensorFlow tensors:
>>> inputs = tokenizer(text, return_tensors="tf").input_ids
```
-Use the [`~transformers.generation_tf_utils.TFGenerationMixin.generate`] method to create the summarization. For more details about the different text generation strategies and parameters for controlling generation, check out the [Text Generation](./main_classes/text_generation) API.
+Use the [`~transformers.generation_tf_utils.TFGenerationMixin.generate`] method to create the summarization. For more details about the different text generation strategies and parameters for controlling generation, check out the [Text Generation](../main_classes/text_generation) API.
```py
>>> from transformers import TFAutoModelForSeq2SeqLM
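Both summarization hunks above point to the Text Generation API page for `generate`'s parameters. As a rough PyTorch illustration (the checkpoint name and input text below are placeholders, not taken from this diff):

```py
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder for whatever checkpoint the guide fine-tuned above
checkpoint = "my_awesome_summarization_model"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

text = "summarize: A long article to condense goes here."  # placeholder input
inputs = tokenizer(text, return_tensors="pt").input_ids

# max_new_tokens and do_sample are two of the generation parameters documented
# on the Text Generation API page linked in the docs above
outputs = model.generate(inputs, max_new_tokens=100, do_sample=False)
summary = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(summary)
```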
@@ -369,7 +369,7 @@ Configure the model for training with [`compile`](https://keras.io/api/models/mo
>>> model.compile(optimizer=optimizer)
```
-The last two things to setup before you start training is to compute the seqeval scores from the predictions, and provide a way to push your model to the Hub. Both are done by using [Keras callbacks](./main_classes/keras_callbacks).
+The last two things to setup before you start training is to compute the seqeval scores from the predictions, and provide a way to push your model to the Hub. Both are done by using [Keras callbacks](../main_classes/keras_callbacks).
Pass your `compute_metrics` function to [`~transformers.KerasMetricCallback`]:
...
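For the seqeval case above, the metric function additionally maps predicted ids back to string tags and drops the positions labelled -100. A sketch under the assumption that `label_list` holds the string names of the token-classification labels:

```py
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    predictions = np.argmax(predictions, axis=2)

    # Keep only positions with a real label (ignore -100) and convert ids to tag strings
    true_predictions = [
        [label_list[p] for (p, l) in zip(prediction, label) if l != -100]
        for prediction, label in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for (p, l) in zip(prediction, label) if l != -100]
        for prediction, label in zip(predictions, labels)
    ]

    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```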
@@ -284,7 +284,7 @@ Configure the model for training with [`compile`](https://keras.io/api/models/mo
>>> model.compile(optimizer=optimizer)
```
-The last two things to setup before you start training is to compute the SacreBLEU metric from the predictions, and provide a way to push your model to the Hub. Both are done by using [Keras callbacks](./main_classes/keras_callbacks).
+The last two things to setup before you start training is to compute the SacreBLEU metric from the predictions, and provide a way to push your model to the Hub. Both are done by using [Keras callbacks](../main_classes/keras_callbacks).
Pass your `compute_metrics` function to [`~transformers.KerasMetricCallback`]:
@@ -362,7 +362,7 @@ Tokenize the text and return the `input_ids` as PyTorch tensors:
>>> inputs = tokenizer(text, return_tensors="pt").input_ids
```
-Use the [`~transformers.generation_utils.GenerationMixin.generate`] method to create the translation. For more details about the different text generation strategies and parameters for controlling generation, check out the [Text Generation](./main_classes/text_generation) API.
+Use the [`~transformers.generation_utils.GenerationMixin.generate`] method to create the translation. For more details about the different text generation strategies and parameters for controlling generation, check out the [Text Generation](../main_classes/text_generation) API.
```py
>>> from transformers import AutoModelForSeq2SeqLM
@@ -388,7 +388,7 @@ Tokenize the text and return the `input_ids` as TensorFlow tensors:
>>> inputs = tokenizer(text, return_tensors="tf").input_ids
```
-Use the [`~transformers.generation_tf_utils.TFGenerationMixin.generate`] method to create the translation. For more details about the different text generation strategies and parameters for controlling generation, check out the [Text Generation](./main_classes/text_generation) API.
+Use the [`~transformers.generation_tf_utils.TFGenerationMixin.generate`] method to create the translation. For more details about the different text generation strategies and parameters for controlling generation, check out the [Text Generation](../main_classes/text_generation) API.
```py
>>> from transformers import TFAutoModelForSeq2SeqLM
...
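On the TensorFlow side, the translation `generate` call described in these last hunks is roughly the following sketch (again, the checkpoint name and input text are placeholders):

```py
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

# Placeholder for the translation checkpoint fine-tuned in the guide
checkpoint = "my_awesome_translation_model"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = TFAutoModelForSeq2SeqLM.from_pretrained(checkpoint)

text = "translate English to French: Machine learning is fun."  # placeholder input
inputs = tokenizer(text, return_tensors="tf").input_ids

# Beam search is one of the decoding strategies covered on the Text Generation API page
outputs = model.generate(inputs, max_new_tokens=40, num_beams=4, early_stopping=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```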