Unverified commit c385de24 authored by lishukan, committed by GitHub

[TYPO] fix typo/format in quicktour.md (#25519)



* fix the quicktour across all languages

* drop the `!` prefix before bash commands

---------
Co-authored-by: lishukan <lishukan@dxy.cn>
parent eec5841e
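
The formatting fixes follow the pattern visible in the hunks below: the code fences carry explicit language tags (`bash` for the install commands, `py` for the Python snippets), and, per the second bullet above, shell commands drop the notebook-style `!` prefix. A minimal sketch of that `!` change (the removed line is an assumption inferred from the commit message; this flattened view only shows the kept lines):

```diff
-!pip install torch
+pip install torch
```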
@@ -68,11 +68,13 @@ Installieren Sie die folgenden Abhängigkeiten, falls Sie dies nicht bereits get
<frameworkcontent>
<pt>
```bash
pip install torch
```
</pt>
<tf>
```bash
pip install tensorflow
```
@@ -226,6 +228,7 @@ Genau wie die [`pipeline`] akzeptiert der Tokenizer eine Liste von Eingaben. Dar
<frameworkcontent>
<pt>
```py
>>> pt_batch = tokenizer(
... ["We are very happy to show you the 🤗 Transformers library.", "We hope you don't hate it."],
@@ -237,6 +240,7 @@ Genau wie die [`pipeline`] akzeptiert der Tokenizer eine Liste von Eingaben. Dar
```
</pt>
<tf>
```py
>>> tf_batch = tokenizer(
... ["We are very happy to show you the 🤗 Transformers library.", "We hope you don't hate it."],
@@ -375,6 +379,7 @@ Ein besonders cooles 🤗 Transformers-Feature ist die Möglichkeit, ein Modell
<frameworkcontent>
<pt>
```py
>>> from transformers import AutoModel
@@ -383,6 +388,7 @@ Ein besonders cooles 🤗 Transformers-Feature ist die Möglichkeit, ein Modell
```
</pt>
<tf>
```py
>>> from transformers import TFAutoModel
@@ -30,11 +30,13 @@ You'll also need to install your preferred machine learning framework:
<frameworkcontent>
<pt>
```bash
pip install torch
```
</pt>
<tf>
```bash
pip install tensorflow
```
@@ -208,6 +210,7 @@ A tokenizer can also accept a list of inputs, and pad and truncate the text to r
<frameworkcontent>
<pt>
```py
>>> pt_batch = tokenizer(
... ["We are very happy to show you the 🤗 Transformers library.", "We hope you don't hate it."],
@@ -219,6 +222,7 @@ A tokenizer can also accept a list of inputs, and pad and truncate the text to r
```
</pt>
<tf>
```py
>>> tf_batch = tokenizer(
... ["We are very happy to show you the 🤗 Transformers library.", "We hope you don't hate it."],
@@ -352,6 +356,7 @@ One particularly cool 🤗 Transformers feature is the ability to save a model a
<frameworkcontent>
<pt>
```py
>>> from transformers import AutoModel
@@ -360,6 +365,7 @@ One particularly cool 🤗 Transformers feature is the ability to save a model a
```
</pt>
<tf>
```py
>>> from transformers import TFAutoModel
@@ -68,11 +68,13 @@ Instala las siguientes dependencias si aún no lo has hecho:
<frameworkcontent>
<pt>
```bash
pip install torch
```
</pt>
<tf>
```bash
pip install tensorflow
```
@@ -224,6 +226,7 @@ Como con el [`pipeline`], el tokenizador aceptará una lista de inputs. Además,
<frameworkcontent>
<pt>
```py
>>> pt_batch = tokenizer(
... ["We are very happy to show you the 🤗 Transformers library.", "We hope you don't hate it."],
@@ -235,6 +238,7 @@ Como con el [`pipeline`], el tokenizador aceptará una lista de inputs. Además,
```
</pt>
<tf>
```py
>>> tf_batch = tokenizer(
... ["We are very happy to show you the 🤗 Transformers library.", "We hope you don't hate it."],
@@ -377,6 +381,7 @@ Una característica particularmente interesante de 🤗 Transformers es la habil
<frameworkcontent>
<pt>
```py
>>> from transformers import AutoModel
@@ -385,6 +390,7 @@ Una característica particularmente interesante de 🤗 Transformers es la habil
```
</pt>
<tf>
```py
>>> from transformers import TFAutoModel
@@ -30,11 +30,13 @@ Vous aurez aussi besoin d'installer votre bibliothèque d'apprentissage profond
<frameworkcontent>
<pt>
```bash
pip install torch
```
</pt>
<tf>
```bash
pip install tensorflow
```
@@ -203,6 +205,7 @@ Un tokenizer peut également accepter une liste de textes, et remplir et tronque
<frameworkcontent>
<pt>
```py
>>> pt_batch = tokenizer(
... ["We are very happy to show you the 🤗 Transformers library.", "We hope you don't hate it."],
@@ -214,6 +217,7 @@ Un tokenizer peut également accepter une liste de textes, et remplir et tronque
```
</pt>
<tf>
```py
>>> tf_batch = tokenizer(
... ["We are very happy to show you the 🤗 Transformers library.", "We hope you don't hate it."],
@@ -346,6 +350,7 @@ Une fonctionnalité particulièrement cool 🤗 Transformers est la possibilité
<frameworkcontent>
<pt>
```py
>>> from transformers import AutoModel
@@ -354,6 +359,7 @@ Une fonctionnalité particulièrement cool 🤗 Transformers est la possibilité
```
</pt>
<tf>
```py
>>> from transformers import TFAutoModel
@@ -68,11 +68,13 @@ Installa le seguenti dipendenze se non lo hai già fatto:
<frameworkcontent>
<pt>
```bash
pip install torch
```
</pt>
<tf>
```bash
pip install tensorflow
```
@@ -379,6 +381,7 @@ Una caratteristica particolarmente interessante di 🤗 Transformers è la sua a
<frameworkcontent>
<pt>
```py
>>> from transformers import AutoModel
@@ -387,6 +390,7 @@ Una caratteristica particolarmente interessante di 🤗 Transformers è la sua a
```
</pt>
<tf>
```py
>>> from transformers import TFAutoModel
@@ -30,11 +30,13 @@ rendered properly in your Markdown viewer.
<frameworkcontent>
<pt>
```bash
pip install torch
```
</pt>
<tf>
```bash
pip install tensorflow
```
@@ -210,6 +212,7 @@ label: NEGATIVE, with score: 0.5309
<frameworkcontent>
<pt>
```py
>>> pt_batch = tokenizer(
... ["We are very happy to show you the 🤗 Transformers library.", "We hope you don't hate it."],
@@ -221,6 +224,7 @@ label: NEGATIVE, with score: 0.5309
```
</pt>
<tf>
```py
>>> tf_batch = tokenizer(
... ["We are very happy to show you the 🤗 Transformers library.", "We hope you don't hate it."],
@@ -353,6 +357,7 @@ tensor([[0.0021, 0.0018, 0.0115, 0.2121, 0.7725],
<frameworkcontent>
<pt>
```py
>>> from transformers import AutoModel
@@ -361,6 +366,7 @@ tensor([[0.0021, 0.0018, 0.0115, 0.2121, 0.7725],
```
</pt>
<tf>
```py
>>> from transformers import TFAutoModel
@@ -228,6 +228,7 @@ Assim como o [`pipeline`], o tokenizer aceitará uma lista de entradas. Além di
<frameworkcontent>
<pt>
```py
>>> pt_batch = tokenizer(
... ["We are very happy to show you the 🤗 transformers library.", "We hope you don't hate it."],
@@ -239,6 +240,7 @@ Assim como o [`pipeline`], o tokenizer aceitará uma lista de entradas. Além di
```
</pt>
<tf>
```py
>>> tf_batch = tokenizer(
... ["We are very happy to show you the 🤗 Transformers library.", "We hope you don't hate it."],
@@ -377,6 +379,7 @@ Um recurso particularmente interessante dos 🤗 Transformers é a capacidade de
<frameworkcontent>
<pt>
```py
>>> from transformers import AutoModel
@@ -385,6 +388,7 @@ Um recurso particularmente interessante dos 🤗 Transformers é a capacidade de
```
</pt>
<tf>
```py
>>> from transformers import TFAutoModel
@@ -30,11 +30,13 @@ rendered properly in your Markdown viewer.
<frameworkcontent>
<pt>
```bash
pip install torch
```
</pt>
<tf>
```bash
pip install tensorflow
```
@@ -203,6 +205,7 @@ label: NEGATIVE, with score: 0.5309
<frameworkcontent>
<pt>
```py
>>> pt_batch = tokenizer(
... ["We are very happy to show you the 🤗 Transformers library.", "We hope you don't hate it."],
@@ -214,6 +217,7 @@ label: NEGATIVE, with score: 0.5309
```
</pt>
<tf>
```py
>>> tf_batch = tokenizer(
... ["We are very happy to show you the 🤗 Transformers library.", "We hope you don't hate it."],
@@ -347,6 +351,7 @@ tensor([[0.0021, 0.0018, 0.0115, 0.2121, 0.7725],
<frameworkcontent>
<pt>
```py
>>> from transformers import AutoModel
@@ -355,6 +360,7 @@ tensor([[0.0021, 0.0018, 0.0115, 0.2121, 0.7725],
```
</pt>
<tf>
```py
>>> from transformers import TFAutoModel