chenpangpang / transformers · Commits · fc74c845

Unverified commit fc74c845, authored Dec 13, 2021 by Lucien, committed by GitHub on Dec 13, 2021
Swap TF and PT code inside two blocks (#14742)
parent 8362d07d
Showing 1 changed file with 6 additions and 6 deletions.
docs/source/quicktour.mdx
@@ -334,21 +334,21 @@ PyTorch and TensorFlow: any model saved as before can be loaded back either in P
 If you would like to load your saved model in the other framework, first make sure it is installed:
 ```bash
-pip install tensorflow
-===PT-TF-SPLIT===
 pip install torch
+===PT-TF-SPLIT===
+pip install tensorflow
 ```
 Then, use the corresponding Auto class to load it like this:
 ```py
->>> from transformers import TFAutoModel
->>> tokenizer = AutoTokenizer.from_pretrained(pt_save_directory)
->>> tf_model = TFAutoModel.from_pretrained(pt_save_directory, from_pt=True)
-===PT-TF-SPLIT===
 >>> from transformers import AutoModel
 >>> tokenizer = AutoTokenizer.from_pretrained(tf_save_directory)
 >>> pt_model = AutoModel.from_pretrained(tf_save_directory, from_tf=True)
+===PT-TF-SPLIT===
+>>> from transformers import TFAutoModel
+>>> tokenizer = AutoTokenizer.from_pretrained(pt_save_directory)
+>>> tf_model = TFAutoModel.from_pretrained(pt_save_directory, from_pt=True)
 ```
 Lastly, you can also ask the model to return all hidden states and all attention weights if you need them:
...
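The ===PT-TF-SPLIT=== marker in the diff above is the separator the transformers docs use between the PyTorch and TensorFlow variants of a snippet, evidently with the PyTorch variant first; the commit swaps the two halves of each block so the code lands on the correct side. For reference, a minimal sketch of the full round trip the corrected PyTorch-to-TensorFlow snippet assumes: a PyTorch model saved with save_pretrained, then reloaded in TensorFlow. The checkpoint name bert-base-cased and the directory value are illustrative assumptions; only the variable name pt_save_directory comes from the diff.

```py
>>> from transformers import AutoModel, AutoTokenizer, TFAutoModel

>>> # Assumed setup (not part of the commit): a PyTorch model and tokenizer,
>>> # saved locally with save_pretrained as done earlier in the quicktour.
>>> tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
>>> pt_model = AutoModel.from_pretrained("bert-base-cased")
>>> pt_save_directory = "./pt_save_directory"  # illustrative path
>>> tokenizer.save_pretrained(pt_save_directory)
>>> pt_model.save_pretrained(pt_save_directory)

>>> # Reload the same weights in TensorFlow; from_pt=True converts the
>>> # PyTorch checkpoint on the fly.
>>> tokenizer = AutoTokenizer.from_pretrained(pt_save_directory)
>>> tf_model = TFAutoModel.from_pretrained(pt_save_directory, from_pt=True)
```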
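The example elided after the last sentence asks the model to return all hidden states and attention weights. A minimal sketch of what that looks like in PyTorch, assuming an illustrative checkpoint and input sentence (neither is taken from the diff):

```py
>>> from transformers import AutoModel, AutoTokenizer

>>> tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
>>> # Ask for every layer's hidden states and attention weights in the output.
>>> pt_model = AutoModel.from_pretrained(
...     "bert-base-cased", output_hidden_states=True, output_attentions=True
... )
>>> inputs = tokenizer("We are very happy to show you the 🤗 Transformers library.", return_tensors="pt")
>>> outputs = pt_model(**inputs)
>>> all_hidden_states = outputs.hidden_states  # tuple: embedding output + one tensor per layer
>>> all_attentions = outputs.attentions        # tuple: one attention-weight tensor per layer
```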