chenpangpang/transformers, commit b8697bc6
Authored May 21, 2021 by Sylvain Gugger

    Avoid TensorFlow import in Trainer

Parent: e2c1dd09

Showing 1 changed file with 3 additions and 1 deletion:

src/transformers/modelcard.py (+3, -1)

@@ -40,7 +40,6 @@ from .file_utils import (
     is_tokenizers_available,
     is_torch_available,
 )
-from .models.auto.configuration_auto import ALL_PRETRAINED_CONFIG_ARCHIVE_MAP
 from .training_args import ParallelMode
 from .utils import logging
@@ -145,6 +144,9 @@ class ModelCard:
             modelcard = ModelCard.from_pretrained('bert-base-uncased', output_attentions=True, foo=False)
         """
+        # This imports every model so let's do it dynamically here.
+        from transformers.models.auto.configuration_auto import ALL_PRETRAINED_CONFIG_ARCHIVE_MAP
+
         cache_dir = kwargs.pop("cache_dir", None)
         proxies = kwargs.pop("proxies", None)
         find_from_standard_name = kwargs.pop("find_from_standard_name", True)
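
What the patch does: modelcard.py used to import ALL_PRETRAINED_CONFIG_ARCHIVE_MAP from configuration_auto at module level, and configuration_auto imports every model, which can pull in TensorFlow; since Trainer pulls in modelcard.py, a plain Trainer import could therefore import TensorFlow. Moving the import inside ModelCard.from_pretrained defers that cost until the archive map is actually needed. The snippet below is a minimal, self-contained sketch of the same deferred-import pattern, not code from the repository; the module decimal merely stands in for the heavy dependency.

# deferred_import_sketch.py -- illustrative only; `decimal` stands in for the
# heavy module (configuration_auto / TensorFlow) that the commit stops
# importing at module load time.
import sys

def compute_with_heavy_dependency(value: str) -> str:
    # Deferred import: the heavy module is loaded on the first call, not when
    # this file is imported, so importers of this module stay lightweight.
    from decimal import Decimal
    return str(Decimal(value) + 1)

if __name__ == "__main__":
    print("decimal" in sys.modules)  # expected: False before the first call
    compute_with_heavy_dependency("1.50")
    print("decimal" in sys.modules)  # True: loaded only once actually needed

In the commit itself, the deferred import is the `from transformers.models.auto.configuration_auto import ALL_PRETRAINED_CONFIG_ARCHIVE_MAP` statement placed inside ModelCard.from_pretrained, which is what keeps TensorFlow out of the import path of Trainer.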