chenpangpang / transformers
Commit b59043bf
Authored Jul 16, 2019 by thomwolf

update readme

Parent: edc79acb
Showing 1 changed file with 6 additions and 3 deletions
README.md
@@ -89,15 +89,18 @@ BERT_MODEL_CLASSES = [BertModel, BertForPreTraining, BertForMaskedLM, BertForNex
                       BertForSequenceClassification, BertForMultipleChoice, BertForTokenClassification, BertForQuestionAnswering]
 
-# All the classes for an architecture can be loaded from pretrained weights for this architecture
-# Note that additional weights added for fine-tuning are only initialized and need to be trained on the down-stream task
+# All the classes for an architecture can be initiated from pretrained weights for this architecture
+# Note that additional weights added for fine-tuning are only initialized
+# and need to be trained on the down-stream task
 tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
 for model_class in BERT_MODEL_CLASSES:
     # Load pretrained model/tokenizer
     model = model_class.from_pretrained('bert-base-uncased')
 
 # Models can return full list of hidden-states & attentions weights at each layer
-model = model_class.from_pretrained(pretrained_weights, output_hidden_states=True, output_attentions=True)
+model = model_class.from_pretrained(pretrained_weights,
+                                    output_hidden_states=True,
+                                    output_attentions=True)
 input_ids = torch.tensor([tokenizer.encode("Let's see all hidden-states and attentions on this text")])
 all_hidden_states, all_attentions = model(input_ids)[-2:]
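For reference, a minimal runnable sketch of the updated snippet, assuming the pytorch-transformers package current at the time of this commit (the import path is an assumption; the library was later renamed to transformers), with 'bert-base-uncased' standing in for pretrained_weights and only BertModel shown rather than the full BERT_MODEL_CLASSES loop:

import torch
from pytorch_transformers import BertModel, BertTokenizer  # assumption: pytorch-transformers-era import path

# Assumption: use the 'bert-base-uncased' checkpoint as pretrained_weights
pretrained_weights = 'bert-base-uncased'

tokenizer = BertTokenizer.from_pretrained(pretrained_weights)

# Ask the model to return every layer's hidden-states and attention weights
model = BertModel.from_pretrained(pretrained_weights,
                                  output_hidden_states=True,
                                  output_attentions=True)
model.eval()

input_ids = torch.tensor([tokenizer.encode("Let's see all hidden-states and attentions on this text")])
with torch.no_grad():
    # The last two elements of the output tuple are the hidden-states and attentions
    all_hidden_states, all_attentions = model(input_ids)[-2:]

print(len(all_hidden_states))  # embedding output plus one hidden-state tensor per layer
print(len(all_attentions))     # one attention tensor per layer

The extra keyword arguments passed to from_pretrained are forwarded to the model configuration, which is why the additional hidden-state and attention outputs appear at the end of the tuple returned by the forward pass.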