Commit e1b2949a in chenpangpang/transformers
Authored Oct 03, 2019 by drc10723; committed by Victor SANH, Oct 03, 2019
Parent: e2ae9c0b

DistilBert Documentation Code Example fixes
Showing 2 changed files, with 3 additions and 5 deletions (+3 / -5):

    transformers/modeling_distilbert.py      +1 / -1
    transformers/modeling_tf_distilbert.py   +2 / -4
transformers/modeling_distilbert.py

@@ -649,7 +649,7 @@ class DistilBertForQuestionAnswering(DistilBertPreTrainedModel):
         start_positions = torch.tensor([1])
         end_positions = torch.tensor([3])
         outputs = model(input_ids, start_positions=start_positions, end_positions=end_positions)
-        loss, start_scores, end_scores = outputs[:2]
+        loss, start_scores, end_scores = outputs[:3]
     """
     def __init__(self, config):
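Why the change: when start_positions and end_positions are supplied, the PyTorch question-answering head prepends the loss to its output tuple, so unpacking three names needs a three-element slice. A minimal runnable sketch of the corrected example (toy label positions; the 2019-era tuple-returning API is assumed):

    import torch
    from transformers import DistilBertTokenizer, DistilBertForQuestionAnswering

    tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased')
    model = DistilBertForQuestionAnswering.from_pretrained('distilbert-base-uncased')

    input_ids = torch.tensor(tokenizer.encode("Hello, my dog is cute")).unsqueeze(0)  # batch size 1
    start_positions = torch.tensor([1])  # toy gold start index
    end_positions = torch.tensor([3])    # toy gold end index

    # With labels passed, the output tuple is (loss, start_logits, end_logits, ...),
    # hence the three-element slice:
    outputs = model(input_ids, start_positions=start_positions, end_positions=end_positions)
    loss, start_scores, end_scores = outputs[:3]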
transformers/modeling_tf_distilbert.py

@@ -603,7 +603,7 @@ class TFDistilBertForMaskedLM(TFDistilBertPreTrainedModel):
         tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased')
         model = TFDistilBertForMaskedLM.from_pretrained('distilbert-base-uncased')
         input_ids = tf.constant(tokenizer.encode("Hello, my dog is cute"))[None, :]  # Batch size 1
-        outputs = model(input_ids, masked_lm_labels=input_ids)
+        outputs = model(input_ids)
         prediction_scores = outputs[0]
     """
@@ -715,9 +715,7 @@ class TFDistilBertForQuestionAnswering(TFDistilBertPreTrainedModel):
         tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased')
         model = TFDistilBertForQuestionAnswering.from_pretrained('distilbert-base-uncased')
         input_ids = tf.constant(tokenizer.encode("Hello, my dog is cute"))[None, :]  # Batch size 1
-        start_positions = tf.constant([1])
-        end_positions = tf.constant([3])
-        outputs = model(input_ids, start_positions=start_positions, end_positions=end_positions)
+        outputs = model(input_ids)
         start_scores, end_scores = outputs[:2]
     """
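Why the change: unlike its PyTorch counterpart, the TF question-answering head accepted no start_positions/end_positions and computed no loss, so the label tensors are removed and outputs[:2] correctly yields the two span-logit tensors. A sketch (same assumptions):

    import tensorflow as tf
    from transformers import DistilBertTokenizer, TFDistilBertForQuestionAnswering

    tokenizer = DistilBertTokenizer.from_pretrained('distilbert-base-uncased')
    model = TFDistilBertForQuestionAnswering.from_pretrained('distilbert-base-uncased')

    input_ids = tf.constant(tokenizer.encode("Hello, my dog is cute"))[None, :]  # batch size 1

    # No label arguments, hence no loss; the first two outputs are the span logits:
    outputs = model(input_ids)
    start_scores, end_scores = outputs[:2]  # each of shape (1, sequence_length)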