chenpangpang / transformers
Commit ebaacba3 authored Nov 26, 2018 by thomwolf

fixing typo in docstring

Parent: 870d7163
Showing 1 changed file with 1 addition and 1 deletion.
pytorch_pretrained_bert/modeling.py (+1, -1)

@@ -781,7 +781,7 @@ class BertForNextSentencePrediction(PreTrainedBertModel):
     ...
     # Already been converted into WordPiece token ids
     input_ids = torch.LongTensor([[31, 51, 99], [15, 5, 0]])
     input_mask = torch.LongTensor([[1, 1, 1], [1, 1, 0]])
-    token_type_ids = torch.LongTensor([[0, 0, 1], [0, 2, 0]])
+    token_type_ids = torch.LongTensor([[0, 0, 1], [0, 1, 0]])
     config = BertConfig(vocab_size=32000, hidden_size=512,
         num_hidden_layers=8, num_attention_heads=6, intermediate_size=1024)
     ...
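For context, token_type_ids are the segment ids BERT uses to distinguish sentence A tokens (0) from sentence B tokens (1). Because the token type embedding table has only two entries by default (type_vocab_size=2), the old value of 2 would index past it, which is what this one-character fix corrects. Below is a minimal sketch of running the corrected docstring example end to end; the forward call and its return value are assumptions based on the pytorch_pretrained_bert API around this revision, not part of the diff itself.

import torch
from pytorch_pretrained_bert.modeling import BertConfig, BertForNextSentencePrediction

# Inputs already converted into WordPiece token ids (values from the docstring example)
input_ids = torch.LongTensor([[31, 51, 99], [15, 5, 0]])
input_mask = torch.LongTensor([[1, 1, 1], [1, 1, 0]])
# Segment ids: 0 marks sentence A tokens, 1 marks sentence B tokens;
# the old "2" would index past the 2-entry token type embedding table.
token_type_ids = torch.LongTensor([[0, 0, 1], [0, 1, 0]])

config = BertConfig(vocab_size=32000, hidden_size=512,
                    num_hidden_layers=8, num_attention_heads=6, intermediate_size=1024)

model = BertForNextSentencePrediction(config)
# With no next_sentence_label passed, forward is assumed to return the
# next-sentence classification logits with shape (batch_size, 2).
seq_relationship_logits = model(input_ids, token_type_ids, input_mask)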