ModelZoo / MLPerf_BERT_paddle / Commits
Commit 0971c45e authored Apr 11, 2023 by liangjing
update readme.md
parent 2f18ce20
Showing 1 changed file (README.md) with 10 additions and 0 deletions.
...
...
@@ -11,6 +11,16 @@ BERT-large is a larger, more complex version of the BERT model. Compared with BERT-base,
BERT-large contains 24 Transformer encoder layers with a hidden size of 1024, for a total of 340M parameters. During pre-training, BERT-large uses more unlabeled text data than BERT-base and is optimized with two tasks: Masked Language Model (MLM) and Next Sentence Prediction (NSP). Its pre-training stage is more complex than BERT-base's and takes longer to train.
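As a rough cross-check of the 340M figure, the sketch below tallies the parameters implied by the configuration above. It assumes the standard BERT vocabulary size of 30,522 tokens, 512 positions, and a 4096-dimensional feed-forward layer, none of which this README states explicitly:

```python
def bert_param_count(layers=24, hidden=1024, ffn=4096,
                     vocab=30522, max_pos=512, type_vocab=2):
    # Embeddings: token + position + segment tables, plus one LayerNorm.
    emb = (vocab + max_pos + type_vocab) * hidden + 2 * hidden
    # Self-attention: Q, K, V, and output projections, each with a bias.
    attn = 4 * (hidden * hidden + hidden)
    # Feed-forward: two linear layers with biases.
    ffn_p = hidden * ffn + ffn + ffn * hidden + hidden
    # Two LayerNorms (gamma + beta) per encoder layer.
    ln = 2 * 2 * hidden
    # Pooler head on the [CLS] token.
    pooler = hidden * hidden + hidden
    return emb + layers * (attn + ffn_p + ln) + pooler

print(f"{bert_param_count() / 1e6:.0f}M parameters")  # ~335M, i.e. the usual "340M"
```

The exact total depends on which heads (MLM decoder, NSP classifier) are counted, which is why published figures round to 340M.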
## Target Accuracy
0.72 Mask-LM accuracy
## MLPerf Reference Code Version
Version: v1.0
Original code location: https://github.com/mlcommons/training_results_v1.0/tree/master/NVIDIA/benchmarks/bert/implementations/pytorch
## Dataset
The training data comes from the Wikipedia 2020/01/01 dump, a commonly used natural-language-processing dataset containing Wikipedia articles and their abstracts (i.e., the first paragraph of each article). It can be used for a variety of text tasks such as text classification, text summarization, and named entity recognition.
...
...