chenpangpang / transformers
Commit 5dfd1906 (unverified) · authored Nov 12, 2018 by Thomas Wolf, committed by GitHub on Nov 12, 2018
fix typo in readme
parent fa1aa81f
Showing 1 changed file with 2 additions and 2 deletions.
README.md  +2 −2
@@ -210,12 +210,12 @@ For example, fine-tuning BERT-large on SQuAD can be done on a server with 4 k-80
 ```bash
 {"exact_match": 84.56953642384106, "f1": 91.04028647786927}
 ```
-To get these results that we used a combination of:
+To get these results we used a combination of:
 - multi-GPU training (automatically activated on a multi-GPU server),
 - 2 steps of gradient accumulation and
 - perform the optimization step on CPU to store Adam's averages in RAM.
-Here are the full list of hyper-parameters we used for this run:
+Here are the full list of hyper-parameters for this run:
 ```bash
 python ./run_squad.py \
   --vocab_file $BERT_LARGE_DIR/vocab.txt \
...
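
The changed lines sit in the README's SQuAD section, which credits the reported result to multi-GPU training, 2 steps of gradient accumulation, and running the optimization step on CPU so that Adam's moment averages stay in RAM. As a loose illustration of the last two points only (a minimal sketch, not the repository's actual run_squad.py logic), here is a PyTorch-style loop; `model`, `loss_fn`, and `train_loader` are hypothetical stand-ins.

```python
import torch
from torch import nn

# Hypothetical stand-ins: the real script builds BERT-large and a SQuAD dataloader.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Linear(768, 2).to(device)
loss_fn = nn.CrossEntropyLoss()
train_loader = [(torch.randn(8, 768, device=device),
                 torch.randint(0, 2, (8,), device=device)) for _ in range(16)]

gradient_accumulation_steps = 2

# Keep a CPU copy of the parameters so Adam's moment buffers are allocated in RAM,
# not in GPU memory.
cpu_params = [p.detach().to("cpu").clone().requires_grad_(True) for p in model.parameters()]
optimizer = torch.optim.Adam(cpu_params, lr=3e-5)

model.train()
for step, (inputs, labels) in enumerate(train_loader):
    loss = loss_fn(model(inputs), labels)
    # Scale the loss so the accumulated gradient averages over the micro-batches.
    (loss / gradient_accumulation_steps).backward()

    if (step + 1) % gradient_accumulation_steps == 0:
        # Move the accumulated gradients to the CPU copies and run Adam there.
        for cpu_p, dev_p in zip(cpu_params, model.parameters()):
            cpu_p.grad = dev_p.grad.detach().to("cpu")
        optimizer.step()
        optimizer.zero_grad()
        # Copy the updated weights back onto the device and reset device gradients.
        with torch.no_grad():
            for cpu_p, dev_p in zip(cpu_params, model.parameters()):
                dev_p.copy_(cpu_p)
        model.zero_grad()
```

The trade-off this sketch illustrates: keeping the optimizer state on the host costs extra host-device copies per update, but frees enough GPU memory to make BERT-large fine-tuning feasible on smaller cards.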