ModelZoo / ResNet50_tensorflow · Commits · 83076c16

Commit 83076c16, authored Dec 18, 2019 by Hongkun Yu; committed by A. Unique TensorFlower, Dec 18, 2019.

Fix bert readme. Always use keras_bert path.

PiperOrigin-RevId: 286327318

Parent: 1722b691

Changes: 1 changed file, with 7 additions and 7 deletions.

official/nlp/bert/README.md (+7 −7)
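The change is mechanical: every `tf_20` checkpoint prefix in the README becomes `keras_bert`. A minimal sketch of reproducing the substitution with `sed`, applied here to a throwaway demo file (in the repo it would target `official/nlp/bert/README.md`):

```shell
# Sketch of the substitution this commit performs, run against a demo
# file rather than the real README.
readme=$(mktemp)
printf 'export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/tf_20/uncased_L-24_H-1024_A-16\n' > "$readme"

# Rewrite the old tf_20 prefix to keras_bert everywhere in the file.
sed -i 's|bert/tf_20|bert/keras_bert|g' "$readme"

cat "$readme"
```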
````diff
@@ -141,7 +141,7 @@ and unpack it to some directory `$GLUE_DIR`.
 ```shell
 export GLUE_DIR=~/glue
-export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/tf_20/uncased_L-24_H-1024_A-16
+export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/keras_bert/uncased_L-24_H-1024_A-16
 export TASK_NAME=MNLI
 export OUTPUT_DIR=gs://some_bucket/datasets
````
````diff
@@ -172,7 +172,7 @@ The necessary files can be found here:
 ```shell
 export SQUAD_DIR=~/squad
 export SQUAD_VERSION=v1.1
-export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/tf_20/uncased_L-24_H-1024_A-16
+export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/keras_bert/uncased_L-24_H-1024_A-16
 export OUTPUT_DIR=gs://some_bucket/datasets
 python create_finetuning_data.py \
````
````diff
@@ -190,7 +190,7 @@ python create_finetuning_data.py \
 *   Cloud Storage
 The unzipped pre-trained model files can also be found in the Google Cloud
-Storage folder `gs://cloud-tpu-checkpoints/bert/tf_20`. For example:
+Storage folder `gs://cloud-tpu-checkpoints/bert/keras_bert`. For example:
 ```shell
 export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/keras_bert/uncased_L-24_H-1024_A-16
````
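The Cloud Storage hunk above moves the published folder to `gs://cloud-tpu-checkpoints/bert/keras_bert`. A minimal sketch of composing asset paths under that folder, assuming the conventional BERT checkpoint file names (`bert_config.json`, `vocab.txt`), which are not listed in this diff:

```python
import posixpath

# GCS object names always use forward slashes, so posixpath.join is a
# safer choice than os.path.join on Windows.
BERT_BASE_DIR = "gs://cloud-tpu-checkpoints/bert/keras_bert/uncased_L-24_H-1024_A-16"

# Assumed conventional file names inside a BERT checkpoint folder.
config_path = posixpath.join(BERT_BASE_DIR, "bert_config.json")
vocab_path = posixpath.join(BERT_BASE_DIR, "vocab.txt")

print(config_path)
print(vocab_path)
```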
````diff
@@ -221,7 +221,7 @@ For GPU memory of 16GB or smaller, you may try to use `BERT-Base`
 (uncased_L-12_H-768_A-12).
 ```shell
-export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/tf_20/uncased_L-24_H-1024_A-16
+export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/keras_bert/uncased_L-24_H-1024_A-16
 export MODEL_DIR=gs://some_bucket/my_output_dir
 export GLUE_DIR=gs://some_bucket/datasets
 export TASK=MRPC
````
````diff
@@ -246,7 +246,7 @@ To use TPU, you only need to switch distribution strategy type to `tpu` with TPU
 information and use remote storage for model checkpoints.
 ```shell
-export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/tf_20/uncased_L-24_H-1024_A-16
+export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/keras_bert/uncased_L-24_H-1024_A-16
 export TPU_IP_ADDRESS='???'
 export MODEL_DIR=gs://some_bucket/my_output_dir
 export GLUE_DIR=gs://some_bucket/datasets
````
````diff
@@ -278,7 +278,7 @@ For GPU memory of 16GB or smaller, you may try to use `BERT-Base`
 (uncased_L-12_H-768_A-12).
 ```shell
-export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/tf_20/uncased_L-24_H-1024_A-16
+export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/keras_bert/uncased_L-24_H-1024_A-16
 export SQUAD_DIR=gs://some_bucket/datasets
 export MODEL_DIR=gs://some_bucket/my_output_dir
 export SQUAD_VERSION=v1.1
````
````diff
@@ -302,7 +302,7 @@ To use TPU, you need switch distribution strategy type to `tpu` with TPU
 information.
 ```shell
-export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/tf_20/uncased_L-24_H-1024_A-16
+export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/keras_bert/uncased_L-24_H-1024_A-16
 export TPU_IP_ADDRESS='???'
 export MODEL_DIR=gs://some_bucket/my_output_dir
 export SQUAD_DIR=gs://some_bucket/datasets
````
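The TPU hunks only touch `BERT_BASE_DIR`; the surrounding README context says TPU use is just a matter of switching the distribution strategy to `tpu` and pointing at remote storage. A hedged sketch of what such an invocation might look like — the script name and every flag below are assumptions based on that description, not taken from this diff, so treat it as a config fragment to adapt:

```shell
# Hypothetical invocation; flag names are assumptions inferred from the
# README's "switch distribution strategy type to `tpu`" wording.
export BERT_BASE_DIR=gs://cloud-tpu-checkpoints/bert/keras_bert/uncased_L-24_H-1024_A-16
export TPU_IP_ADDRESS='???'   # left elided in the diff; fill in your TPU address
export MODEL_DIR=gs://some_bucket/my_output_dir

python run_squad.py \
  --distribution_strategy=tpu \
  --tpu="${TPU_IP_ADDRESS}" \
  --bert_config_file="${BERT_BASE_DIR}/bert_config.json" \
  --model_dir="${MODEL_DIR}"
```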