ModelZoo / ResNet50_tensorflow · Commits

Commit 5c15ce77
Authored Nov 19, 2019 by Hongkun Yu
Committed by A. Unique TensorFlower on Nov 19, 2019

Fix a mistake in previous change

PiperOrigin-RevId: 281409019

Parent: 252e6384
Showing 1 changed file with 2 additions and 4 deletions.

official/nlp/bert/run_pretraining.py (+2, -4)
@@ -59,12 +59,10 @@ def get_pretrain_dataset_fn(input_file_pattern, seq_length,
   """Returns input dataset from input file string."""
   def _dataset_fn(ctx=None):
     """Returns tf.data.Dataset for distributed BERT pretraining."""
-    input_files = []
-    for input_pattern in input_file_pattern.split(','):
-      input_files.extend(tf.io.gfile.glob(input_pattern))
+    input_patterns = input_file_pattern.split(',')
     batch_size = ctx.get_per_replica_batch_size(global_batch_size)
     train_dataset = input_pipeline.create_pretrain_dataset(
-        input_files,
+        input_patterns,
         seq_length,
         max_predictions_per_seq,
         batch_size,
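In short, the fix drops the manual tf.io.gfile.glob expansion in run_pretraining.py and hands the raw comma-separated patterns to input_pipeline.create_pretrain_dataset, which is now expected to resolve them itself. The sketch below only illustrates that style of pattern resolution, assuming glob expansion via tf.data.Dataset.list_files; the helper name make_record_dataset and the example paths are hypothetical and are not part of this repository.

# Illustrative sketch only (not the repository's code). Assumes the
# pipeline expands glob patterns with tf.data.Dataset.list_files, which
# accepts shell-wildcard patterns directly.
import tensorflow as tf

def make_record_dataset(input_patterns, shuffle=True):
  """Hypothetical helper: builds a TFRecord dataset from glob patterns."""
  # list_files matches the glob patterns and optionally shuffles the
  # resulting file names, so no prior tf.io.gfile.glob call is needed.
  files = tf.data.Dataset.list_files(input_patterns, shuffle=shuffle)
  # Interleave reads across the matched files.
  return files.interleave(
      tf.data.TFRecordDataset,
      num_parallel_calls=tf.data.experimental.AUTOTUNE)

# Mirrors what the new code passes down: a list split from a
# comma-separated pattern string.
patterns = 'data/part-0000*,data/part-0001*'.split(',')
dataset = make_record_dataset(patterns)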