chenpangpang / transformers

Commit e99bc87e, authored Mar 05, 2019 by Catalin Voss

Merge branch 'patch-1' into patch-2

Parents: 9775b2eb, c0cf0a04
Changes: 1 changed file with 1 addition and 1 deletion (+1 / -1): examples/run_openai_gpt.py
examples/run_openai_gpt.py @ e99bc87e

```diff
@@ -163,7 +163,7 @@ def main():
     datasets = (train_dataset, eval_dataset)
     encoded_datasets = tokenize_and_encode(datasets)
-    # Compute the mex input length for the Transformer
+    # Compute the max input length for the Transformer
     max_length = model.config.n_positions // 2 - 2
     input_length = max(len(story[:max_length]) + max(len(cont1[:max_length]), len(cont2[:max_length])) + 3  \
                            for dataset in encoded_datasets for story, cont1, cont2, _ in dataset)
```
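For context on the line the fixed comment describes: the `input_length` expression scans every (story, cont1, cont2, label) tuple and takes the longest combined sequence, with 3 reserved for special tokens. A minimal sketch of that computation on made-up toy data (`n_positions` and the token lists here are illustrative values, not taken from the commit):

```python
# Toy value standing in for model.config.n_positions on the GPT model.
n_positions = 512
max_length = n_positions // 2 - 2  # cap each segment at 254 tokens

# One illustrative "dataset" of (story, cont1, cont2, label) token-id tuples.
encoded_datasets = [[
    (list(range(10)), list(range(4)), list(range(6)), 0),
]]

# Same expression as in run_openai_gpt.py: longest story plus the longer
# continuation, plus 3 positions for special/delimiter tokens.
input_length = max(
    len(story[:max_length]) + max(len(cont1[:max_length]), len(cont2[:max_length])) + 3
    for dataset in encoded_datasets
    for story, cont1, cont2, _ in dataset
)
print(input_length)  # 10 + max(4, 6) + 3 = 19
```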