chenpangpang / transformers · Commits

Unverified commit 510ad0a8
Authored May 04, 2023 by peter-sk, committed May 04, 2023 by GitHub
Parent: adb0760b

gpt2 multi-gpu fix (#23149)

Co-authored-by: Prof. Peter Schneider-Kamp <jps@ordbogen.com>

1 changed file with 2 additions and 2 deletions (+2 −2):
src/transformers/models/gpt2/modeling_gpt2.py
```diff
@@ -1670,9 +1670,9 @@ class GPT2ForQuestionAnswering(GPT2PreTrainedModel):
         if start_positions is not None and end_positions is not None:
             # If we are on multi-GPU, split add a dimension
             if len(start_positions.size()) > 1:
-                start_positions = start_positions.squeeze(-1)
+                start_positions = start_positions.squeeze(-1).to(start_logits.device)
             if len(end_positions.size()) > 1:
-                end_positions = end_positions.squeeze(-1)
+                end_positions = end_positions.squeeze(-1).to(end_logits.device)
             # sometimes the start/end positions are outside our model inputs, we ignore these terms
             ignored_index = start_logits.size(1)
             start_positions = start_positions.clamp(0, ignored_index)
```
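As an illustration (my own sketch, not part of the commit): when a model is split across devices, the `start_positions`/`end_positions` label tensors can end up on a different device than the logits, so computing the loss would raise a device-mismatch error. Chaining `.to(start_logits.device)` onto the existing `.squeeze(-1)` co-locates labels and logits before `CrossEntropyLoss` runs. The tensors and shapes below are invented for demonstration:

```python
import torch

# Fake QA head output: (batch, seq_len) start logits.
start_logits = torch.randn(4, 16)

# Labels as they often arrive from a DataLoader: shape (batch, 1),
# possibly on a different device than the logits in a multi-GPU run.
start_positions = torch.tensor([[1], [3], [5], [20]])

# The patched pattern: drop the trailing dim AND move to the logits' device
# in one step, so the loss below never sees mismatched devices.
start_positions = start_positions.squeeze(-1).to(start_logits.device)

# Positions outside the model input (here, 20 > 16) are clamped to
# ignored_index so CrossEntropyLoss skips them.
ignored_index = start_logits.size(1)
start_positions = start_positions.clamp(0, ignored_index)

loss = torch.nn.CrossEntropyLoss(ignore_index=ignored_index)(
    start_logits, start_positions
)
print(start_positions, loss.item())
```

On a single CPU-only machine `.to(start_logits.device)` is a no-op, which is why the bug only surfaced on multi-GPU setups.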
...