chenpangpang / transformers

Commit 2c55568c
Authored Nov 03, 2018 by VictorSanh
Parent: a6efe123

`scatter_` and `scatter`

Showing 1 changed file with 1 addition and 1 deletion.

modeling_pytorch.py (+1, -1)
@@ -502,7 +502,7 @@ class BertForQuestionAnswering(nn.Module):
         def compute_loss(logits, positions):
             max_position = positions.max().item()
             one_hot = torch.FloatTensor(batch_size, max(max_position, seq_length) + 1).zero_()
-            one_hot = one_hot.scatter(1, positions.cpu(), 1) # Second argument need to be LongTensor and not cuda.LongTensor
+            one_hot = one_hot.scatter_(1, positions.cpu(), 1) # Second argument need to be LongTensor and not cuda.LongTensor
             one_hot = one_hot[:, :seq_length].to(input_ids.device)
             log_probs = nn.functional.log_softmax(logits, dim=-1).view(batch_size, seq_length)
             loss = - torch.mean(torch.sum(one_hot * log_probs), dim=-1)
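For context on the change: `Tensor.scatter` returns a new tensor and leaves its receiver untouched, while `Tensor.scatter_` (trailing underscore, PyTorch's in-place convention) mutates the receiver directly. Because the line assigns the result back to one_hot, both spellings compute the same value here; the in-place form just avoids allocating a second buffer. Below is a minimal sketch of the one-hot construction, where batch_size, seq_length, and the positions values are assumed stand-ins for the surrounding code, not taken from the diff:

    import torch

    batch_size, seq_length = 2, 5
    # Assumed example answer positions, shape (batch_size, 1). scatter's
    # index argument must be a LongTensor, which is why the diff calls
    # positions.cpu() before scattering into the CPU-resident one_hot.
    positions = torch.tensor([[1], [3]])

    one_hot = torch.zeros(batch_size, seq_length)

    # Out-of-place: returns a new tensor; one_hot itself stays all zeros.
    out = one_hot.scatter(1, positions, 1.0)

    # In-place: writes the ones directly into one_hot.
    one_hot.scatter_(1, positions, 1.0)

    # Same values either way, given that the diff reassigns the result.
    assert torch.equal(out, one_hot)

The removed line only worked because the returned tensor was reassigned to one_hot; switching to scatter_ uses the conventional spelling for "overwrite the receiver" and skips the extra (batch_size, max(max_position, seq_length) + 1) allocation.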