chenpangpang/transformers
Commit f3bda235, authored Feb 04, 2019 by Thibault Fevry

Only keep the active part of the loss for token classification
parent 8f8bbd4a
Showing 1 changed file with 8 additions and 1 deletion

pytorch_pretrained_bert/modeling.py (+8, -1)
...
@@ -1025,7 +1025,14 @@ class BertForTokenClassification(PreTrainedBertModel):
         if labels is not None:
             loss_fct = CrossEntropyLoss()
-            loss = loss_fct(logits.view(-1, self.num_labels), labels.view(-1))
+            # Only keep active parts of the loss
+            if attention_mask is not None:
+                active_loss = attention_mask.view(-1) == 1
+                active_logits = logits.view(-1, self.num_labels)[active_loss]
+                active_labels = labels.view(-1)[active_loss]
+                loss = loss_fct(active_logits, active_labels)
+            else:
+                loss = loss_fct(logits.view(-1, self.num_labels), labels.view(-1))
             return loss
         else:
             return logits
...
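For context, here is a minimal standalone sketch of what the masking in this commit does: positions where attention_mask is 0 (padding) are dropped before the cross-entropy is computed, so padding tokens no longer contribute to the token-classification loss. The tensor shapes and values below are made up for illustration and are not part of the commit.

import torch
from torch.nn import CrossEntropyLoss

# Hypothetical example: batch of 2 sequences, length 4, 3 label classes.
batch, seq_len, num_labels = 2, 4, 3
logits = torch.randn(batch, seq_len, num_labels)
labels = torch.randint(0, num_labels, (batch, seq_len))
# The second sequence has two padding positions (mask == 0).
attention_mask = torch.tensor([[1, 1, 1, 1],
                               [1, 1, 0, 0]])

loss_fct = CrossEntropyLoss()

# Before the commit: padding tokens contribute to the loss.
full_loss = loss_fct(logits.view(-1, num_labels), labels.view(-1))

# After the commit: keep only the active (non-padding) positions.
active_loss = attention_mask.view(-1) == 1
active_logits = logits.view(-1, num_labels)[active_loss]
active_labels = labels.view(-1)[active_loss]
masked_loss = loss_fct(active_logits, active_labels)

print(full_loss.item(), masked_loss.item())

A common alternative design is to mark padded positions in the labels with the loss function's ignore_index (by default -100 for CrossEntropyLoss), which achieves the same effect without explicit boolean indexing.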