wangsen / paddle_dbnet · Commits

Commit 5b333406 (Unverified)
Authored Mar 07, 2022 by littletomatodonkey, committed by GitHub on Mar 07, 2022
fix kldiv when stop grad is true (#5643)
Parent: db608932

Showing 1 changed file with 9 additions and 4 deletions.

ppocr/losses/basic_loss.py (+9, -4)
...
@@ -95,9 +95,15 @@ class DMLLoss(nn.Layer):
             self.act = None

         self.use_log = use_log
         self.jskl_loss = KLJSLoss(mode="js")

+    def _kldiv(self, x, target):
+        eps = 1.0e-10
+        loss = target * (paddle.log(target + eps) - x)
+        # batch mean loss
+        loss = paddle.sum(loss) / loss.shape[0]
+        return loss
+
     def forward(self, out1, out2):
         if self.act is not None:
             out1 = self.act(out1)
...
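For readers skimming the diff: the new _kldiv helper computes a batch-mean KL divergence from plain tensor ops, i.e. the same quantity that F.kl_div(log_input, target, reduction='batchmean') returns. Below is a minimal, standalone sketch of that equivalence; the tensor values and the name kldiv_manual are illustrative and not part of the commit.

    import paddle
    import paddle.nn.functional as F

    def kldiv_manual(x, target, eps=1.0e-10):
        # x is already in log space, mirroring _kldiv in the diff above
        loss = target * (paddle.log(target + eps) - x)
        # batch mean: sum over all elements, divide by the batch size
        return paddle.sum(loss) / loss.shape[0]

    out1 = paddle.to_tensor([[0.7, 0.3], [0.4, 0.6]])  # e.g. softmax outputs
    out2 = paddle.to_tensor([[0.6, 0.4], [0.5, 0.5]])
    log_out1 = paddle.log(out1)

    print(kldiv_manual(log_out1, out2))                     # manual batch-mean KL
    print(F.kl_div(log_out1, out2, reduction='batchmean'))  # built-in, for comparison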
@@ -106,9 +112,8 @@ class DMLLoss(nn.Layer):
             # for recognition distillation, log is needed for feature map
             log_out1 = paddle.log(out1)
             log_out2 = paddle.log(out2)
-            loss = (F.kl_div(
-                log_out1, out2, reduction='batchmean') + F.kl_div(
-                    log_out2, out1, reduction='batchmean')) / 2.0
+            loss = (
+                self._kldiv(log_out1, out2) + self._kldiv(log_out2, out1)) / 2.0
         else:
             # for detection distillation log is not needed
             loss = self.jskl_loss(out1, out2)
...
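As the commit title suggests, the motivation appears to be that the symmetric KL loss should still yield usable gradients when one of the two distributions is detached (stop_gradient=True), e.g. a frozen teacher in distillation. A self-contained sketch of that scenario, replicating the new computation outside ppocr (all names here are illustrative, not part of the commit):

    import paddle
    import paddle.nn.functional as F

    def _kldiv(x, target, eps=1.0e-10):
        # same formulation as the helper added in this commit
        loss = target * (paddle.log(target + eps) - x)
        return paddle.sum(loss) / loss.shape[0]

    student_logits = paddle.randn([4, 10])
    student_logits.stop_gradient = False           # student branch is trainable
    student_out = F.softmax(student_logits, axis=-1)

    teacher_out = F.softmax(paddle.randn([4, 10]), axis=-1)
    teacher_out.stop_gradient = True               # frozen teacher, as in distillation

    log_s = paddle.log(student_out)
    log_t = paddle.log(teacher_out)
    loss = (_kldiv(log_s, teacher_out) + _kldiv(log_t, student_out)) / 2.0

    loss.backward()
    print(student_logits.grad is not None)         # gradients still reach the student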