OpenDAS / nni · Commit 52b70bc1
Unverified commit 52b70bc1, authored Aug 22, 2021 by chenbohua3, committed by GitHub on Aug 22, 2021.

restore weight by default for qat quantizer (#3992)

parent 5c9797a2
Changes: 1 changed file with 6 additions and 7 deletions.

nni/algorithms/compression/pytorch/quantization/quantizers.py (+6, -7)
@@ -563,14 +563,13 @@ class QAT_Quantizer(Quantizer):
             calibration_config[name]['tracked_max_input'] = float(module.tracked_max_input)
             # Recover weight/bias for batch normalization folding
-            if hasattr(module, BN_FOLD_TAG):
-                actual_weight = getattr(module, 'old_weight', None)
-                if actual_weight is None:
-                    logger.warning("Can not recover weight for layer %s. "
-                                   "This may lead to a wrong accuracy performance on the backend.", name)
-                delattr(module, 'weight')
-                module.register_parameter('weight', actual_weight)
+            actual_weight = getattr(module, 'old_weight', None)
+            if actual_weight is None:
+                logger.warning("Can not recover weight for layer %s. "
+                               "This may lead to a wrong accuracy performance on the backend.", name)
+            delattr(module, 'weight')
+            module.register_parameter('weight', actual_weight)
             if hasattr(module, BN_FOLD_TAG):
                 actual_bias = getattr(module, 'old_bias', None)
                 delattr(module, 'bias')
                 if actual_bias is not None:
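The diff relies on a standard PyTorch idiom: the quantizer stashes the original `nn.Parameter` under another attribute name, replaces `weight` with a plain tensor, and on export deletes the stand-in and re-registers the original via `register_parameter`. A minimal sketch of that idiom follows; the helper names (`stash_weight`, `restore_weight`) and the `* 0.5` stand-in are hypothetical illustrations, not NNI's actual API — only the `delattr` + `register_parameter` sequence mirrors the code in the diff.

```python
import torch
from torch import nn

def stash_weight(module: nn.Module) -> None:
    """Simulate quantizer wrapping: move the real parameter aside as
    `old_weight` (hypothetical stash name) and leave a plain tensor in
    its place."""
    weight = module.weight             # the real nn.Parameter
    delattr(module, 'weight')          # unregister it from the module
    module.old_weight = weight         # re-registers the Parameter under the stash name
    module.weight = weight.data * 0.5  # plain-tensor stand-in (e.g. a folded weight)

def restore_weight(module: nn.Module) -> None:
    """Mirror the export-time path in the diff: drop the stand-in tensor
    and put the stashed parameter back so the exported model carries the
    original trainable weight."""
    actual_weight = getattr(module, 'old_weight', None)
    delattr(module, 'weight')          # remove the plain-tensor stand-in
    module.register_parameter('weight', actual_weight)
    delattr(module, 'old_weight')      # clean up the stash entry

layer = nn.Linear(3, 2)
original = layer.weight
stash_weight(layer)
assert not isinstance(layer.weight, nn.Parameter)  # stand-in is a plain tensor
restore_weight(layer)
assert layer.weight is original                    # original parameter recovered
```

Note the ordering matters: `delattr` must run before assigning the plain tensor, because `nn.Module.__setattr__` refuses to overwrite a registered parameter with a non-`Parameter` value.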