OpenDAS / bitsandbytes
Commit 0c5fa5a6 authored Oct 21, 2021 by Tim Dettmers

Fixed syntax and import error.

parent 1ec0d545
Showing 1 changed file with 3 additions and 1 deletion
bitsandbytes/optim/adam.py (+3, -1)
@@ -2,6 +2,7 @@
 #
 # This source code is licensed under the MIT license found in the
 # LICENSE file in the root directory of this source tree.
+import torch
 from bitsandbytes.optim.optimizer import Optimizer2State
 import bitsandbytes.functional as F
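
Note (not part of the commit): the added `import torch` is the "import error" half of the fix. The hunk headers further down show `class AnalysisAdam(torch.optim.Optimizer)`, so the module needs the name `torch` bound at import time. A minimal sketch of that failure mode, under the assumption that nothing else in adam.py made the name available:

import torch  # the one-line fix from the hunk above

class AnalysisAdam(torch.optim.Optimizer):  # without the import: NameError: name 'torch' is not defined
    """Stand-in skeleton; only the base-class lookup matters for this sketch."""
    def __init__(self, params, lr=1e-3):
        super().__init__(params, dict(lr=lr))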
@@ -49,7 +50,7 @@ class AnalysisAdam(torch.optim.Optimizer):
         amsgrad (boolean, optional): whether to use the AMSGrad variant of this
             algorithm from the paper `On the Convergence of Adam and Beyond`_
-    .. _Adam\: A Method for Stochastic Optimization:
+    .. _Adam: A Method for Stochastic Optimization:
         https://arxiv.org/abs/1412.6980
     .. _On the Convergence of Adam and Beyond:
         https://openreview.net/forum?id=ryQu7f-RZ
@@ -192,6 +193,7 @@ class AnalysisAdam(torch.optim.Optimizer):
                     C2 = F.quantize_no_absmax(exp_avg_sq, code=code2)
                     state2 = F.dequantize_no_absmax(C2, code2)
                 elif self.analysis == 'my-quantization-routine':
+                    pass
                     # 1. get code
                     # 2. quantize
                     # 3. dequantize
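
Note (not part of the commit): the added `pass` makes the `elif` branch syntactically valid while its three numbered comments remain a to-do. A sketch of what those steps could look like, reusing the F.quantize_no_absmax / F.dequantize_no_absmax calls visible in the context lines above; F.create_dynamic_map and its arguments are an assumption about how a quantization code would be obtained, not something this diff shows:

import torch
import bitsandbytes.functional as F

def my_quantization_routine(exp_avg_sq: torch.Tensor) -> torch.Tensor:
    # 1. get code: an 8-bit quantization map (assumed helper; unsigned since exp_avg_sq >= 0)
    code2 = F.create_dynamic_map(signed=False).to(exp_avg_sq.device)
    # 2. quantize: map the second-moment state onto the 8-bit code
    C2 = F.quantize_no_absmax(exp_avg_sq, code=code2)
    # 3. dequantize: reconstruct the state so the quantization error can be analyzed
    return F.dequantize_no_absmax(C2, code2)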