OpenDAS / bitsandbytes

Commit 656de8ed
Authored Aug 23, 2022 by dbaranchuk

    minor fixes

Parent: 1753aa04
Showing 2 changed files with 2 additions and 2 deletions:

    bitsandbytes/autograd/_functions.py  (+1, -1)
    bitsandbytes/nn/modules.py           (+1, -1)
bitsandbytes/autograd/_functions.py

@@ -368,7 +368,7 @@ class MatMul8bitLt(torch.autograd.Function):
     Bt = (CB * SCB).t().contiguous()
     CBt = (Bt / SCBt).t().to(torch.int8)
-    # intentionally, do not store CxBt in to state
+    # intentionally, do not store CxBt in state
     CxBt, SBt = F.transform(CBt, to_order=formatB, transpose=True)
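For context, this hunk sits where the backward pass rebuilds the column-wise quantized copy CBt from the row-wise buffers CB and SCB when the transposed statistics were not stored. Below is a minimal sketch of that requantization round-trip, with made-up shapes and hand-rolled absmax quantization standing in for bitsandbytes' F.double_quant; the scale conventions here are assumptions, not the library's exact internals.

import torch

# Hypothetical setup: one weight matrix with row-wise and column-wise
# absmax quantization, mirroring the CB/SCB/SCBt names from the diff.
rows, cols = 4, 8
B = torch.randn(rows, cols)

SCB = B.abs().amax(dim=1, keepdim=True)        # per-row absmax scales, (rows, 1)
CB = (B * 127 / SCB).round().to(torch.int8)    # row-wise quantized B

SCBt = B.abs().amax(dim=0, keepdim=True).t()   # per-column absmax scales, (cols, 1)

# The diff's recomputation: dequantize with the row scales, transpose,
# then requantize with the column scales. CB * SCB ~= 127 * B, and the
# division by SCBt brings values back into the [-127, 127] range, so the
# 1/127 factors cancel -- which is why the original code never writes
# the factor of 127 explicitly.
Bt = (CB.float() * SCB).t().contiguous()       # ~= 127 * B.t(), shape (cols, rows)
CBt = (Bt / SCBt).t().round().to(torch.int8)   # column-wise quantized B, (rows, cols)

Because the scale factors cancel, the recomputed CBt should agree with a direct column-wise quantization of B up to the rounding error accumulated in the round trip.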
bitsandbytes/nn/modules.py

@@ -212,7 +212,7 @@ class Int8Params(torch.nn.Parameter):
         )
         new_param.CB = self.CB
         new_param.SCB = self.SCB
-        new_param.SCB = self.SCBt
+        new_param.SCBt = self.SCBt
         return new_param
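The one-character fix is easy to miss but consequential: the old line overwrote new_param.SCB with the transposed statistics and never set new_param.SCBt at all, so the transposed scales were silently dropped whenever the parameter was copied. Below is a minimal sketch of the pattern; the buffer names mirror the diff, but the class and the _copy_with_buffers helper are illustrative assumptions, not bitsandbytes' actual API.

import torch

class Int8ParamsSketch(torch.nn.Parameter):
    """Toy stand-in for Int8Params: int8 buffers ride along as attributes."""

    def _copy_with_buffers(self, device):
        # Moving a Parameter creates a new tensor object, so the
        # quantization buffers must be carried across by hand.
        new_param = Int8ParamsSketch(self.data.to(device))
        new_param.CB = self.CB
        new_param.SCB = self.SCB
        # Pre-fix, this line read `new_param.SCB = self.SCBt`: it
        # clobbered the row statistics and left new_param.SCBt unset.
        new_param.SCBt = self.SCBt
        return new_param

p = Int8ParamsSketch(torch.zeros(2, 3))
p.CB = torch.zeros(2, 3, dtype=torch.int8)
p.SCB = torch.ones(2)
p.SCBt = torch.ones(3)
q = p._copy_with_buffers("cpu")
assert q.SCBt is p.SCBt  # the transposed scales now survive the copy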