OpenDAS / bitsandbytes
"...api/git@developer.sourcefind.cn:renzhc/diffusers_dcu.git" did not exist on "18fc40c169a82da2fca188b5d0083bda6ac044ab"
Commit ef2936a9, authored Aug 24, 2022 by dbaranchuk

delete CxB from state

Parent: 876387dc
Showing 1 changed file with 4 additions and 5 deletions:

bitsandbytes/nn/modules.py (+4, -5)
bitsandbytes/nn/modules.py @ ef2936a9

@@ -260,11 +260,10 @@ class Linear8bitLt(nn.Linear):
         out = bnb.matmul(x, self.weight, bias=self.bias, state=self.state)
-        if not self.state.has_fp16_weights and self.state.CB is not None:
-            # we converted 8-bit row major to turing/ampere format in the first inference pass
-            # we no longer need the row-major weight
-            del self.state.CB
-            self.weight.data = self.state.CxB
+        if not self.state.has_fp16_weights and self.state.CxB is not None:
+            # In this version, we convert 8-bit row major to turing/ampere format at each inference pass
+            # Thus, we delete CxB from the state. TODO: do not store it in the state in the first place.
+            del self.state.CxB
         return out
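To make the effect of the change concrete, below is a minimal, self-contained sketch of the pattern the new code adopts: the turing/ampere-format copy of the 8-bit weight (CxB) is rebuilt on every inference pass and deleted from the state afterwards, so only the row-major copy (CB) persists between passes. State and forward here are hypothetical stand-ins for the layer's state object and Linear8bitLt.forward in bitsandbytes, and the int8 kernel is replaced by a plain float matmul for illustration.

import torch

class State:
    # Hypothetical stand-in for the matmul state carried by the layer;
    # not the bitsandbytes API.
    def __init__(self, CB):
        self.has_fp16_weights = False
        self.CB = CB     # 8-bit weight, row-major layout (kept between passes)
        self.CxB = None  # 8-bit weight, turing/ampere layout (transient)

def forward(x, weight, state):
    # Stand-in for bnb.matmul: pretend the kernel converted CB into the
    # tile layout it needs (the real code does a layout transform on the GPU).
    state.CxB = state.CB.clone()
    out = x @ weight.t()

    # The commit's change: the converted copy is recreated each pass,
    # so drop it from the state once the pass is done.
    if not state.has_fp16_weights and state.CxB is not None:
        del state.CxB

    return out

w = torch.randn(8, 16)
state = State((w * (127.0 / w.abs().max())).to(torch.int8))  # toy 8-bit weight
out = forward(torch.randn(4, 16), w, state)
assert not hasattr(state, "CxB") and state.CB is not None  # only CB survives

Compared with the deleted code, which freed CB and kept CxB as the weight, this keeps the row-major CB between passes and redoes the turing/ampere conversion on every pass, as the TODO in the diff notes.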