OpenDAS / bitsandbytes · Commits

Commit 62441815, authored Aug 08, 2022 by Tim Dettmers

Removed prod for Python <= 3.7 compatibility.

Parent: 26efb154
Showing 2 changed files, with 9 additions and 7 deletions:

    bitsandbytes/autograd/_functions.py    +8 -6
    setup.py                               +1 -1
bitsandbytes/autograd/_functions.py

 from dataclasses import dataclass
+import operator

 import torch
-import math
 import bitsandbytes as bnb
 import bitsandbytes.functional as F

 from dataclasses import dataclass
+from functools import reduce  # Required in Python 3
+
+def prod(iterable):
+    return reduce(operator.mul, iterable, 1)

 tensor = torch.Tensor

 """
...
@@ -12,8 +16,6 @@ tensor = torch.Tensor
     This is particularly important for small models where outlier features
     are less systematic and occur with low frequency.
 """

 class GlobalOutlierPooler(object):
     _instance = None
...
@@ -201,7 +203,7 @@ class MatMul8bitLt(torch.autograd.Function):
     def forward(ctx, A, B, out=None, state=MatmulLtState()):
         # default to pytorch behavior if inputs are empty
         ctx.is_empty = False
-        if math.prod(A.shape) == 0:
+        if prod(A.shape) == 0:
             ctx.is_empty = True
             ctx.A = A
             ctx.B = B
...
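For context: math.prod only exists on Python >= 3.8, so the commit replaces it with a local prod built on functools.reduce. The short sketch below is not part of the commit (the example shapes are made up for illustration); it just shows that the reduce-based fallback matches math.prod's behavior, including the empty-shape case that forward() relies on to detect empty tensors.

    import operator
    from functools import reduce  # reduce must be imported in Python 3

    def prod(iterable):
        # Multiply all elements, starting from 1, so an empty iterable yields 1,
        # matching what math.prod returns on Python >= 3.8.
        return reduce(operator.mul, iterable, 1)

    # An empty tensor has at least one zero-sized dimension, so the product of
    # its shape is 0 -- the condition MatMul8bitLt.forward() checks.
    assert prod((3, 4)) == 12    # ordinary 2-D shape
    assert prod((3, 0, 4)) == 0  # empty tensor: one dimension is zero
    assert prod(()) == 1         # 0-D (scalar) tensor shape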
setup.py

...
@@ -18,7 +18,7 @@ def read(fname):
 setup(
     name=f"bitsandbytes",
-    version=f"0.31.4",
+    version=f"0.31.5",
     author="Tim Dettmers",
     author_email="dettmers@cs.washington.edu",
     description="8-bit optimizers and matrix multiplication routines.",
...