OpenDAS / bitsandbytes · Commits

Commit 59593890, authored Aug 11, 2025 by Matthew Douglas

Temporary updates for release

parent 19fe95ac

Showing 4 changed files with 13 additions and 22 deletions (+13 −22)
.github/workflows/python-package.yml (+2 −1)
README.md (+10 −10)
bitsandbytes/__init__.py (+0 −11)
pyproject.toml (+1 −0)
.github/workflows/python-package.yml

```diff
@@ -104,6 +104,7 @@ jobs:
       retention-days: 7

   build-shared-libs-rocm:
+    if: false # Temporarily disabled
     strategy:
       matrix:
         os: [ubuntu-22.04]
@@ -151,7 +152,7 @@ jobs:
     needs:
       - build-shared-libs
       - build-shared-libs-cuda
-      - build-shared-libs-rocm
+      # - build-shared-libs-rocm
     strategy:
       matrix:
         os: [ubuntu-22.04, ubuntu-22.04-arm, windows-latest, macos-latest]
```
...
README.md

```diff
@@ -26,7 +26,7 @@ bitsandbytes has the following minimum requirements for all platforms:
 #### Accelerator support:

 <small>
-Note: this table reflects the status of the current development branch. For the latest stable release, see the [document in the v0.46.0 tag](https://github.com/bitsandbytes-foundation/bitsandbytes/blob/0.46.0/README.md#accelerator-support).
+Note: this table reflects the status of the current development branch. For the latest stable release, see the [document in the 0.47.0 tag](https://github.com/bitsandbytes-foundation/bitsandbytes/blob/0.47.0/README.md#accelerator-support).
 </small>

 ##### Legend:
@@ -73,9 +73,9 @@ bitsandbytes has the following minimum requirements for all platforms:
       CDNA: gfx90a, gfx942<br>
       RDNA: gfx1100
     </td>
-    <td>✅</td>
-    <td>〰️</td>
-    <td>✅</td>
+    <td>🚧</td>
+    <td>🚧</td>
+    <td>🚧</td>
   </tr>
   <tr>
     <td></td>
@@ -85,16 +85,16 @@ bitsandbytes has the following minimum requirements for all platforms:
       Arc A-Series (Alchemist)<br>
       Arc B-Series (Battlemage)
     </td>
-    <td>✅</td>
-    <td>✅</td>
+    <td>🚧</td>
+    <td>🚧</td>
     <td>🚧</td>
   </tr>
   <tr>
     <td></td>
     <td>🟪 Intel Gaudi<br><code>hpu</code></td>
     <td>Gaudi1, Gaudi2, Gaudi3</td>
-    <td>✅</td>
-    <td>〰️</td>
+    <td>🚧</td>
+    <td>🚧</td>
     <td>❌</td>
   </tr>
   <tr>
@@ -139,8 +139,8 @@ bitsandbytes has the following minimum requirements for all platforms:
       Arc A-Series (Alchemist)<br>
       Arc B-Series (Battlemage)
     </td>
-    <td>✅</td>
-    <td>✅</td>
+    <td>🚧</td>
+    <td>🚧</td>
     <td>🚧</td>
   </tr>
   <tr>
```
bitsandbytes/__init__.py

```diff
@@ -35,17 +35,6 @@ supported_torch_devices = {
 if torch.cuda.is_available():
     from .backends.cuda import ops as cuda_ops

-if hasattr(torch, "xpu") and torch.xpu.is_available():
-    from .backends.xpu import ops as xpu_ops
-
-if importlib.util.find_spec("habana_frameworks") and importlib.util.find_spec("habana_frameworks.torch"):
-    # In case not automatically imported
-    import habana_frameworks.torch
-
-    if hasattr(torch, "hpu") and torch.hpu.is_available():
-        from .backends.hpu import ops as hpu_ops
-

 def _import_backends():
     """
```
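The removed lines guard the optional backends with `importlib.util.find_spec`, so probing for `habana_frameworks` never raises when the package is absent. A minimal sketch of that probing pattern, using stand-in module names rather than the real backends:

```python
import importlib.util


def module_available(name: str) -> bool:
    """Return True if `name` is importable, without actually importing it.

    find_spec() searches the import machinery and returns None when a
    top-level module is missing, so this never raises ImportError.
    """
    return importlib.util.find_spec(name) is not None


# Stand-in probes: the stdlib is always present, a made-up name never is.
print(module_available("json"))            # True
print(module_available("not_a_real_mod"))  # False
```

This is why the original code can check for `habana_frameworks` before touching `torch.hpu`: the probe is side-effect free until the explicit `import` that follows it.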
pyproject.toml

```diff
@@ -74,6 +74,7 @@ test = [
 package-data = { "*" = ["libbitsandbytes*.*"] }

 [tool.setuptools.packages.find]
+exclude = ["*backends.xpu", "*backends.hpu", "*backends.triton"]
 include = ["bitsandbytes*"]

 [tool.setuptools.dynamic]
```
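The new `exclude` entry keeps the `xpu`, `hpu`, and `triton` backend subpackages out of the built distribution for this release. The same fnmatch-style include/exclude patterns can be exercised directly with `setuptools.find_packages`; the directory layout below is illustrative, not the real repository tree:

```python
import os
import tempfile

from setuptools import find_packages

# Build a throwaway package tree mirroring the layout in question.
with tempfile.TemporaryDirectory() as root:
    for pkg in ("bitsandbytes", "bitsandbytes/backends", "bitsandbytes/backends/xpu"):
        path = os.path.join(root, *pkg.split("/"))
        os.makedirs(path)
        # An __init__.py makes each directory a discoverable package.
        open(os.path.join(path, "__init__.py"), "w").close()

    found = find_packages(
        where=root,
        include=["bitsandbytes*"],
        exclude=["*backends.xpu"],
    )
    print(sorted(found))  # the xpu subpackage is filtered out
```

Patterns match dotted package names, so `*backends.xpu` catches `bitsandbytes.backends.xpu` while leaving `bitsandbytes.backends` itself in the wheel.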