# Installation Guide

Welcome to the installation guide for the `bitsandbytes` library! This document provides step-by-step instructions to install `bitsandbytes` across various platforms and hardware configurations.

We provide official support for NVIDIA GPUs, CPUs, Intel XPUs, and Intel Gaudi platforms. We also have experimental support for additional platforms such as AMD ROCm.

## Table of Contents

- [System Requirements](#requirements)
- [NVIDIA CUDA](#cuda)
  - [Installation via PyPI](#cuda-pip)
  - [Compile from Source](#cuda-compile)
- [Intel XPU](#xpu)
  - [Installation via PyPI](#xpu-pip)
- [Intel Gaudi](#gaudi)
  - [Installation via PyPI](#gaudi-pip)
- [CPU](#cpu)
  - [Installation via PyPI](#cpu-pip)
  - [Compile from Source](#cpu-compile)
- [AMD ROCm (Preview)](#rocm)
  - [Preview Wheels from `main`](#rocm-preview)
  - [Compile from Source](#rocm-compile)
- [Preview Wheels](#preview-wheels)

## System Requirements[[requirements]]

These are the minimum requirements for `bitsandbytes` across all platforms. Please be aware that some compute platforms may impose stricter requirements.

* Python >= 3.9
* PyTorch >= 2.3

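To confirm that your environment meets these minimums, you can print the interpreter and PyTorch versions, for example:

```bash
# Print the Python and PyTorch versions available in the current environment
python --version
python -c "import torch; print(torch.__version__)"
```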
## NVIDIA CUDA[[cuda]]

`bitsandbytes` is currently supported on NVIDIA GPUs with [Compute Capability](https://developer.nvidia.com/cuda-gpus) 6.0+.
The library can be built using CUDA Toolkit versions as old as **11.8**.

| **Feature**                     | **CC Required** | **Example Hardware Requirement**            |
|---------------------------------|-----------------|---------------------------------------------|
| LLM.int8()                      | 7.5+            | Turing (RTX 20 series, T4) or newer GPUs    |
| 8-bit optimizers/quantization   | 6.0+            | Pascal (GTX 10X0 series, P100) or newer GPUs|
| NF4/FP4 quantization            | 6.0+            | Pascal (GTX 10X0 series, P100) or newer GPUs|

> [!WARNING]
> Support for Maxwell GPUs is deprecated and will be removed in a future release.
> Maxwell support is not included in PyPI distributions from `v0.48.0` on and must be built from source.
> For the best results, a Turing generation device or newer is recommended.

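To check which compute capability your GPU reports, you can query it through PyTorch (assuming a CUDA-enabled PyTorch build), for example:

```bash
# Print the compute capability of the first visible CUDA device, e.g. (7, 5) for Turing
python -c "import torch; print(torch.cuda.get_device_capability(0))"
```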
### Installation via PyPI[[cuda-pip]]

This is the most straightforward and recommended installation option.

The currently distributed `bitsandbytes` packages are built with the following configurations:

| **OS**             | **CUDA Toolkit** | **Host Compiler**    | **Targets**
|--------------------|------------------|----------------------|--------------
| **Linux x86-64**   | 11.8 - 12.6      | GCC 11.2             | sm60, sm70, sm75, sm80, sm86, sm89, sm90
| **Linux x86-64**   | 12.8 - 12.9      | GCC 11.2             | sm70, sm75, sm80, sm86, sm89, sm90, sm100, sm120
| **Linux x86-64**   | 13.0             | GCC 11.2             | sm75, sm80, sm86, sm89, sm90, sm100, sm110, sm120
| **Linux aarch64**  | 11.8 - 12.6      | GCC 11.2             | sm75, sm80, sm90
| **Linux aarch64**  | 12.8 - 13.0      | GCC 11.2             | sm75, sm80, sm90, sm100, sm120
| **Windows x86-64** | 11.8 - 12.6      | MSVC 19.43+ (VS2022) | sm50, sm60, sm75, sm80, sm86, sm89, sm90
| **Windows x86-64** | 12.8 - 12.9      | MSVC 19.43+ (VS2022) | sm70, sm75, sm80, sm86, sm89, sm90, sm100, sm120
| **Windows x86-64** | 13.0             | MSVC 19.43+ (VS2022) | sm75, sm80, sm86, sm89, sm90, sm100, sm120

The Linux build has a minimum glibc version of 2.24.

Use `pip` or `uv` to install the latest release:
```bash
pip install bitsandbytes
```

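After installing, you can sanity-check the setup by importing the library; recent releases also include a diagnostic entry point that prints information about the detected environment:

```bash
# Verify the installation by importing the library and printing its version
python -c "import bitsandbytes; print(bitsandbytes.__version__)"
# Optionally run the bundled diagnostics to inspect the detected CUDA setup
python -m bitsandbytes
```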
### Compile from Source[[cuda-compile]]

> [!TIP]
> Don't hesitate to compile from source! The process is pretty straightforward and resilient. This might be needed for older CUDA Toolkit versions, older Linux distributions, or other less common configurations.

For Linux and Windows systems, compiling from source allows you to customize the build configuration. See below for detailed platform-specific instructions (see `CMakeLists.txt` if you want to check the specifics and explore some additional options):

<hfoptions id="source">
<hfoption id="Linux">

To compile from source, you need CMake >= **3.22.1** and Python >= **3.9** installed. Make sure you have a C++ compiler and build tools installed (`gcc`, `make`, headers, etc.). It is recommended to use GCC 9 or newer.

For example, to install a compiler and CMake on Ubuntu:
```bash
apt-get install -y build-essential cmake
```

You should also install the CUDA Toolkit by following the [NVIDIA CUDA Installation Guide for Linux](https://docs.nvidia.com/cuda/cuda-installation-guide-linux/index.html). The minimum supported CUDA Toolkit version is **11.8**.

```bash
git clone https://github.com/bitsandbytes-foundation/bitsandbytes.git && cd bitsandbytes/
cmake -DCOMPUTE_BACKEND=cuda -S .
make
pip install -e .   # `-e` for "editable" install, when developing BNB (otherwise leave that out)
```

> [!TIP]
> If you have multiple versions of the CUDA Toolkit installed or it is in a non-standard location, please refer to the CMake CUDA documentation for how to configure the CUDA compiler.

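For example, one way to build against a specific toolkit (a sketch; the path below is only an illustration and should be adjusted to your system) is to point CMake at the desired `nvcc`:

```bash
# Example only: select a specific CUDA Toolkit by pointing CMake at its nvcc (adjust the path)
cmake -DCOMPUTE_BACKEND=cuda -DCMAKE_CUDA_COMPILER=/usr/local/cuda-12.4/bin/nvcc -S .
```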
</hfoption>
<hfoption id="Windows">

Compilation from source on Windows systems requires Visual Studio with C++ support as well as an installation of the CUDA Toolkit.

To compile from source, you need CMake >= **3.22.1** and Python >= **3.9** installed. You should also install the CUDA Toolkit by following the [CUDA Installation Guide for Windows](https://docs.nvidia.com/cuda/cuda-installation-guide-microsoft-windows/index.html) from NVIDIA. The minimum supported CUDA Toolkit version is **11.8**.

```bash
git clone https://github.com/bitsandbytes-foundation/bitsandbytes.git && cd bitsandbytes/
cmake -DCOMPUTE_BACKEND=cuda -S .
cmake --build . --config Release
pip install -e .   # `-e` for "editable" install, when developing BNB (otherwise leave that out)
```

Big thanks to [wkpark](https://github.com/wkpark), [Jamezo97](https://github.com/Jamezo97), [rickardp](https://github.com/rickardp), [akx](https://github.com/akx) for their amazing contributions to make `bitsandbytes` compatible with Windows.

</hfoption>
</hfoptions>

## Intel XPU[[xpu]]

* A compatible PyTorch version with Intel XPU support is required. The current minimum is **PyTorch 2.6.0**. It is recommended to use the latest stable release. See [Getting Started on Intel GPU](https://docs.pytorch.org/docs/stable/notes/get_start_xpu.html) for guidance.

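As a quick sanity check (a minimal sketch, assuming a PyTorch build with XPU support), you can verify that PyTorch detects your XPU device:

```bash
# Check whether the installed PyTorch build detects an Intel XPU device
python -c "import torch; print(torch.xpu.is_available())"
```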
### Installation via PyPI[[xpu-pip]]

This is the most straightforward and recommended installation option.

The currently distributed `bitsandbytes` packages are built with the following configurations:

| **OS**             | **oneAPI Toolkit** | **Kernel Implementation** |
|--------------------|--------------------|---------------------------|
| **Linux x86-64**   | 2025.1.3           | SYCL + Triton             |
| **Windows x86-64** | N/A                | SYCL                      |

The Linux build has a minimum glibc version of 2.34.

Use `pip` or `uv` to install the latest release:

```bash
pip install bitsandbytes
```

## Intel Gaudi[[gaudi]]

* A compatible PyTorch version with Intel Gaudi support is required. The current minimum is **Gaudi v1.21** with **PyTorch 2.6.0**. It is recommended to use the latest stable release. See the Gaudi software [installation guide](https://docs.habana.ai/en/latest/Installation_Guide/index.html) for guidance.

### Installation via PyPI[[gaudi-pip]]

Use `pip` or `uv` to install the latest release:

```bash
pip install bitsandbytes
```

## CPU[[cpu]]

### Installation via PyPI[[cpu-pip]]

This is the most straightforward and recommended installation option.

The currently distributed `bitsandbytes` packages are built with the following configurations:

| **OS**             | **Host Compiler**    | **Hardware Minimum** |
|--------------------|----------------------|----------------------|
| **Linux x86-64**   | GCC 11.4             | AVX2                 |
| **Linux aarch64**  | GCC 11.4             |                      |
| **Windows x86-64** | MSVC 19.43+ (VS2022) | AVX2                 |

The Linux build has a minimum glibc version of 2.24.

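On Linux x86-64, a quick way to check whether your CPU advertises AVX2 (other platforms expose this differently) is to inspect the CPU flags:

```bash
# A non-zero count means the CPU reports AVX2 support
grep -c avx2 /proc/cpuinfo
```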
Use `pip` or `uv` to install the latest release:

```bash
pip install bitsandbytes
```

### Compile from Source[[cpu-compile]]

To compile from source, install the package directly from the cloned repository with `pip`. At this time, the package will be built for CPU only.
```bash
git clone https://github.com/bitsandbytes-foundation/bitsandbytes.git && cd bitsandbytes/
pip install -e .
```

## AMD ROCm (Preview)[[rocm]]

* A compatible PyTorch version with AMD ROCm support is required. It is recommended to use the latest stable release. See [PyTorch on ROCm](https://rocm.docs.amd.com/projects/install-on-linux/en/latest/install/3rd-party/pytorch-install.html) for guidance.
* ROCm support is currently only available in our preview wheels or when building from source.

### Preview Wheels from `main`[[rocm-preview]]

The currently distributed preview `bitsandbytes` wheels are built with the following configurations:

| **OS**             | **ROCm** | **Targets**
|--------------------|----------|---------------------------|
| **Linux x86-64**   | 6.1.2    | gfx90a / gfx942 / gfx1100
| **Linux x86-64**   | 6.2.4    | gfx90a / gfx942 / gfx1100
| **Linux x86-64**   | 6.3.4    | gfx90a / gfx942 / gfx1100
| **Linux x86-64**   | 6.4.4    | gfx90a / gfx942 / gfx1100
| **Linux x86-64**   | 7.0.0    | gfx90a / gfx942 / gfx1100

**Windows is not currently supported.**

Please see [Preview Wheels](#preview-wheels) for installation instructions.

### Compile from Source[[rocm-compile]]

`bitsandbytes` can be compiled with ROCm 6.1 through ROCm 7.0.

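Before building, it can help to confirm that your PyTorch install was built against ROCm (a quick check; `torch.version.hip` reports `None` on non-ROCm builds):

```bash
# Print the ROCm/HIP version PyTorch was built with (None indicates a non-ROCm build)
python -c "import torch; print(torch.version.hip)"
```

If this prints a ROCm version, proceed with the build: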
```bash
# Install bitsandbytes from source

# Clone bitsandbytes repo
git clone https://github.com/bitsandbytes-foundation/bitsandbytes.git && cd bitsandbytes/

# Compile & install
apt-get install -y build-essential cmake  # install build tools dependencies, unless present
cmake -DCOMPUTE_BACKEND=hip -S .  # Use -DBNB_ROCM_ARCH="gfx90a;gfx942" to target specific gpu arch
make
pip install -e .   # `-e` for "editable" install, when developing BNB (otherwise leave that out)
```

## Preview Wheels[[preview-wheels]]

If you would like to use new features even before they are officially released and help us test them, feel free to install the wheel directly from our CI (*the wheel links will remain stable!*):

<hfoptions id="OS">
<hfoption id="Linux">

```bash
# Note: if you don't want to reinstall our dependencies, append the `--no-deps` flag!

# x86_64 (most users)
pip install --force-reinstall https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-manylinux_2_24_x86_64.whl

# ARM/aarch64
pip install --force-reinstall https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-manylinux_2_24_aarch64.whl
```

</hfoption>
<hfoption id="Windows">

```bash
# Note: if you don't want to reinstall our dependencies, append the `--no-deps` flag!
pip install --force-reinstall https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-win_amd64.whl
```
</hfoption>
</hfoptions>