# Installation Guide

Welcome to the installation guide for the `bitsandbytes` library! This document provides step-by-step instructions to install `bitsandbytes` across various platforms and hardware configurations. We provide official support for NVIDIA GPUs, CPUs, Intel XPUs, and Intel Gaudi platforms. We also have experimental support for additional platforms such as AMD ROCm.

## Table of Contents

- [System Requirements](#requirements)
- [NVIDIA CUDA](#cuda)
  - [Installation via PyPI](#cuda-pip)
  - [Compile from Source](#cuda-compile)
- [Intel XPU](#xpu)
  - [Installation via PyPI](#xpu-pip)
- [Intel Gaudi](#gaudi)
  - [Installation via PyPI](#gaudi-pip)
- [CPU](#cpu)
  - [Installation via PyPI](#cpu-pip)
  - [Compile from Source](#cpu-compile)
- [AMD ROCm (Preview)](#rocm)
- [Preview Wheels](#preview-wheels)

## System Requirements[[requirements]]

These are the minimum requirements for `bitsandbytes` across all platforms. Please be aware that some compute platforms may impose stricter requirements.

* Python >= 3.9
* PyTorch >= 2.3

## NVIDIA CUDA[[cuda]]

`bitsandbytes` is currently supported on NVIDIA GPUs with [Compute Capability](https://developer.nvidia.com/cuda-gpus) 6.0+. The library can be built using CUDA Toolkit versions as old as **11.8**.

| **Feature**                   | **CC Required** | **Example Hardware Requirement**             |
|-------------------------------|-----------------|----------------------------------------------|
| LLM.int8()                    | 7.5+            | Turing (RTX 20 series, T4) or newer GPUs     |
| 8-bit optimizers/quantization | 6.0+            | Pascal (GTX 10X0 series, P100) or newer GPUs |
| NF4/FP4 quantization          | 6.0+            | Pascal (GTX 10X0 series, P100) or newer GPUs |

> [!WARNING]
> Support for Maxwell GPUs is deprecated and will be removed in a future release.
> Maxwell support is not included in PyPI distributions from `v0.48.0` on and must be built from source.
> For the best results, a Turing generation device or newer is recommended.

### Installation via PyPI[[cuda-pip]]

This is the most straightforward and recommended installation option.

The currently distributed `bitsandbytes` packages are built with the following configurations:

| **OS**             | **CUDA Toolkit** | **Host Compiler**    | **Targets**                                       |
|--------------------|------------------|----------------------|---------------------------------------------------|
| **Linux x86-64**   | 11.8 - 12.6      | GCC 11.2             | sm60, sm70, sm75, sm80, sm86, sm89, sm90          |
| **Linux x86-64**   | 12.8 - 12.9      | GCC 11.2             | sm70, sm75, sm80, sm86, sm89, sm90, sm100, sm120  |
| **Linux x86-64**   | 13.0             | GCC 11.2             | sm75, sm80, sm86, sm89, sm90, sm100, sm110, sm120 |
| **Linux aarch64**  | 11.8 - 12.6      | GCC 11.2             | sm75, sm80, sm90                                  |
| **Linux aarch64**  | 12.8 - 13.0      | GCC 11.2             | sm75, sm80, sm90, sm100, sm120                    |
| **Windows x86-64** | 11.8 - 12.6      | MSVC 19.43+ (VS2022) | sm50, sm60, sm75, sm80, sm86, sm89, sm90          |
| **Windows x86-64** | 12.8 - 12.9      | MSVC 19.43+ (VS2022) | sm70, sm75, sm80, sm86, sm89, sm90, sm100, sm120  |
| **Windows x86-64** | 13.0             | MSVC 19.43+ (VS2022) | sm75, sm80, sm86, sm89, sm90, sm100, sm120        |

The Linux build has a minimum glibc version of 2.24.

Use `pip` or `uv` to install the latest release:

```bash
pip install bitsandbytes
```
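After installing, you can optionally confirm that PyTorch sees your GPU and that its compute capability meets the table above. The snippet below is a minimal check, assuming a CUDA-enabled PyTorch build; recent `bitsandbytes` releases also include a diagnostic entry point that can be run as a module.

```bash
# Check PyTorch version, CUDA runtime version, and GPU visibility.
python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"

# Print the compute capability of the first visible GPU (should be (6, 0) or higher).
python -c "import torch; print(torch.cuda.get_device_capability(0))"

# Run the library's built-in environment diagnostics (available in recent releases).
python -m bitsandbytes
```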
### Compile from Source[[cuda-compile]]

> [!TIP]
> Don't hesitate to compile from source! The process is pretty straightforward and resilient. This might be needed for older CUDA Toolkit versions, older Linux distributions, or other less common configurations.

For Linux and Windows systems, compiling from source allows you to customize the build configuration. See below for detailed platform-specific instructions (see the `CMakeLists.txt` if you want to check the specifics and explore some additional options):

**Linux**

To compile from source, you need CMake >= **3.22.1** and Python >= **3.9** installed. Make sure you have a compiler installed to compile C++ (`gcc`, `make`, headers, etc.). It is recommended to use GCC 9 or newer.

For example, to install a compiler and CMake on Ubuntu:

```bash
apt-get install -y build-essential cmake
```

You should also install the CUDA Toolkit by following the [NVIDIA CUDA Installation Guide for Linux](https://docs.nvidia.com/cuda/cuda-installation-guide-linux/index.html). The minimum supported CUDA Toolkit version is **11.8**.

```bash
git clone https://github.com/bitsandbytes-foundation/bitsandbytes.git && cd bitsandbytes/
cmake -DCOMPUTE_BACKEND=cuda -S .
make
pip install -e .   # `-e` for "editable" install, when developing BNB (otherwise leave that out)
```

> [!TIP]
> If you have multiple versions of the CUDA Toolkit installed, or it is in a non-standard location, please refer to the CMake CUDA documentation for how to configure the CUDA compiler.

**Windows**

Compilation from source on Windows requires Visual Studio with C++ support as well as an installation of the CUDA Toolkit. To compile from source, you need CMake >= **3.22.1** and Python >= **3.9** installed.

You should also install the CUDA Toolkit by following the [CUDA Installation Guide for Windows](https://docs.nvidia.com/cuda/cuda-installation-guide-microsoft-windows/index.html) from NVIDIA. The minimum supported CUDA Toolkit version is **11.8**.

```bash
git clone https://github.com/bitsandbytes-foundation/bitsandbytes.git && cd bitsandbytes/
cmake -DCOMPUTE_BACKEND=cuda -S .
cmake --build . --config Release
pip install -e .   # `-e` for "editable" install, when developing BNB (otherwise leave that out)
```

Big thanks to [wkpark](https://github.com/wkpark), [Jamezo97](https://github.com/Jamezo97), [rickardp](https://github.com/rickardp), and [akx](https://github.com/akx) for their amazing contributions to make `bitsandbytes` compatible with Windows.

## Intel XPU[[xpu]]

* A compatible PyTorch version with Intel XPU support is required. The current minimum is **PyTorch 2.6.0**. It is recommended to use the latest stable release. See [Getting Started on Intel GPU](https://docs.pytorch.org/docs/stable/notes/get_start_xpu.html) for guidance.

### Installation via PyPI[[xpu-pip]]

This is the most straightforward and recommended installation option.

The currently distributed `bitsandbytes` packages are built with the following configurations:

| **OS**             | **oneAPI Toolkit** | **Kernel Implementation** |
|--------------------|--------------------|---------------------------|
| **Linux x86-64**   | 2025.1.3           | SYCL + Triton             |
| **Windows x86-64** | N/A                | SYCL                      |

The Linux build has a minimum glibc version of 2.34.

Use `pip` or `uv` to install the latest release:

```bash
pip install bitsandbytes
```
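After installation, you can optionally confirm that PyTorch's XPU backend is available. This is a minimal sketch, assuming an XPU-enabled PyTorch 2.6+ build:

```bash
# Confirm PyTorch was built with XPU support and can see at least one device.
python -c "import torch; print(torch.xpu.is_available(), torch.xpu.device_count())"
```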
## Intel Gaudi[[gaudi]]

* A compatible PyTorch version with Intel Gaudi support is required. The current minimum is **Gaudi v1.21** with **PyTorch 2.6.0**. It is recommended to use the latest stable release. See the Gaudi software [installation guide](https://docs.habana.ai/en/latest/Installation_Guide/index.html) for guidance.

### Installation via PyPI[[gaudi-pip]]

Use `pip` or `uv` to install the latest release:

```bash
pip install bitsandbytes
```

## CPU[[cpu]]

### Installation via PyPI[[cpu-pip]]

This is the most straightforward and recommended installation option.

The currently distributed `bitsandbytes` packages are built with the following configurations:

| **OS**             | **Host Compiler**    | **Hardware Minimum** |
|--------------------|----------------------|----------------------|
| **Linux x86-64**   | GCC 11.4             | AVX2                 |
| **Linux aarch64**  | GCC 11.4             |                      |
| **Windows x86-64** | MSVC 19.43+ (VS2022) | AVX2                 |

The Linux build has a minimum glibc version of 2.24.

Use `pip` or `uv` to install the latest release:

```bash
pip install bitsandbytes
```

### Compile from Source[[cpu-compile]]

To compile from source, simply install the package from source using `pip`. At this time, the package will be built for CPU only.

```bash
git clone https://github.com/bitsandbytes-foundation/bitsandbytes.git && cd bitsandbytes/
pip install -e .
```

## AMD ROCm (Preview)[[rocm]]

* A compatible PyTorch version with AMD ROCm support is required. It is recommended to use the latest stable release. See [PyTorch on ROCm](https://rocm.docs.amd.com/projects/install-on-linux/en/latest/install/3rd-party/pytorch-install.html) for guidance.
* ROCm support is currently only available in our preview wheels or when building from source.

### Preview Wheels from `main`[[rocm-preview]]

The currently distributed preview `bitsandbytes` wheels are built with the following configurations:

| **OS**           | **ROCm** | **Targets**               |
|------------------|----------|---------------------------|
| **Linux x86-64** | 6.1.2    | gfx90a / gfx942 / gfx1100 |
| **Linux x86-64** | 6.2.4    | gfx90a / gfx942 / gfx1100 |
| **Linux x86-64** | 6.3.4    | gfx90a / gfx942 / gfx1100 |
| **Linux x86-64** | 6.4.4    | gfx90a / gfx942 / gfx1100 |
| **Linux x86-64** | 7.0.0    | gfx90a / gfx942 / gfx1100 |

**Windows is not currently supported.** Please see [Preview Wheels](#preview-wheels) for installation instructions.

### Compile from Source[[rocm-compile]]

`bitsandbytes` can be compiled against ROCm 6.1 through ROCm 7.0.

```bash
# Install bitsandbytes from source
# Clone bitsandbytes repo
git clone https://github.com/bitsandbytes-foundation/bitsandbytes.git && cd bitsandbytes/

# Compile & install
apt-get install -y build-essential cmake   # install build tool dependencies, unless already present
cmake -DCOMPUTE_BACKEND=hip -S .           # Use -DBNB_ROCM_ARCH="gfx90a;gfx942" to target specific GPU archs
make
pip install -e .                           # `-e` for "editable" install, when developing BNB (otherwise leave that out)
```
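After building, you can optionally verify that your PyTorch installation is a ROCm (HIP) build and that the GPU is visible. A minimal check, assuming a ROCm-enabled PyTorch, where `torch.version.hip` reports the HIP version and the `torch.cuda` APIs map to the ROCm device:

```bash
# Confirm PyTorch is a ROCm (HIP) build and can see the GPU.
python -c "import torch; print(torch.version.hip, torch.cuda.is_available(), torch.cuda.get_device_name(0))"
```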
## Preview Wheels[[preview-wheels]]

If you would like to use new features even before they are officially released, and help us test them, feel free to install the wheel directly from our CI (*the wheel links will remain stable!*):

```bash
# Note: if you don't want to reinstall our dependencies, append the `--no-deps` flag!

# x86_64 (most users)
pip install --force-reinstall https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-manylinux_2_24_x86_64.whl

# ARM/aarch64
pip install --force-reinstall https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-manylinux_2_24_aarch64.whl
```

```bash
# Note: if you don't want to reinstall our dependencies, append the `--no-deps` flag!

# Windows x86-64
pip install --force-reinstall https://github.com/bitsandbytes-foundation/bitsandbytes/releases/download/continuous-release_main/bitsandbytes-1.33.7.preview-py3-none-win_amd64.whl
```
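To confirm which build is active after installing a preview wheel, you can check the installed distribution and the version string the library reports (the exact output depends on the wheel you installed):

```bash
# Show the installed bitsandbytes distribution and the version the library reports.
pip show bitsandbytes
python -c "import bitsandbytes as bnb; print(bnb.__version__)"
```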