---
title: "Installation"
format:
  html:
    toc: true
    toc-depth: 3
    number-sections: true
execute:
  enabled: false
---

This guide covers all the ways you can install and set up Axolotl for your environment.

## Requirements {#sec-requirements}

- NVIDIA GPU (Ampere architecture or newer for `bf16` and Flash Attention) or AMD GPU
- Python ≥3.10
- PyTorch ≥2.4.1
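
If you want to confirm these minimums programmatically, a small version check like the following works (a sketch; the thresholds mirror the list above, and the helper name is ours, not part of Axolotl):

```{.python}
import sys

# Documented minimums: Python >= 3.10, PyTorch >= 2.4.1.
def meets_requirements(python_version: str, torch_version: str) -> bool:
    """Compare dotted version strings against the documented minimums."""
    def parse(v: str) -> tuple:
        return tuple(int(p) for p in v.split(".")[:3])
    return parse(python_version) >= (3, 10) and parse(torch_version) >= (2, 4, 1)

print(meets_requirements(f"{sys.version_info.major}.{sys.version_info.minor}", "2.4.1"))
```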

## Installation Methods {#sec-installation-methods}

::: {.callout-important}
Please make sure PyTorch is installed before installing Axolotl in your local environment.

Follow the instructions at: [https://pytorch.org/get-started/locally/](https://pytorch.org/get-started/locally/)
:::

::: {.callout-important}
For Blackwell GPUs, please use PyTorch 2.7.0 and CUDA 12.8.
:::

### PyPI Installation (Recommended) {#sec-pypi}

```{.bash}
pip3 install -U packaging setuptools wheel ninja
pip3 install --no-build-isolation axolotl[flash-attn,deepspeed]
```

We use `--no-build-isolation` so that the build can detect an already-installed
PyTorch and avoid clobbering it, and so that dependency versions specific to
that PyTorch build (and other installed co-dependencies) are resolved correctly.
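
Since the build resolves against whatever PyTorch is already present, it can help to check it first. A minimal probe (the helper and fallback message are illustrative, not part of Axolotl):

```{.python}
def installed_torch_version():
    """Return the installed PyTorch version string, or None if torch is absent."""
    try:
        import torch
        return torch.__version__
    except ImportError:
        return None

print(installed_torch_version() or "PyTorch not installed -- install it before Axolotl")
```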

### Edge/Development Build {#sec-edge-build}

For the latest features between releases:

```{.bash}
git clone https://github.com/axolotl-ai-cloud/axolotl.git
cd axolotl
pip3 install -U packaging setuptools wheel ninja
pip3 install --no-build-isolation -e '.[flash-attn,deepspeed]'
```

### Docker {#sec-docker}

```{.bash}
docker run --gpus '"all"' --rm -it axolotlai/axolotl:main-latest
```

For development with Docker:

```{.bash}
docker compose up -d
```

::: {.callout-tip}
### Advanced Docker Configuration
```{.bash}
docker run --privileged --gpus '"all"' --shm-size 10g --rm -it \
  --name axolotl --ipc=host \
  --ulimit memlock=-1 --ulimit stack=67108864 \
  --mount type=bind,src="${PWD}",target=/workspace/axolotl \
  -v ${HOME}/.cache/huggingface:/root/.cache/huggingface \
  axolotlai/axolotl:main-latest
```
:::

::: {.callout-important}
For Blackwell GPUs, please use `axolotlai/axolotl:main-py3.11-cu128-2.7.0` or the cloud variant `axolotlai/axolotl-cloud:main-py3.11-cu128-2.7.0`.
:::

Please refer to the [Docker documentation](docker.qmd) for more information on the different Docker images that are available.

## Cloud Environments {#sec-cloud}

### Cloud GPU Providers {#sec-cloud-gpu}

For providers supporting Docker:

- Use `axolotlai/axolotl-cloud:main-latest`
- Available on:
  - [Latitude.sh](https://latitude.sh/blueprint/989e0e79-3bf6-41ea-a46b-1f246e309d5c)
  - [JarvisLabs.ai](https://jarvislabs.ai/templates/axolotl)
  - [RunPod](https://runpod.io/gsc?template=v2ickqhz9s&ref=6i7fkpdz)
  - [Novita](https://novita.ai/gpus-console?templateId=311)

### Google Colab {#sec-colab}

Use our [example notebook](../examples/colab-notebooks/colab-axolotl-example.ipynb).

## Platform-Specific Instructions {#sec-platform-specific}

### macOS {#sec-macos}

```{.bash}
pip3 install --no-build-isolation -e '.'
```

See @sec-troubleshooting for Mac-specific issues.

### Windows {#sec-windows}

::: {.callout-important}
We recommend using WSL2 (Windows Subsystem for Linux) or Docker.
:::

## Environment Managers {#sec-env-managers}

### Conda/Pip venv {#sec-conda}

1. Install Python ≥3.10
2. Install PyTorch: https://pytorch.org/get-started/locally/
3. Install Axolotl:
   ```{.bash}
   pip3 install -U packaging setuptools wheel ninja
   pip3 install --no-build-isolation -e '.[flash-attn,deepspeed]'
   ```
4. (Optional) Login to Hugging Face:
   ```{.bash}
   huggingface-cli login
   ```

## Troubleshooting {#sec-troubleshooting}

If you encounter installation issues, see our [FAQ](faq.qmd) and [Debugging Guide](debugging.qmd).
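
When reporting an installation issue, a short environment summary speeds up triage. Something like this collects the basics (a sketch, not an official Axolotl tool):

```{.python}
import platform
import sys

def env_report() -> dict:
    """Collect basic environment facts relevant to installation problems."""
    info = {
        "python": sys.version.split()[0],
        "platform": platform.platform(),
    }
    try:
        import torch
        info["torch"] = torch.__version__
        info["cuda_available"] = str(torch.cuda.is_available())
    except ImportError:
        info["torch"] = "not installed"
    return info

for key, value in env_report().items():
    print(f"{key}: {value}")
```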