# Build from source

LMDeploy provides prebuilt packages that can be easily installed by `pip install lmdeploy`.

If you need to build lmdeploy from source, please clone the lmdeploy repository from GitHub and follow the instructions in the next sections:

```shell
git clone --depth=1 https://github.com/InternLM/lmdeploy
```

## Build in Docker (recommended)

We highly advise using the provided docker image to build lmdeploy, which avoids complex environment setup.

The docker image is `openmmlab/lmdeploy-builder:cuda11.8`. Make sure that docker is installed before using this image.
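
For example, you can verify that docker is available and, optionally, pull the builder image in advance:

```shell
docker --version
docker pull openmmlab/lmdeploy-builder:cuda11.8
```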

In the root directory of the lmdeploy source code, please run the following command:

```shell
cd lmdeploy # the home folder of lmdeploy source code
bash builder/manywheel/build_all_wheel.sh
```

All the wheel files for lmdeploy under py3.8 - py3.11 will be found in the `builder/manywheel/cuda11.8_dist` directory, for example:

```text
builder/manywheel/cuda11.8_dist/
├── lmdeploy-0.0.12-cp310-cp310-manylinux2014_x86_64.whl
├── lmdeploy-0.0.12-cp311-cp311-manylinux2014_x86_64.whl
├── lmdeploy-0.0.12-cp38-cp38-manylinux2014_x86_64.whl
└── lmdeploy-0.0.12-cp39-cp39-manylinux2014_x86_64.whl
```

If the wheel file for a specific Python version is required, such as py3.8, please execute:

```shell
bash builder/manywheel/build_wheel.sh py38 manylinux2014_x86_64 cuda11.8 cuda11.8_dist
```

The wheel file will be found in the `builder/manywheel/cuda11.8_dist` directory.

You can use `pip install` to install the wheel file that matches the Python version on your host machine.
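
For instance, with Python 3.10 on the host, the command might look like this (the wheel filename is copied from the example listing above; your version tag may differ):

```shell
pip install builder/manywheel/cuda11.8_dist/lmdeploy-0.0.12-cp310-cp310-manylinux2014_x86_64.whl
```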

## Build on localhost (optional)

First, please make sure the gcc version is no less than 9, which can be confirmed by `gcc --version`.
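
For example (the package names below are an assumption about an Ubuntu-like system; adjust for your distribution):

```shell
gcc --version                 # the reported major version should be 9 or higher
# if it is older, install a newer toolchain, e.g. on Ubuntu:
# sudo apt-get install gcc-9 g++-9
```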

Then, follow the steps below to set up the compilation environment:

- install the dependent packages:

  ```shell
  pip install -r requirements.txt
  apt-get install rapidjson-dev
  ```

- install [nccl](https://docs.nvidia.com/deeplearning/nccl/install-guide/index.html), and set environment variables:

  ```shell
  export NCCL_ROOT_DIR=/path/to/nccl/build
  export NCCL_LIBRARIES=/path/to/nccl/build/lib
  ```

- install openmpi from source:

  ```shell
  wget https://download.open-mpi.org/release/open-mpi/v4.1/openmpi-4.1.5.tar.gz
  tar xf openmpi-4.1.5.tar.gz
  cd openmpi-4.1.5
  ./configure
  make -j$(nproc) && make install
  ```

- build and install lmdeploy libraries:

  ```shell
  apt install ninja-build # install ninja
  cd lmdeploy # the home folder of lmdeploy
  mkdir build && cd build
  sh ../generate.sh
  ninja -j$(nproc) && ninja install
  ```

- install lmdeploy python package:

  ```shell
  cd ..
  pip install -e .
  ```
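
As a final sanity check of the editable install (a minimal sketch, assuming the build and installation above completed without errors):

```shell
python -c "import lmdeploy"   # the import should succeed without errors
pip show lmdeploy             # prints metadata of the installed lmdeploy package
```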