Commit d779bd53 authored by Shaden Smith, committed by GitHub

Installation documentation updates. (#525)

* Adds torch install requirement to documentation.

* build ops documentation
parent ca9ab120
@@ -72,11 +72,14 @@ just-in-time (JIT) using [torch's JIT C++ extension loader that relies on
ninja](https://pytorch.org/docs/stable/cpp_extension.html) to build and
dynamically link them at runtime.
**Note:** [PyTorch](https://pytorch.org/) must be installed _before_ installing
DeepSpeed.
```bash
pip install deepspeed
```
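If PyTorch is not already present, a minimal sequence for a fresh environment might look like the following (a sketch only; it assumes the default PyPI wheels are suitable for your setup):

```bash
# Install the PyTorch prerequisite first (choose the build matching your system),
# then install DeepSpeed itself.
pip install torch
pip install deepspeed
```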
-After installation you can validate your install and see which extensions/ops
After installation, you can validate your install and see which extensions/ops
your machine is compatible with via the DeepSpeed environment report.
```bash
......
@@ -11,11 +11,15 @@ just-in-time (JIT) using [torch's JIT C++ extension loader that relies on
ninja](https://pytorch.org/docs/stable/cpp_extension.html) to build and
dynamically link them at runtime.
**Note:** [PyTorch](https://pytorch.org/) must be installed _before_ installing
DeepSpeed.
{: .notice--info}
```bash
pip install deepspeed
```
-After installation you can validate your install and see which ops your machine
After installation, you can validate your install and see which ops your machine
is compatible with via the DeepSpeed environment report with `ds_report` or
`python -m deepspeed.env_report`. We've found this report useful when debugging
DeepSpeed install or compatibility issues.
@@ -24,23 +28,6 @@ DeepSpeed install or compatibility issues.
ds_report
```
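The same report is also available through the module entry point mentioned above, which can be convenient if the `ds_report` console script is not on your `PATH`:

```bash
# Identical environment report, invoked as a Python module instead of the console script.
python -m deepspeed.env_report
```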
-## Install DeepSpeed from source
-After cloning the DeepSpeed repo from github you can install DeepSpeed in
-JIT mode via pip (see below). This install should complete
-quickly since it is not compiling any C++/CUDA source files.
-```bash
-pip install .
-```
-For installs spanning multiple nodes we find it useful to install DeepSpeed
-using the
-[install.sh](https://github.com/microsoft/DeepSpeed/blob/master/install.sh)
-script in the repo. This will build a python wheel locally and copy it to all
-the nodes listed in your hostfile (either given via --hostfile, or defaults to
-/job/hostfile).
## Pre-install DeepSpeed Ops
Sometimes we have found it useful to pre-install either some or all DeepSpeed
@@ -53,23 +40,50 @@ want to attempt to install all of our ops by setting the `DS_BUILD_OPS`
environment variable to 1, for example:
```bash
-DS_BUILD_OPS=1 pip install .
DS_BUILD_OPS=1 pip install deepspeed
```
DeepSpeed will only install the ops that are compatible with your machine.
For more details on which ops are compatible with your system, please try our
`ds_report` tool described above.
If you want to install only a specific op (e.g., FusedLamb), you can toggle
individual ops with `DS_BUILD` environment variables at installation time. For
example, to install DeepSpeed with only the FusedLamb op, use:
```bash
DS_BUILD_FUSED_LAMB=1 pip install deepspeed
```
-We will only install any ops that are compatible with your machine, for more
-details on which ops are compatible with your system please try our `ds_report`
-tool described above.
Available `DS_BUILD` options include:
* `DS_BUILD_OPS` toggles all ops
* `DS_BUILD_CPU_ADAM` builds the CPUAdam op
* `DS_BUILD_FUSED_ADAM` builds the FusedAdam op (from [apex](https://github.com/NVIDIA/apex))
* `DS_BUILD_FUSED_LAMB` builds the FusedLamb op
* `DS_BUILD_SPARSE_ATTN` builds the sparse attention op
* `DS_BUILD_TRANSFORMER` builds the transformer op
* `DS_BUILD_STOCHASTIC_TRANSFORMER` builds the stochastic transformer op
* `DS_BUILD_UTILS` builds various optimized utilities
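These flags can also be combined to pre-build a specific subset of ops; for example (a sketch only, any op left out simply falls back to the default JIT build at runtime):

```bash
# Pre-compile just the FusedAdam and FusedLamb optimizer ops during installation;
# all other compatible ops remain available via JIT compilation at runtime.
DS_BUILD_FUSED_ADAM=1 DS_BUILD_FUSED_LAMB=1 pip install deepspeed
```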
-If you want to install only a specific op (e.g., FusedLamb) you can view the op
-specific build environment variable (set as `BUILD_VAR`) in the corresponding
-op builder class in the
-[op\_builder](https://github.com/microsoft/DeepSpeed/tree/master/op_builder)
-directory. For example to install only the Fused Lamb op you would install via:
## Install DeepSpeed from source
After cloning the DeepSpeed repo from GitHub, you can install DeepSpeed in
JIT mode via pip (see below). This install should complete
quickly since it is not compiling any C++/CUDA source files.
```bash
-DS_BUILD_FUSED_LAMB=1 pip install .
pip install .
```
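Putting the steps together, a from-source JIT install might look like the following (a sketch; the clone URL comes from the repository links above, and the checkout directory name is assumed):

```bash
# Clone the DeepSpeed repository and install it in JIT mode;
# no C++/CUDA sources are compiled at this point.
git clone https://github.com/microsoft/DeepSpeed.git
cd DeepSpeed
pip install .
```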
For installs spanning multiple nodes, we find it useful to install DeepSpeed
using the
[install.sh](https://github.com/microsoft/DeepSpeed/blob/master/install.sh)
script in the repo. This will build a Python wheel locally and copy it to all
of the nodes listed in your hostfile (either given via `--hostfile`, or
defaulting to `/job/hostfile`).
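As a rough sketch of that multi-node flow (the hostfile format shown and the reliance on the default `/job/hostfile` location are assumptions; check the script itself for its current options):

```bash
# Hypothetical hostfile listing the nodes that should receive the wheel
# (assumed format: one "hostname slots=N" entry per line).
cat /job/hostfile
#   worker-1 slots=8
#   worker-2 slots=8

# From the repo root: build the wheel locally and copy/install it on every node
# listed in the default hostfile, as described above.
bash install.sh
```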
## Feature specific dependencies
Some DeepSpeed features require specific dependencies outside of the general
......