![header](imgs/OpenFold_viz_banner.jpg)

# OpenFold

A faithful PyTorch reproduction of DeepMind's 
[AlphaFold 2](https://github.com/deepmind/alphafold).

## Features

OpenFold carefully reproduces (almost) all of the features of the original open
source inference code (v2.0.1). The sole exception is model ensembling, which 
fared poorly in DeepMind's own ablation testing and is being phased out in future
DeepMind experiments. It is omitted here for the sake of reducing clutter. In 
cases where the *Nature* paper differs from the source, we always defer to the 
latter. 

OpenFold is built to support inference with AlphaFold's original JAX weights.
Try it out with our [Colab notebook](https://colab.research.google.com/github/aqlaboratory/openfold/blob/main/notebooks/OpenFold.ipynb).

Unlike DeepMind's public code, OpenFold is also trainable. It can be trained 
with [DeepSpeed](https://github.com/microsoft/deepspeed) and with mixed 
precision. `bfloat16` training is not currently supported, but will be in the 
future.

## Installation (Linux)

Python dependencies available through `pip` are provided in `requirements.txt`. 
OpenFold depends on `openmm==7.5.1` and `pdbfixer`, which are only available 
via `conda`. For producing sequence alignments, you'll also need
`kalign`, the [HH-suite](https://github.com/soedinglab/hh-suite), and one of 
{`jackhmmer`, [MMseqs2](https://github.com/soedinglab/mmseqs2)} installed
on your system. Finally, some download scripts require `aria2c`.
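
If you'd prefer to install these yourself, the `conda`-only dependencies can
be set up along the following lines (a sketch assuming the `conda-forge` and
`bioconda` channels; exact package names may differ on your system):

```bash
# conda-forge hosts openmm/pdbfixer; bioconda hosts the alignment tools
conda install -c conda-forge openmm=7.5.1 pdbfixer aria2
conda install -c bioconda kalign2 hhsuite hmmer
```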

For convenience, we provide a script that installs Miniconda locally, creates a 
`conda` virtual environment, installs all Python dependencies, and downloads
useful resources (including DeepMind's pretrained parameters). Run:

```bash
scripts/install_third_party_dependencies.sh
```

To activate the environment, run:

```bash
source scripts/activate_conda_env.sh
```

To deactivate it, run:

```bash
source scripts/deactivate_conda_env.sh
```

To install the HH-suite to `/usr/bin`, run (root privileges are required to
write there):

```bash
sudo scripts/install_hh_suite.sh
```

## Usage

To download DeepMind's pretrained parameters and common ground truth data, run:

```bash
scripts/download_data.sh data/
```

You have two choices for downloading protein databases, depending on whether
you want to use DeepMind's MSA generation pipeline (with HMMER & HHblits) or
[ColabFold](https://github.com/sokrypton/ColabFold)'s, which uses the faster
MMseqs2 instead. For the former, run:

```bash
scripts/download_alphafold_databases.sh data/
```

For the latter, run:

```bash
scripts/download_mmseqs_databases.sh data/    # downloads .tar files
scripts/prep_mmseqs_databases.sh data/        # unpacks and preps the databases
```

Make sure to run the latter command on the machine that will be used for MSA
generation (the script estimates how the precomputed database index used by
MMseqs2 should be split according to the memory available on the system).

Alternatively, you can use raw MSAs from 
[ProteinNet](https://github.com/aqlaboratory/proteinnet). After downloading
the database, use `scripts/prepare_proteinnet_msas.py` to convert the data into
a format recognized by the OpenFold parser. The resulting directory becomes the
`alignment_dir` used in subsequent steps. Use `scripts/unpack_proteinnet.py` to
extract `.core` files from ProteinNet text files.
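
A sketch of that workflow (the argument order is an assumption, and
`training_90` is only an example ProteinNet file; check each script's
`--help` for the actual interface):

```bash
# Hypothetical invocations -- verify arguments with --help before running
python3 scripts/unpack_proteinnet.py proteinnet/training_90 cores/
python3 scripts/prepare_proteinnet_msas.py cores/ alignment_dir/
```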

For both inference and training, the model's hyperparameters can be tuned in
`openfold/config.py`. Of course, if you plan to perform inference using
DeepMind's pretrained parameters, you will only be able to make changes that
do not affect the shapes of model parameters. For an example of initializing
the model, consult `run_pretrained_openfold.py`.

### Inference

To run inference on one or more sequences using a set of DeepMind's
pretrained parameters, run e.g.:

```bash
python3 run_pretrained_openfold.py \
    target.fasta \
    data/uniref90/uniref90.fasta \
    data/mgnify/mgy_clusters_2018_12.fa \
    data/pdb70/pdb70 \
    data/pdb_mmcif/mmcif_files/ \
    data/uniclust30/uniclust30_2018_08/uniclust30_2018_08 \
    --output_dir ./ \
    --bfd_database_path data/bfd/bfd_metaclust_clu_complete_id30_c90_final_seq.sorted_opt \
    --model_device cuda:1 \
    --jackhmmer_binary_path lib/conda/envs/openfold_venv/bin/jackhmmer \
    --hhblits_binary_path lib/conda/envs/openfold_venv/bin/hhblits \
    --hhsearch_binary_path lib/conda/envs/openfold_venv/bin/hhsearch \
    --kalign_binary_path lib/conda/envs/openfold_venv/bin/kalign
```

where `data` is the same directory as in the previous step. If `jackhmmer`, 
`hhblits`, `hhsearch` and `kalign` are available at the default path of 
`/usr/bin`, their `binary_path` command-line arguments can be dropped.
If you've already computed alignments for the query, you can skip the
expensive alignment computation here.
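
For instance, assuming the script exposes a `--use_precomputed_alignments`
flag that points at a directory of existing alignments (confirm with
`python3 run_pretrained_openfold.py --help`):

```bash
python3 run_pretrained_openfold.py \
    target.fasta \
    data/uniref90/uniref90.fasta \
    data/mgnify/mgy_clusters_2018_12.fa \
    data/pdb70/pdb70 \
    data/pdb_mmcif/mmcif_files/ \
    data/uniclust30/uniclust30_2018_08/uniclust30_2018_08 \
    --output_dir ./ \
    --bfd_database_path data/bfd/bfd_metaclust_clu_complete_id30_c90_final_seq.sorted_opt \
    --model_device cuda:1 \
    --use_precomputed_alignments alignment_dir/
```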

### Training

After activating the OpenFold environment with 
`source scripts/activate_conda_env.sh`, install OpenFold by running

```bash
python setup.py install
```

To train the model, you will first need to precompute protein alignments. 

You have two options. You can use the same procedure DeepMind used by running
the following:

```bash
python3 scripts/precompute_alignments.py mmcif_dir/ alignment_dir/ \
    data/uniref90/uniref90.fasta \
    data/mgnify/mgy_clusters_2018_12.fa \
    data/pdb70/pdb70 \
    data/pdb_mmcif/mmcif_files/ \
    data/uniclust30/uniclust30_2018_08/uniclust30_2018_08 \
    --bfd_database_path data/bfd/bfd_metaclust_clu_complete_id30_c90_final_seq.sorted_opt \
    --cpus 16 \
    --jackhmmer_binary_path lib/conda/envs/openfold_venv/bin/jackhmmer \
    --hhblits_binary_path lib/conda/envs/openfold_venv/bin/hhblits \
    --hhsearch_binary_path lib/conda/envs/openfold_venv/bin/hhsearch \
    --kalign_binary_path lib/conda/envs/openfold_venv/bin/kalign
```

As noted before, you can skip the `binary_path` arguments if these binaries are 
at `/usr/bin`. Expect this step to take a very long time, even for small 
numbers of proteins.

Alternatively, you can generate MSAs with the ColabFold pipeline (and
templates with HHsearch) by running:

```bash
python3 scripts/precompute_alignments_mmseqs.py input.fasta \
    data/mmseqs_dbs \
    uniref30_2103_db \
    alignment_dir \
    ~/MMseqs2/build/bin/mmseqs \
    /usr/bin/hhsearch \
    --env_db colabfold_envdb_202108_db \
    --pdb70 data/pdb70/pdb70
```

where `input.fasta` is a FASTA file containing one or more query sequences. To 
generate an input FASTA from a directory of mmCIF and/or ProteinNet `.core`
files, we provide `scripts/data_dir_to_fasta.py`.
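
A possible invocation (the argument order here is an assumption; check the
script's `--help`):

```bash
# Hypothetical arguments -- verify with --help
python3 scripts/data_dir_to_fasta.py data_dir/ input.fasta
```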

Next, generate a cache of certain datapoints in the mmCIF files:

```bash
python3 scripts/generate_mmcif_cache.py \
    mmcif_dir/ \
    mmcif_cache.json \
    --no_workers 16
```

This cache is used to minimize the number of mmCIF parses performed during 
training-time data preprocessing. Finally, call the training script:

```bash
python3 train_openfold.py mmcif_dir/ alignment_dir/ template_mmcif_dir/ \
    2021-10-10 \
    --template_release_dates_cache_path mmcif_cache.json \
    --precision 16 \
    --gpus 8 --replace_sampler_ddp=True \
    --seed 42 \
    --deepspeed_config_path deepspeed_config.json \
    --resume_from_ckpt ckpt_dir/
```

where `--template_release_dates_cache_path` is a path to the `.json` file
generated in the previous step. Note that in multi-GPU settings, the `--seed`
argument must be specified. A suitable DeepSpeed configuration file can be
generated with `scripts/build_deepspeed_config.py` (see the sketch below).
The training script is written with
[PyTorch Lightning](https://github.com/PyTorchLightning/pytorch-lightning)
and supports the full range of training options that entails, including
multi-node distributed training. For more information, consult the PyTorch
Lightning documentation and the `--help` flag of the training script.
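
A sketch of generating that configuration file (the output-path argument is
an assumption; check the script's `--help`):

```bash
# Hypothetical argument -- verify with --help
python3 scripts/build_deepspeed_config.py deepspeed_config.json
```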

Note that the data directory can also contain PDB files previously output by
the model. These are treated as members of the self-distillation set and are
subjected to distillation-set-only preprocessing steps.

## Testing

To run unit tests, use

```bash
scripts/run_unit_tests.sh
```

The script is a thin wrapper around Python's `unittest` suite, and recognizes
`unittest` arguments. E.g., to run a specific test verbosely:

```bash
scripts/run_unit_tests.sh -v tests.test_model
```

Certain tests require that AlphaFold (v2.0.1) be installed in the same Python
environment. These run components of AlphaFold and OpenFold side by side and
ensure that output activations are adequately similar. For most modules, we
target a maximum pointwise difference of `1e-4`.

## Copyright notice

While AlphaFold's and, by extension, OpenFold's source code is licensed under
the permissive Apache License, Version 2.0, DeepMind's pretrained parameters
remain under the more restrictive CC BY-NC 4.0 license. A copy of those
parameters is downloaded to `openfold/resources/params` by the installation
script; they are thereby unavailable for commercial use.

## Contributing

If you encounter problems using OpenFold, feel free to create an issue! We also
welcome pull requests from the community.

## Citing this work

Stay tuned for an OpenFold DOI.