Commit 90ce2be9 authored by Gustaf Ahdritz

Describe bfloat16 support in README

parent 29be56e5
@@ -18,9 +18,8 @@ OpenFold is built to support inference with AlphaFold's original JAX weights.
 Try it out with our [Colab notebook](https://colab.research.google.com/github/aqlaboratory/openfold/blob/main/notebooks/OpenFold.ipynb).
 
 Unlike DeepMind's public code, OpenFold is also trainable. It can be trained
-with [DeepSpeed](https://github.com/microsoft/deepspeed) and with mixed
-precision. `bfloat16` training is not currently supported, but will be in the
-future.
+with [DeepSpeed](https://github.com/microsoft/deepspeed) and with both `fp16`
+and `bfloat16` half-precision.
 
 ## Installation (Linux)
@@ -208,6 +207,10 @@ and supports the full range of training options that entails, including
 multi-node distributed training. For more information, consult PyTorch
 Lightning documentation and the `--help` flag of the training script.
 
+Hardware permitting, you can train with `bfloat16` half-precision by passing
+`bf16` as the `--precision` option. If you're using DeepSpeed, make sure to
+enable `bfloat16` in the DeepSpeed config as well.
+
 Note that the data directory can also contain PDB files previously output by
 the model. These are treated as members of the self-distillation set and are
 subjected to distillation-set-only preprocessing steps.
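
For concreteness, a minimal sketch of what the two settings added above look like together. The `bf16` section is DeepSpeed's documented way to enable `bfloat16`; the other keys and the file name `deepspeed_config.json` are illustrative, and the training-script flag names in the command below are assumptions based on this README rather than confirmed by the diff:

```json
{
  "train_micro_batch_size_per_gpu": 1,
  "gradient_accumulation_steps": 1,
  "bf16": {
    "enabled": true
  }
}
```

```bash
# Hypothetical invocation; required data/output arguments elided.
python3 train_openfold.py ... --precision bf16 --deepspeed_config_path deepspeed_config.json
```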