@@ -19,7 +19,7 @@ and we've trained it from scratch, matching the performance of the original.
 We've publicly released model weights and our training data — some 400,000
 MSAs and PDB70 template hit files — under a permissive license. Model weights
 are available via scripts in this repository while the MSAs are hosted by the
-[Registry of Open Data on AWS (RODA)](registry.opendata.aws/openfold).
+[Registry of Open Data on AWS (RODA)](https://registry.opendata.aws/openfold).
 Try out running inference for yourself with our [Colab notebook](https://colab.research.google.com/github/aqlaboratory/openfold/blob/main/notebooks/OpenFold.ipynb).
 OpenFold also supports inference using AlphaFold's official parameters.