Commit 35cf5093 authored by fengzch-das (parent e938c184): update readme
![header](imgs/of_banner.png)
_Figure: Comparison of OpenFold and AlphaFold2 predictions to the experimental structure of PDB 7KDX, chain B._
# OpenFold
## 1 Introduction
A faithful but trainable PyTorch reproduction of DeepMind's
[AlphaFold 2](https://github.com/deepmind/alphafold).
OpenFold is a library for training and inference of protein structure prediction models. Its main features are training and inference for both protein monomers and multimers.
## 2 Installation
Supported component combinations:
| PyTorch version | fastpt version | openfold version | DTK version | Python version | Recommended build mode |
| --------------- | -------------- | ---------------- | ----------- | --------------- | ---------------------- |
| 2.5.1 | 2.1.0 | v2.2.0-fastpt | >= 25.04 | 3.8, 3.10, 3.11 | fastpt without transcoding |
| 2.4.1 | 2.0.1 | v2.2.0-fastpt | >= 25.04 | 3.8, 3.10, 3.11 | fastpt without transcoding |
+ When the PyTorch version is 2.4.1 or later and the DTK version is 25.04 or later, building with fastpt without transcoding is recommended.
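The version gate above can be sketched as a small check; this helper is illustrative only (not part of the repository), with the thresholds taken from the compatibility table:

```python
def recommends_no_transcode(torch_version: str, dtk_version: str) -> bool:
    """True if the fastpt no-transcode build is recommended for this
    (PyTorch, DTK) pair, per the compatibility table above."""
    def parse(version: str) -> tuple:
        # "2.5.1" -> (2, 5, 1); "25.04" -> (25, 4)
        return tuple(int(part) for part in version.split(".") if part.isdigit())
    return parse(torch_version) >= (2, 4, 1) and parse(dtk_version) >= (25, 4)

print(recommends_no_transcode("2.5.1", "25.04"))  # True
```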
### 2.1 Installing with pip
Download the openfold whl packages from the [光合开发者社区](https://download.sourcefind.cn:65024/4/main), choosing the openfold whl that matches your PyTorch and Python versions:
```shell
pip install torch*            # the downloaded torch whl package
pip install fastpt* --no-deps # the downloaded fastpt whl package
source /usr/local/bin/fastpt -E
pip install openfold*         # the downloaded openfold-fastpt whl package
```
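A quick way to confirm the install succeeded is to check that each package can be located; a minimal sketch (package names assumed from the pip steps above):

```python
import importlib.util

def missing_packages(names):
    """Return the names that cannot be found as importable packages."""
    return [name for name in names if importlib.util.find_spec(name) is None]

# After the steps above, this should print an empty list.
print(missing_packages(["torch", "fastpt", "openfold"]))
```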
### 2.2 Installing from source
#### 2.2.1 Preparing the build environment
Builds are based on fastpt without transcoding; two environment options are available:
1. Use a 光合开发者社区 PyTorch base image: download it from the [光合开发者社区](https://sourcefind.cn/#/image/dcu/pytorch), choosing the image that matches your PyTorch, Python, DTK, and OS versions.
2. Use an existing Python environment: download the PyTorch and fastpt whl packages from the [光合开发者社区](https://sourcefind.cn/#/image/dcu/pytorch) that match your Python and DTK versions, then install them as follows:
```shell
pip install torch*            # the downloaded torch whl package; install torch first
pip install fastpt* --no-deps # the downloaded fastpt whl package; install it after torch
pip install setuptools==59.5.0 wheel pytest biopython dm-tree==0.1.6 ml-collections modelcif
```
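The dependency list above mixes exact pins (`==`) with unpinned names; a small illustrative parser (not part of the repository) that splits each requirement:

```python
def parse_requirement(spec: str):
    """Split a 'pkg==version' pin into (name, version); unpinned
    requirements yield (name, None)."""
    name, _, version = spec.partition("==")
    return name, version or None

reqs = "setuptools==59.5.0 wheel pytest biopython dm-tree==0.1.6 ml-collections modelcif"
for spec in reqs.split():
    print(parse_requirement(spec))
```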
#### 2.2.2 Building and installing from source
- Download the source code:
```shell
git clone http://developer.sourcefind.cn/codes/OpenDAS/openfold.git # switch to the branch your build requires
```
- Two build methods are provided (run them from inside the openfold directory):
- Build and install directly from source:
```bash
source /usr/local/bin/fastpt -C
python setup.py build
python setup.py install
```
- Build a whl package and install it:
```bash
source /usr/local/bin/fastpt -C
python setup.py bdist_wheel # builds the whl package into dist/
pip install dist/openfold-2.2.0-cp310-cp310-linux_x86_64.whl
```
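The wheel filename in the last step encodes the CPython it was built for (cp310 = CPython 3.10); a hypothetical helper for checking the tag against the running interpreter before installing:

```python
import sys

def wheel_python_tag(wheel_name: str) -> str:
    """Extract the python tag from a wheel filename, which follows
    name-version-pythontag-abitag-platform.whl."""
    return wheel_name[:-len(".whl")].split("-")[2]

def matches_interpreter(wheel_name: str) -> bool:
    """True if the wheel was built for the interpreter running this check."""
    current = f"cp{sys.version_info.major}{sys.version_info.minor}"
    return wheel_python_tag(wheel_name) == current

print(wheel_python_tag("openfold-2.2.0-cp310-cp310-linux_x86_64.whl"))  # cp310
```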
#### 2.2.3 Notes
+ ROCM_PATH is the path to dtk; the default is /opt/dtk;
+ Building under PyTorch 2.5.1 requires C++17 support: open setup.py and change -std=c++14 to -std=c++17.
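The setup.py edit in the last note can also be scripted; a minimal sketch, assuming the file carries the flag exactly as -std=c++14:

```python
from pathlib import Path

def bump_cxx_standard(text: str) -> str:
    """Replace the C++14 flag with C++17, as needed for PyTorch 2.5.1 builds."""
    return text.replace("-std=c++14", "-std=c++17")

# Usage, run from the openfold source directory:
# setup_py = Path("setup.py")
# setup_py.write_text(bump_cxx_standard(setup_py.read_text()))
```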
## 3 Verification
- Run the following commands to test the component:
```bash
source /usr/local/bin/fastpt -E
pytest -vs ./tests
```
## 4 Known Issues
- None at this time.
## 5 References
- [README_ORIGIN](README_ORIGIN.md)
- [https://github.com/aqlaboratory/openfold.git](https://github.com/aqlaboratory/openfold.git)
# Documentation
See our new home for docs at [openfold.readthedocs.io](https://openfold.readthedocs.io/en/latest/), with instructions for installation and model inference/training.
Much of the content from this page may be found [here.](https://github.com/aqlaboratory/openfold/blob/main/docs/source/original_readme.md)
## Copyright Notice
While AlphaFold's and, by extension, OpenFold's source code is licensed under
the permissive Apache Licence, Version 2.0, DeepMind's pretrained parameters
fall under the CC BY 4.0 license, a copy of which is downloaded to
`openfold/resources/params` by the installation script. Note that the latter
replaces the original, more restrictive CC BY-NC 4.0 license as of January 2022.
## Contributing
If you encounter problems using OpenFold, feel free to create an issue! We also
welcome pull requests from the community.
## Citing this Work
Please cite our paper:
```bibtex
@article {Ahdritz2022.11.20.517210,
author = {Ahdritz, Gustaf and Bouatta, Nazim and Floristean, Christina and Kadyan, Sachin and Xia, Qinghui and Gerecke, William and O{\textquoteright}Donnell, Timothy J and Berenberg, Daniel and Fisk, Ian and Zanichelli, Niccolò and Zhang, Bo and Nowaczynski, Arkadiusz and Wang, Bei and Stepniewska-Dziubinska, Marta M and Zhang, Shang and Ojewole, Adegoke and Guney, Murat Efe and Biderman, Stella and Watkins, Andrew M and Ra, Stephen and Lorenzo, Pablo Ribalta and Nivon, Lucas and Weitzner, Brian and Ban, Yih-En Andrew and Sorger, Peter K and Mostaque, Emad and Zhang, Zhao and Bonneau, Richard and AlQuraishi, Mohammed},
title = {{O}pen{F}old: {R}etraining {A}lpha{F}old2 yields new insights into its learning mechanisms and capacity for generalization},
elocation-id = {2022.11.20.517210},
year = {2022},
doi = {10.1101/2022.11.20.517210},
publisher = {Cold Spring Harbor Laboratory},
URL = {https://www.biorxiv.org/content/10.1101/2022.11.20.517210},
eprint = {https://www.biorxiv.org/content/early/2022/11/22/2022.11.20.517210.full.pdf},
journal = {bioRxiv}
}
```
If you use OpenProteinSet, please also cite:
```bibtex
@misc{ahdritz2023openproteinset,
title={{O}pen{P}rotein{S}et: {T}raining data for structural biology at scale},
author={Gustaf Ahdritz and Nazim Bouatta and Sachin Kadyan and Lukas Jarosch and Daniel Berenberg and Ian Fisk and Andrew M. Watkins and Stephen Ra and Richard Bonneau and Mohammed AlQuraishi},
year={2023},
eprint={2308.05326},
archivePrefix={arXiv},
primaryClass={q-bio.BM}
}
```
Any work that cites OpenFold should also cite [AlphaFold](https://www.nature.com/articles/s41586-021-03819-2) and [AlphaFold-Multimer](https://www.biorxiv.org/content/10.1101/2021.10.04.463034v1) if applicable.