Commit a90da395 authored by jnwei

Test flash-attention v0.2.1 on docker CI

parent 5f5c8f2a
@@ -21,13 +21,14 @@ dependencies:
   - wandb==0.12.21
   - modelcif==0.7
   - awscli
-  - ml-collections
   - bioconda::aria2
   - bioconda::hmmer==3.3.2
   - bioconda::hhsuite==3.3.0
   - bioconda::kalign2==2.04
   - pip:
-      - deepspeed==0.5.10 # can this be updated?
-      - dm-tree==0.1.6 # 0.1.6 yanked from conda-forge - update?
+      - deepspeed==0.5.10
+      - dm-tree==0.1.6
+      - ml-collections==0.1.0 # 0.1.1 is oldest available on conda-forge - update?
       - git+https://github.com/NVIDIA/dllogger.git
-      - git+https://github.com/Dao-AILab/flash-attention.git@5b838a8
+      - git+https://github.com/Dao-AILab/flash-attention.git@v0.2.1
+      # - git+https://github.com/Dao-AILab/flash-attention.git@5b838a8
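The change above swaps a commit-hash pin for a tag pin: pip installs a git requirement at whatever ref follows the `@` in the URL (a tag such as `v0.2.1`, a branch, or a commit hash), which is why `flash-attention.git=0.2.1` would not be a valid spec. A minimal sketch of that convention, using a hypothetical `split_vcs_requirement` helper (not part of pip or this repo):

```python
def split_vcs_requirement(req: str):
    """Split a pip VCS requirement into (repo URL, pinned ref).

    pip pins git installs with an "@<ref>" suffix after the URL;
    without one, pip builds from the default branch.
    """
    url, sep, ref = req.rpartition("@")
    # No "@", or the "@" belongs to the URL itself (e.g. git+ssh://git@host/...):
    # treat the requirement as unpinned.
    if not sep or "/" in ref:
        return req, None
    return url, ref

# Pinned to the v0.2.1 tag, as in the new line of the diff:
print(split_vcs_requirement("git+https://github.com/Dao-AILab/flash-attention.git@v0.2.1"))
# Unpinned, as in the dllogger line:
print(split_vcs_requirement("git+https://github.com/NVIDIA/dllogger.git"))
```

Pinning to a tag rather than a bare hash keeps the environment reproducible while making the intended version obvious at a glance.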