OpenDAS / OpenFold · Commits

Commit a90da395, authored Oct 24, 2023 by jnwei
Parent: 5f5c8f2a

Test flash-attention v0.2.1 on docker CI
Showing 1 changed file with 5 additions and 4 deletions:

environment.yml (+5, -4)
environment.yml @ a90da395

@@ -21,13 +21,14 @@ dependencies:
   - wandb==0.12.21
   - modelcif==0.7
   - awscli
-  - ml-collections
   - bioconda::aria2
   - bioconda::hmmer==3.3.2
   - bioconda::hhsuite==3.3.0
   - bioconda::kalign2==2.04
   - pip:
-      - deepspeed==0.5.10
-      - dm-tree==0.1.6
+      - deepspeed==0.5.10  # can this be updated?
+      - dm-tree==0.1.6  # 0.1.6 yanked from conda-forge - update?
+      - ml-collections==0.1.0  # 0.1.1 is oldest available on conda-forge - update?
       - git+https://github.com/NVIDIA/dllogger.git
-      - git+https://github.com/Dao-AILab/flash-attention.git@5b838a8
+      - git+https://github.com/Dao-AILab/flash-attention.git=0.2.1
+      # - git+https://github.com/Dao-AILab/flash-attention.git@5b838a8
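The flash-attention change swaps a commit-hash pin for a version pin. In pip's `git+` requirement syntax, the revision (a commit hash, branch, or tag) follows the `@` after the repository URL. A minimal sketch of splitting that form apart for `https` URLs (`parse_vcs_requirement` is an illustrative helper of ours, not a pip API):

```python
# Sketch: split a pip VCS requirement of the form
#   git+<repo-url>[@<revision>]
# into the repository URL and the optional pinned revision.
# Only handles git+https URLs; git+ssh URLs embed an extra "@"
# in the host part and would need more care.
def parse_vcs_requirement(req):
    assert req.startswith("git+"), "only git+ URLs handled in this sketch"
    url = req[len("git+"):]
    if "@" in url:
        repo, _, rev = url.rpartition("@")
        return repo, rev
    return url, None  # unpinned: pip will install the default branch

repo, rev = parse_vcs_requirement(
    "git+https://github.com/Dao-AILab/flash-attention.git@5b838a8")
print(repo)  # https://github.com/Dao-AILab/flash-attention.git
print(rev)   # 5b838a8
```

Pinning to a hash like `5b838a8` guarantees a reproducible build; pinning to a release tag is easier to read but relies on the tag never moving.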
Write
Preview
Markdown
is supported
0%
Try again
or
attach a new file
.
Attach a file
Cancel
You are about to add
0
people
to the discussion. Proceed with caution.
Finish editing this message first!
Cancel
Please
register
or
sign in
to comment
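For readers unfamiliar with the file being edited: a conda `environment.yml` lists conda packages (optionally channel-qualified, e.g. `bioconda::hmmer==3.3.2`) as plain strings, plus a single nested `pip:` entry whose packages conda hands off to pip after solving the conda set. A minimal sketch of that split, using entries from the diff above (illustrative code, not conda's implementation):

```python
# Sketch of the "dependencies" structure in an environment.yml:
# plain strings are conda specs; the one dict entry {"pip": [...]}
# holds requirements that are installed by pip afterwards.
dependencies = [
    "wandb==0.12.21",
    "bioconda::hmmer==3.3.2",
    {"pip": [
        "deepspeed==0.5.10",
        "git+https://github.com/NVIDIA/dllogger.git",
    ]},
]

def split_deps(deps):
    """Separate conda specs from the nested pip requirement list."""
    conda, pip = [], []
    for entry in deps:
        if isinstance(entry, dict):
            pip.extend(entry.get("pip", []))
        else:
            conda.append(entry)
    return conda, pip

conda_pkgs, pip_pkgs = split_deps(dependencies)
print(conda_pkgs)  # conda-managed specs, including channel-qualified ones
print(pip_pkgs)    # pip requirements, including VCS (git+...) installs
```

This split is why the commit can move `ml-collections` out of the conda section and re-pin it under `pip:` without touching the conda solve for the remaining packages.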