gaoqiong / flash-attention · Commits

Commit f8dccfc9, authored Aug 14, 2023 by Tri Dao

[CI] Fix MATRIX_CUDA_VERSION check

parent 9c531bdc
Showing 2 changed files with 2 additions and 2 deletions

.github/workflows/publish.yml  +1 −1
flash_attn/__init__.py         +1 −1
.github/workflows/publish.yml

@@ -142,7 +142,7 @@ jobs:
 ...
         export PATH=/usr/local/nvidia/bin:/usr/local/nvidia/lib64:$PATH
         export LD_LIBRARY_PATH=/usr/local/nvidia/lib64:/usr/local/cuda/lib64:$LD_LIBRARY_PATH
         # Currently for this setting the runner goes OOM if we pass --threads 4 to nvcc
-        if [[ ${MATRIX_CUDA_VERSION} == "12.1" && ${MATRIX_TORCH_VERSION} == "2.1" ]]; then
+        if [[ ${MATRIX_CUDA_VERSION} == "121" && ${MATRIX_TORCH_VERSION} == "2.1" ]]; then
           export FLASH_ATTENTION_FORCE_SINGLE_THREAD="TRUE"
         fi
         # Limit MAX_JOBS otherwise the github runner goes OOM
 ...
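The change above makes sense if, as is common in such matrix setups, the workflow derives MATRIX_CUDA_VERSION from a dotted matrix entry (e.g. "12.1") by stripping the dot, so the stored value is "121" and a comparison against "12.1" can never match. A minimal sketch of that assumed derivation (the `cuda_version` variable here is illustrative, not taken from the workflow):

```shell
#!/bin/bash
# Hypothetical matrix entry, as it might appear in the workflow matrix.
cuda_version="12.1"

# Assumed derivation: drop the dot, "12.1" -> "121".
MATRIX_CUDA_VERSION=$(echo "$cuda_version" | awk -F '.' '{print $1 $2}')
echo "$MATRIX_CUDA_VERSION"

# With the value stored as "121", only the corrected comparison matches.
if [[ ${MATRIX_CUDA_VERSION} == "121" ]]; then
  echo "match: forcing single-threaded nvcc"
fi
```

Under this assumption, the old check `== "12.1"` silently failed, so FLASH_ATTENTION_FORCE_SINGLE_THREAD was never set and the runner could go OOM on that matrix cell.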
flash_attn/__init__.py

-__version__ = "2.0.6.post1"
+__version__ = "2.0.6.post2"

 from flash_attn.flash_attn_interface import flash_attn_func
 from flash_attn.flash_attn_interface import flash_attn_kvpacked_func
 ...