gaoqiong / flash-attention · Commit 757058d4

Update Cutlass to v3.2.0

Authored Aug 27, 2023 by Tri Dao
Parent: 9f42cb6e
Showing 3 changed files with 4 additions and 4 deletions:

  csrc/cutlass             +1 -1
  flash_attn/__init__.py   +1 -1
  training/Dockerfile      +2 -2
csrc/cutlass @ 3a8f57a3 (compare 6f474202...3a8f57a3)

-Subproject commit 6f47420213f757831fae65c686aa471749fa8d60
+Subproject commit 3a8f57a3c89cfff7aa686e95f13d9ad850f61898
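
After checking out this commit, the submodule pointer has to be synced locally before rebuilding; a minimal sketch using standard git commands, assuming a working checkout of this repository:

  # Bring the Cutlass submodule to the commit pinned above.
  git checkout 757058d4
  git submodule sync csrc/cutlass
  git submodule update --init --recursive csrc/cutlass
  # Should print 3a8f57a3c89cfff7aa686e95f13d9ad850f61898:
  git -C csrc/cutlass rev-parse HEAD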
flash_attn/__init__.py

-__version__ = "2.1.0"
+__version__ = "2.1.1"
 from flash_attn.flash_attn_interface import (
     flash_attn_func,
     ...
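
The bump can be verified from any environment with the new release installed; a minimal sketch, with the version string taken from the diff above:

  # Confirm the installed package reports the bumped version.
  pip install flash-attn==2.1.1
  python -c "import flash_attn; print(flash_attn.__version__)"  # expected: 2.1.1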
training/Dockerfile

@@ -85,11 +85,11 @@ RUN pip install transformers==4.25.1 datasets==2.8.0 pytorch-lightning==1.8.6 tr
 RUN pip install git+https://github.com/mlcommons/logging.git@2.1.0

 # Install FlashAttention
-RUN pip install flash-attn==2.0.9
+RUN pip install flash-attn==2.1.1

 # Install CUDA extensions for cross-entropy, fused dense, layer norm
 RUN git clone https://github.com/HazyResearch/flash-attention \
-    && cd flash-attention && git checkout v2.1.0 \
+    && cd flash-attention && git checkout v2.1.1 \
     && cd csrc/fused_softmax && pip install . && cd ../../ \
     && cd csrc/rotary && pip install . && cd ../../ \
     && cd csrc/xentropy && pip install . && cd ../../ \
 ...
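
Rebuilding the training image should then pick up the new version end to end; a minimal sketch, where the image tag flash-attn-train and the build context are assumptions rather than part of this commit:

  # Rebuild the training image and check the installed version.
  # The tag "flash-attn-train" is hypothetical; the build context may differ.
  docker build -t flash-attn-train training/
  docker run --rm flash-attn-train python -c "import flash_attn; print(flash_attn.__version__)"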