Commit 7f31e7c1 authored by Tri Dao

Bump to v2.3.2

parent 5a834254
@@ -31,7 +31,7 @@ Please cite and credit FlashAttention if you use it.
 Requirements:
 - CUDA 11.6 and above.
 - PyTorch 1.12 and above.
-- Linux. Windows is not supported for now. If you have ideas on how to modify the code to support Windows, please reach out via Github issue.
+- Linux. Might work for Windows starting v2.3.2 (we've seen a few positive [reports](https://github.com/Dao-AILab/flash-attention/issues/595)) but Windows compilation still requires more testing. If you have ideas on how to set up prebuilt CUDA wheels for Windows, please reach out via Github issue.
 We recommend the
 [Pytorch](https://catalog.ngc.nvidia.com/orgs/nvidia/containers/pytorch)
...
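The requirements above translate into a quick environment check. A minimal sketch (version thresholds taken from the list above; this snippet is not part of the diff):

```python
import torch

# Check the environment against the stated requirements.
print(torch.__version__)           # want PyTorch 1.12 or above
print(torch.version.cuda)          # want CUDA 11.6 or above
print(torch.cuda.is_available())   # FlashAttention requires a CUDA GPU
```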
__version__ = "2.3.1.post1" __version__ = "2.3.2"
from flash_attn.flash_attn_interface import ( from flash_attn.flash_attn_interface import (
flash_attn_func, flash_attn_func,
......
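For context on the hunk above, here is how the re-exported `flash_attn_func` is typically called. A minimal sketch, assuming the v2.x calling convention (q/k/v of shape `(batch, seqlen, nheads, headdim)` in fp16 or bf16 on a CUDA device); the sizes below are illustrative:

```python
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 16, 64
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
v = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")

# Causal self-attention without dropout; output has the same shape as q.
out = flash_attn_func(q, k, v, dropout_p=0.0, causal=True)
print(out.shape)  # torch.Size([2, 1024, 16, 64])
```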
@@ -85,11 +85,11 @@ RUN pip install transformers==4.25.1 datasets==2.8.0 pytorch-lightning==1.8.6 tr
 RUN pip install git+https://github.com/mlcommons/logging.git@2.1.0
 # Install FlashAttention
-RUN pip install flash-attn==2.3.1.post1
+RUN pip install flash-attn==2.3.2
 # Install CUDA extensions for fused dense, layer norm
 RUN git clone https://github.com/HazyResearch/flash-attention \
-    && cd flash-attention && git checkout v2.3.1.post1 \
+    && cd flash-attention && git checkout v2.3.2 \
     && cd csrc/layer_norm && pip install . && cd ../../ \
     && cd csrc/fused_dense_lib && pip install . && cd ../../ \
     && cd .. && rm -rf flash-attention