Unverified commit 1b4552e0 authored by echo840, committed by GitHub

Update README.md

parent e52f337b
@@ -69,7 +69,7 @@ pip install -r requirements.txt
 ```
 You can download the corresponding version of flash_attention from https://github.com/Dao-AILab/flash-attention/releases/ and use the following code to install:
 ```python
-pip install flash_attn-2.3.5+cu117torch2.0cxx11abiFALSE-cp39-cp39-linux_x86_64.whl --no-build-isolation
+pip install flash_attn-2.3.5+cu117torch2.0cxx11abiFALSE-cp310-cp310-linux_x86_64.whl --no-build-isolation
 ```
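The commit swaps the wheel's `cp39` tag for `cp310`, because a flash-attention release wheel only installs if its tags match your environment: the `cp` tag must match your Python version, and the `cu`/`torch` tags must match your CUDA and PyTorch builds. A minimal sketch of that naming scheme (the helper functions `cp_tag` and `wheel_name` are hypothetical, written here only to illustrate how the tags compose; they are not part of flash-attention):

```python
import sys

def cp_tag(v=sys.version_info):
    """Return the CPython tag (e.g. 'cp310' for Python 3.10) for a version tuple."""
    return f"cp{v[0]}{v[1]}"

def wheel_name(flash_ver, cu, torch_ver, py_tag):
    # Mirrors the naming scheme of flash-attention release wheels;
    # cu/torch values here must reflect your installed CUDA and PyTorch builds.
    return (f"flash_attn-{flash_ver}+cu{cu}torch{torch_ver}"
            f"cxx11abiFALSE-{py_tag}-{py_tag}-linux_x86_64.whl")

# Reconstructs the wheel referenced by this commit (Python 3.10 environment).
print(wheel_name("2.3.5", "117", "2.0", "cp310"))
# → flash_attn-2.3.5+cu117torch2.0cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
```

Running `cp_tag()` on the target machine before downloading avoids the mismatch this commit fixes: a `cp39` wheel will refuse to install under Python 3.10.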