Commit f0c40b7d authored by Tri Dao

Recommend Nvidia's Pytorch container

parent 3cad2ab3
@@ -48,6 +48,10 @@ Requirements:
 - CUDA 11.4 and above.
 - PyTorch 1.12 and above.
+We recommend the
+[Pytorch](https://catalog.ngc.nvidia.com/orgs/nvidia/containers/pytorch)
+container from Nvidia, which has all the required tools to install FlashAttention.
 To install:
 ```sh
 pip install flash-attn
...
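
As a minimal sketch of the recommended workflow, this is how the NGC container could be used to satisfy the CUDA and PyTorch requirements before installing. The image tag below is illustrative only (this commit does not pin one); pick a current tag from the NGC catalog.

```sh
# Pull and start Nvidia's PyTorch container with GPU access.
# The 22.12-py3 tag is an example; choose a current tag from the NGC catalog.
docker run --gpus all -it --rm nvcr.io/nvidia/pytorch:22.12-py3

# Inside the container, CUDA, PyTorch, and the build toolchain are already
# set up, so FlashAttention can be installed directly:
pip install flash-attn
```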