Commit a7874518 authored by Tri Dao

Add paper arXiv link

parent d9fff84b
@@ -3,7 +3,8 @@ This repository provides the official implementation of FlashAttention from the
 following paper.
 **FlashAttention: Fast and Memory-Efficient Exact Attention with IO-Awareness**
 Tri Dao, Daniel Y. Fu, Stefano Ermon, Atri Rudra, Christopher Ré
+Paper: https://arxiv.org/abs/2205.14135
 ![FlashAttention](assets/flashattn_banner.jpg)
 ## Alpha release (0.1).