# Block-Sparse Flash-Attention

A TileLang implementation of block-sparse flash-attention kernels.

The kernels have been used in [Rectified Sparse Attention](https://arxiv.org/abs/2506.04108) and [SeerAttention-R](https://arxiv.org/abs/2506.08889).
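
Below is a minimal PyTorch sketch (not the TileLang kernels in this repo) of the semantics a block-sparse flash-attention kernel computes: standard scaled-dot-product attention where whole tiles of the score matrix are skipped according to a block-level mask. The function and argument names (`block_sparse_attention_ref`, `block_mask`, `block_m`, `block_n`) are illustrative assumptions, not this repo's API.

```python
import math
import torch


def block_sparse_attention_ref(q, k, v, block_mask, block_m=64, block_n=64):
    """Dense reference of block-sparse attention.

    q, k, v:    [batch, heads, seq_len, head_dim]
    block_mask: [batch, heads, seq_len // block_m, seq_len // block_n] bool;
                True keeps a (block_m x block_n) tile of scores, False skips it.
    """
    b, h, s, d = q.shape
    scores = torch.einsum("bhqd,bhkd->bhqk", q, k) / math.sqrt(d)

    # Expand the tile-level mask to element level and mask out skipped tiles.
    elem_mask = block_mask.repeat_interleave(block_m, dim=-2)[..., :s, :]
    elem_mask = elem_mask.repeat_interleave(block_n, dim=-1)[..., :s]
    scores = scores.masked_fill(~elem_mask, float("-inf"))

    probs = torch.softmax(scores, dim=-1)
    return torch.einsum("bhqk,bhkd->bhqd", probs, v)


if __name__ == "__main__":
    b, h, s, d, bm, bn = 1, 2, 256, 64, 64, 64
    q, k, v = (torch.randn(b, h, s, d) for _ in range(3))
    # Keep roughly half of the score tiles at random; force the diagonal tiles on
    # so every query row attends to at least one block and softmax stays finite.
    mask = torch.rand(b, h, s // bm, s // bn) > 0.5
    mask |= torch.eye(s // bm, dtype=torch.bool)
    out = block_sparse_attention_ref(q, k, v, mask, bm, bn)
    print(out.shape)  # torch.Size([1, 2, 256, 64])
```

A fused kernel realizes the same computation without materializing the full score matrix: it iterates only over the key/value tiles whose mask entry is True, which is where the speedup over dense flash-attention comes from.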