gaoqiong / flash-attention
Commits · b910bf14c1baa7e6a4886c1cd07d65e7a61390c0
History of flash_attn/flash_attn_triton.py
31 Oct, 2022 · 4 commits
Support arbitrary seqlens (both q & k) in Triton bwd · b910bf14
Tri Dao authored Oct 30, 2022
Support arbitrary seqlen_k in Triton bwd · dc554693
Tri Dao authored Oct 30, 2022
Fix Triton fwd to support seqlen not multiples of 128 · d11341fd
Tri Dao authored Oct 30, 2022
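
The seqlen fixes in d11341fd, dc554693, and b910bf14 all rest on the same Triton idiom: masked loads and stores that guard the ragged final block when the sequence length is not a multiple of the block size (here, 128). A minimal sketch of that idiom follows; it is not the repository's kernel, and the kernel and variable names are illustrative.

```python
import torch
import triton
import triton.language as tl


@triton.jit
def scale_kernel(x_ptr, out_ptr, seqlen, scale, BLOCK: tl.constexpr):
    # Each program instance handles one BLOCK-sized tile of the sequence.
    offs = tl.program_id(0) * BLOCK + tl.arange(0, BLOCK)
    mask = offs < seqlen  # guards the ragged final block
    x = tl.load(x_ptr + offs, mask=mask, other=0.0)
    tl.store(out_ptr + offs, x * scale, mask=mask)


x = torch.randn(130, device="cuda")  # 130 is deliberately not a multiple of 128
out = torch.empty_like(x)
grid = (triton.cdiv(x.numel(), 128),)  # rounds up, so the last tile is partial
scale_kernel[grid](x, out, x.numel(), 0.5, BLOCK=128)
```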
Implement FlashAttention in Triton · b0c0db81
Tri Dao authored Oct 30, 2022
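
For context, b0c0db81 adds the Triton implementation behind the flash_attn_func entry point in flash_attn_triton.py. A hedged usage sketch, assuming a flash_attn_func(q, k, v, bias, causal, softmax_scale) interface over (batch, seqlen, nheads, headdim) fp16 tensors; consult the file at this commit for the exact signature.

```python
import math
import torch
from flash_attn.flash_attn_triton import flash_attn_func

# seqlen deliberately not a multiple of 128, exercising the fixes above.
batch, seqlen, nheads, headdim = 2, 257, 4, 64
q, k, v = (torch.randn(batch, seqlen, nheads, headdim,
                       device="cuda", dtype=torch.float16, requires_grad=True)
           for _ in range(3))

# Positional args (bias=None, causal=True), since the entry point is
# assumed to be an autograd.Function.apply.
out = flash_attn_func(q, k, v, None, True)

# Plain-PyTorch reference of the math FlashAttention computes:
# softmax(Q K^T / sqrt(d)) V, with a causal mask.
scores = torch.einsum("bqhd,bkhd->bhqk", q.float(), k.float()) / math.sqrt(headdim)
causal_mask = torch.triu(
    torch.ones(seqlen, seqlen, dtype=torch.bool, device="cuda"), diagonal=1)
scores = scores.masked_fill(causal_mask, float("-inf"))
ref = torch.einsum("bhqk,bkhd->bqhd", scores.softmax(dim=-1), v.float())
print((out.float() - ref).abs().max())  # expect a small fp16-level error
```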