gaoqiong / flash-attention · Commits at 713ea302d7b3a191d200f9cb7fd174a3872a9b92
Path: flash-attention / flash_attn
05 Aug, 2022 (1 commit)
713ea302 · Allow headdim 128 in FlashMHA interface (Tri Dao, authored Aug 05, 2022)
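This commit raises the head dimension the FlashMHA module accepts to 128, where head_dim = embed_dim / num_heads. A minimal usage sketch, assuming FlashMHA is importable from flash_attn.flash_attention, takes an nn.MultiheadAttention-style constructor, and returns an (output, attn_weights) pair; it needs a CUDA GPU and fp16 tensors:

    # Sketch under the stated assumptions, not the project's documented API.
    import torch
    from flash_attn.flash_attention import FlashMHA  # assumed import path

    # embed_dim // num_heads = 1024 // 8 = 128, the newly allowed head dimension
    mha = FlashMHA(embed_dim=1024, num_heads=8, attention_dropout=0.0,
                   causal=False, device='cuda', dtype=torch.float16)
    x = torch.randn(2, 512, 1024, device='cuda', dtype=torch.float16)
    out, _ = mha(x)  # self-attention; out has shape (2, 512, 1024)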
04 Jul, 2022 (2 commits)
a5559a0e · Do P * dP (pointwise) in the bwd in fp32 instead of fp16 (Tri Dao, authored Jul 03, 2022)
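In the attention backward pass, the softmax gradient involves the pointwise product of the probability matrix P with its gradient dP. Because softmax leaves many entries of P very small, computing that product in fp16 risks underflow and large relative error, which this commit avoids by keeping it in fp32. A self-contained PyTorch illustration of the precision gap (illustrative only, not the CUDA kernel itself):

    import torch

    torch.manual_seed(0)
    # Sharp logits make most softmax probabilities tiny, the problematic regime.
    P = torch.softmax(torch.randn(64, 64) * 8, dim=-1)
    dP = torch.randn(64, 64) * 1e-3

    ref = P.double() * dP.double()  # high-precision reference
    err16 = ((P.half() * dP.half()).double() - ref).abs().max()
    err32 = ((P.float() * dP.float()).double() - ref).abs().max()
    print(f'fp16 max error: {err16:.3e}, fp32 max error: {err32:.3e}')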
6c3a8c65 · Implement cross attention (Tri Dao, authored Jun 30, 2022)
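Cross attention differs from self-attention only in where the keys and values come from: queries are projected from one sequence, keys and values from another, so the two sequence lengths may differ. A plain PyTorch sketch of that shape contract (illustrative; it does not use the repo's kernels):

    import torch

    B, H, Lq, Lk, D = 2, 8, 128, 512, 64  # query and key lengths differ
    q = torch.randn(B, H, Lq, D)          # from the query (decoder) sequence
    k = torch.randn(B, H, Lk, D)          # from the memory (encoder) sequence
    v = torch.randn(B, H, Lk, D)

    attn = torch.softmax(q @ k.transpose(-2, -1) / D**0.5, dim=-1)  # (B,H,Lq,Lk)
    out = attn @ v                                                  # (B,H,Lq,D)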
03 Jul, 2022 (1 commit)
af4a9ce0 · Add missing __init__.py (Gustaf, authored Jul 03, 2022)
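A missing __init__.py matters for packaging: setuptools' find_packages() only treats directories that contain one as packages, so a subpackage without it is silently left out of the installed distribution. A quick check, run from a repo root:

    # Lists only directories that contain an __init__.py; a subpackage missing
    # the file would be absent here and from any built wheel.
    from setuptools import find_packages
    print(find_packages())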
02 Jun, 2022 (1 commit)
5a61cb77 · Rename src -> flash_attn (Tri Dao, authored Jun 01, 2022)