gaoqiong / flash-attention
Commits · d95ee1a95da2b138073b5eef0f09ce4de615ba5e
flash-attention/csrc/flash_attn/src/fmha_kernel.h
26 Nov, 2022 (1 commit)
Speed up compilation by splitting into separate .cu files · d95ee1a9
Tri Dao authored Nov 25, 2022
04 Jul, 2022 (1 commit)
Implement cross attention · 6c3a8c65
Tri Dao authored Jun 30, 2022
26 May, 2022 (1 commit)
Rename, add benchmarking script · 9dbc491a
Tri Dao authored May 26, 2022
20 May, 2022 (1 commit)
First release · 1fcbe6f0
Tri Dao authored May 20, 2022