gaoqiong / flash-attention · Commits
Commit: cb516f855b7e3545bf7426f3b1bf02914afd0f32
Path: flash-attention/hopper
22 Jul, 2024 (2 commits)
- Remove torchlib dependency from cpp files (#1083) · cb516f85 — Cameron Shinn, Jul 22, 2024
- remove lambda (#1056) · ef3e358a — youkaichao, Jul 21, 2024
15 Jul, 2024 (1 commit)
- [FA3] BF16 forward · 74b0761f — Tri Dao, Jul 14, 2024
11 Jul, 2024 (1 commit)
- FA3 initial code release · 7f67966c — Tri Dao, Jul 11, 2024