gaoqiong / flash-attention · Commits
"llama/ggml-aarch64.h" did not exist on "768ab4df541275c05eec5ee5db2f89661302610d"
History for csrc/flash_attn/src/fmha_bwd_hdim32.cu at 8a2ece89f7bd5d3124a6cae5fd95db5e85f07ee6
06 Dec, 2022 · 1 commit

Simplify BOOL_SWITCH macro to fix compiling error on gcc 7 · 8a2ece89
Tri Dao authored Dec 06, 2022
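For context, BOOL_SWITCH is the dispatch macro that turns a runtime boolean into a compile-time template argument. Below is a minimal sketch of the lambda-based pattern such a macro uses; the exact definition in this repository may differ, the launcher name is hypothetical, and the pre-fix variant that gcc 7 rejected is not reproduced here.

    // Sketch only: a lambda-based boolean dispatch macro in the style of
    // BOOL_SWITCH. COND is a runtime bool; inside the body, CONST_NAME is a
    // compile-time constant usable as a template argument.
    #include <cstdio>

    #define BOOL_SWITCH(COND, CONST_NAME, ...)      \
        [&] {                                       \
            if (COND) {                             \
                constexpr bool CONST_NAME = true;   \
                return __VA_ARGS__();               \
            } else {                                \
                constexpr bool CONST_NAME = false;  \
                return __VA_ARGS__();               \
            }                                       \
        }()

    // Hypothetical kernel launcher templated on the boolean.
    template <bool IsCausal>
    void run_kernel() {
        std::printf("IsCausal = %d\n", static_cast<int>(IsCausal));
    }

    void dispatch(bool is_causal) {
        // The trailing lambda becomes __VA_ARGS__ and is invoked exactly once,
        // with CONST_NAME fixed to true or false in that branch.
        BOOL_SWITCH(is_causal, IsCausalConst, [&] { run_kernel<IsCausalConst>(); });
    }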
26 Nov, 2022 · 2 commits

Fix typo in comments · 9bc63d1e
Tri Dao authored Nov 25, 2022
Speed up compilation by splitting into separate .cu files · d95ee1a9
Tri Dao authored Nov 25, 2022
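Commit d95ee1a9 is the one that introduced per-head-dimension translation units such as this fmha_bwd_hdim32.cu. A minimal sketch of that layout follows, assuming a shared launch-template header; the names (fmha_bwd_launch_template.h, run_fmha_bwd, run_fmha_bwd_hdim32) are inferred from the filename and are illustrative, not the repository's exact API.

    // --- fmha_bwd_launch_template.h (hypothetical shared header) ---
    // The launcher is templated on the head dimension; nothing is
    // instantiated here, so including this header stays cheap.
    #include <cstdio>

    template <int kHeadDim>
    void run_fmha_bwd() {
        // A real launcher would configure and launch the CUDA backward
        // kernel for this head dimension; this stub just records which
        // instantiation ran.
        std::printf("fmha backward, head dim = %d\n", kHeadDim);
    }

    // --- fmha_bwd_hdim32.cu (this file) ---
    // Only the kHeadDim == 32 instantiation lives in this translation
    // unit; hdim64, hdim128, etc. sit in sibling .cu files, so nvcc can
    // compile the instantiations in parallel and each unit stays small.
    void run_fmha_bwd_hdim32() { run_fmha_bwd<32>(); }

The payoff of this layout is build-time parallelism: heavy template instantiations that previously serialized in one translation unit now compile concurrently, one file per head dimension.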