gaoqiong / flash-attention · Commits at 52fb4b729be7fc35e49af12910e38d141c66834d
tests/test_flash_attn.py
16 Oct, 2022 (1 commit)
Fix #54: set device for multi-GPU case · 52fb4b72
Tri Dao authored Oct 16, 2022
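This fix addresses tests that misbehave when tensors live on a GPU other than the current default device. A minimal sketch of the pattern, assuming a PyTorch test setup (the test name, shapes, and `device_id` parameter are illustrative, not taken from the commit):

```python
import torch

def test_set_device_multi_gpu(device_id: int = 1):
    # On a multi-GPU machine, CUDA kernels launch on the *current* device,
    # which defaults to cuda:0. If the test tensors live on another GPU,
    # an extension kernel can be launched on the wrong device. Selecting
    # the device explicitly before allocating (assumption: this is the
    # kind of fix the commit message describes) keeps tensor placement
    # and kernel launches consistent.
    torch.cuda.set_device(device_id)
    q = torch.randn(2, 128, 8, 64, device=f"cuda:{device_id}",
                    dtype=torch.float16)
    assert q.device.index == torch.cuda.current_device()
```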
14 Oct, 2022 (1 commit)
Implement attention kernel that splits the batch into two · 5badfb78
Tri Dao authored Oct 13, 2022
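The split itself happens inside the CUDA kernel, but the batch-level idea can be illustrated in plain PyTorch. A conceptual sketch, assuming a generic `attention` helper (hypothetical stand-in, not the repo's API); it only shows that splitting along the batch dimension preserves the output:

```python
import torch

def attention(q, k, v):
    # Hypothetical reference attention over (batch, seqlen, heads, dim).
    scores = torch.einsum("bthd,bshd->bhts", q, k) / q.shape[-1] ** 0.5
    return torch.einsum("bhts,bshd->bthd", scores.softmax(dim=-1), v)

def attention_split_batch(q, k, v):
    # Conceptual illustration only: attention is independent per batch
    # element, so the two halves can be processed separately and
    # concatenated. The actual kernel does this internally for
    # scheduling/parallelism reasons.
    mid = q.shape[0] // 2
    out_lo = attention(q[:mid], k[:mid], v[:mid])
    out_hi = attention(q[mid:], k[mid:], v[mid:])
    return torch.cat([out_lo, out_hi], dim=0)

q = k = v = torch.randn(4, 64, 2, 32)
assert torch.allclose(attention(q, k, v),
                      attention_split_batch(q, k, v), atol=1e-6)
```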
05 Oct, 2022 (1 commit)
Only run backward test for d=128 on A100 · 0c01568d
Tri Dao authored Oct 04, 2022
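This restriction suggests a hardware-gated skip in the test suite. A sketch of how such a guard is commonly written in pytest, assuming compute capability (8, 0) identifies an A100 (the helper name and test body are illustrative):

```python
import pytest
import torch

def is_sm80():
    # A100 GPUs report compute capability (8, 0). Assumption: the commit
    # gates the d=128 backward test on a check like this, since the
    # backward pass for head dimension 128 was only supported on
    # A100-class hardware at the time.
    return (torch.cuda.is_available()
            and torch.cuda.get_device_capability() == (8, 0))

@pytest.mark.parametrize("d", [16, 32, 64, 128])
def test_backward_runs(d):
    if d == 128 and not is_sm80():
        pytest.skip("backward for head dim 128 only runs on A100 (sm80)")
    q = torch.randn(1, 16, 1, d, device="cuda", dtype=torch.float16,
                    requires_grad=True)
    q.pow(2).sum().backward()  # stand-in for the real flash-attn backward
    assert q.grad is not None
```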
22 Jul, 2022 (1 commit)
Add tests for numerical error · 2ed471ec
Tri Dao authored Jul 22, 2022
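A sketch of the numerical-error testing pattern this commit message points at: compare the kernel's fp16 output against an fp32 reference, and bound its error by a multiple of the error that naive fp16 attention already incurs. The function names and the factor of 2 are assumptions for illustration, not taken from the commit:

```python
import torch

def attention_ref(q, k, v):
    # Reference attention computed in float32 for a trustworthy baseline.
    q, k, v = (t.float() for t in (q, k, v))
    scores = torch.einsum("bthd,bshd->bhts", q, k) / q.shape[-1] ** 0.5
    return torch.einsum("bhts,bshd->bthd", scores.softmax(dim=-1), v)

def attention_fp16(q, k, v):
    # Same math carried out in fp16, to measure the error inherent to
    # the reduced precision rather than to any particular kernel.
    scores = torch.einsum("bthd,bshd->bhts", q, k) / q.shape[-1] ** 0.5
    return torch.einsum("bhts,bshd->bthd", scores.softmax(dim=-1), v)

def check_numerical_error(flash_out, q, k, v):
    out_ref = attention_ref(q, k, v)
    out_pt = attention_fp16(q, k, v).float()
    # Use a relative bound: the kernel may be at most some multiple of
    # the naive-fp16 error (the factor 2 here is an assumed tolerance),
    # since an absolute tolerance would be either too loose or too tight
    # across shapes and seeds.
    ref_err = (out_pt - out_ref).abs().max()
    flash_err = (flash_out.float() - out_ref).abs().max()
    assert flash_err <= 2 * ref_err + 1e-5
```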