gaoqiong / flash-attention · Commits
"tests/models/dpr/test_modeling_dpr.py" did not exist on "d2a93991158f15993eba9ab421d82766b892f948"
35d589fa81a68b7cb806982af4fafac0f19d644d
Switch branch/tag
Branch: flash-attention · Path: tests
16 Oct, 2022 · 1 commit
Fix #54: set device for multi-GPU case · 52fb4b72
Tri Dao authored Oct 16, 2022
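
The fix targets a standard multi-GPU pitfall: work launched while the current CUDA device is still the default cuda:0, even though the input tensors live on another GPU. A minimal sketch of the pattern, assuming a PyTorch-level guard; the helper name and the softmax stand-in are hypothetical, not the repo's actual change:

import torch


def run_on_device_of(q):
    """Hypothetical helper showing the multi-GPU fix pattern: make the
    current CUDA device match the input's device before launching work."""
    # Without this, work launched from the default device (cuda:0) can
    # operate on tensors that live on another GPU and fail or misbehave.
    with torch.cuda.device(q.device):
        return torch.softmax(q, dim=-1)  # stand-in for the real kernel call


if __name__ == "__main__":
    if torch.cuda.device_count() >= 2:
        x = torch.randn(2, 128, 8, 64, device="cuda:1", dtype=torch.float16)
        out = run_on_device_of(x)
        assert out.device == x.device
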
14 Oct, 2022 · 1 commit
Implement attention kernel that splits the batch into two · 5badfb78
Tri Dao authored Oct 13, 2022
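
Whatever the kernel-level motivation for the split (the message does not spell it out), the observable contract is easy to state: attention is independent across batch elements, so processing two halves and concatenating must match processing the whole batch. A sketch of that property with a plain PyTorch reference standing in for the CUDA kernel; all function names here are hypothetical:

import torch


def attention_ref(q, k, v):
    """Plain PyTorch attention, standing in for the real CUDA kernel.
    Shapes: (batch, seqlen, heads, headdim)."""
    scores = torch.einsum("bshd,bthd->bhst", q, k) / (q.shape[-1] ** 0.5)
    return torch.einsum("bhst,bthd->bshd", scores.softmax(dim=-1), v)


def attention_split_batch(q, k, v):
    """Process the batch as two independent halves and concatenate
    (a sketch of the split, not the repo's CUDA-level implementation)."""
    half = q.shape[0] // 2
    out0 = attention_ref(q[:half], k[:half], v[:half])
    out1 = attention_ref(q[half:], k[half:], v[half:])
    return torch.cat([out0, out1], dim=0)


if __name__ == "__main__":
    torch.manual_seed(0)
    q, k, v = (torch.randn(4, 16, 2, 8) for _ in range(3))
    # Batch elements never interact, so the two paths agree exactly here;
    # a fused kernel could only differ by floating-point reduction order.
    assert torch.allclose(attention_split_batch(q, k, v),
                          attention_ref(q, k, v), atol=1e-6)
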
05 Oct, 2022 · 1 commit
Only run backward test for d=128 on A100 · 0c01568d
Tri Dao authored Oct 04, 2022
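
Restricting a test configuration to A100 typically comes down to a compute-capability guard: sm80 hardware has resources (notably shared memory) that smaller GPUs lack for a head-dimension-128 backward kernel. A hedged sketch of such a gate using pytest; the test body and parameter grid are placeholders, not the repo's actual test:

import pytest
import torch


def is_sm80_or_newer():
    """True on A100-class GPUs (compute capability 8.0) and later."""
    return torch.cuda.is_available() and torch.cuda.get_device_capability() >= (8, 0)


@pytest.mark.parametrize("d", [32, 64, 128])
def test_flash_attn_backward(d):
    # Per the commit, the backward pass for d=128 is only exercised on
    # A100-class hardware; skip it elsewhere rather than fail.
    if d == 128 and not is_sm80_or_newer():
        pytest.skip("backward for d=128 requires A100 (sm80)")
    ...  # placeholder: build q/k/v with head dim d, run backward, compare
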
22 Jul, 2022 · 1 commit
Add tests for numerical error · 2ed471ec
Tri Dao authored Jul 22, 2022
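
A common scheme for testing a fused low-precision kernel's numerical error, and plausibly what these tests do, is relative rather than absolute: require the kernel's deviation from an fp32 reference to stay within a small factor of the deviation of a plain low-precision implementation from that same reference. A sketch under that assumption; the factor 2.0 and all names are illustrative, not the repo's exact values:

import torch


def attention(q, k, v):
    """Plain attention in the inputs' dtype; a fused kernel would replace this."""
    s = torch.einsum("bshd,bthd->bhst", q, k) / (q.shape[-1] ** 0.5)
    return torch.einsum("bhst,bthd->bshd", s.softmax(dim=-1), v)


def check_numerical_error(q, k, v, out_kernel, growth=2.0):
    """Assert the kernel's max error vs an fp32 reference is at most
    `growth` times that of a straightforward low-precision implementation.
    The factor 2.0 is an assumed tolerance, not the repo's exact one."""
    out_ref = attention(q.float(), k.float(), v.float())   # fp32 reference
    out_base = attention(q, k, v).float()                  # baseline low-precision error
    err_kernel = (out_kernel.float() - out_ref).abs().max()
    err_base = (out_base - out_ref).abs().max()
    assert err_kernel <= growth * err_base, (err_kernel.item(), err_base.item())


if __name__ == "__main__":
    torch.manual_seed(0)
    device = "cuda" if torch.cuda.is_available() else "cpu"
    # bfloat16 keeps the demo runnable on CPU; the real tests use fp16 on GPU.
    dtype = torch.float16 if device == "cuda" else torch.bfloat16
    q, k, v = (torch.randn(2, 64, 4, 32, device=device, dtype=dtype) for _ in range(3))
    # With the same low-precision math standing in for the kernel, the
    # check passes trivially; a real kernel's output plugs in as `out_kernel`.
    check_numerical_error(q, k, v, attention(q, k, v))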