gaoqiong / flash-attention · Commits · 343492ec305d474bcf6e45bc05893bbc040fcc30
Path: flash-attention / tests / losses
14 Nov, 2022 · 1 commit
Make nccl operations async in CrossEntropyLossParallel · 343492ec
Tri Dao authored Nov 13, 2022
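This commit title refers to overlapping NCCL collectives with local computation inside a vocab-parallel cross-entropy loss. Below is a minimal sketch of that technique using torch.distributed with async_op=True; the function name vocab_parallel_cross_entropy and its signature are illustrative assumptions, not the repository's actual CrossEntropyLossParallel implementation.

```python
# Sketch only: overlap NCCL all-reduces with local work in a vocab-parallel
# cross-entropy forward pass. Names and signature are assumptions.
import torch
import torch.distributed as dist


def vocab_parallel_cross_entropy(local_logits, labels, vocab_start, process_group=None):
    """Cross entropy where the vocab dimension of `local_logits` is sharded
    across the ranks of `process_group`. `vocab_start` is this rank's first
    vocab index; `labels` hold global vocab indices."""
    # 1) Local max over this rank's vocab shard; launch the MAX all-reduce
    #    asynchronously so it can overlap with the gather below.
    logits_max = local_logits.max(dim=-1).values
    max_handle = dist.all_reduce(logits_max, op=dist.ReduceOp.MAX,
                                 group=process_group, async_op=True)

    # 2) Gather the logit of the target class from the local shard
    #    (zero if the target lives on another rank).
    vocab_size = local_logits.shape[-1]
    local_idx = labels - vocab_start
    in_shard = (local_idx >= 0) & (local_idx < vocab_size)
    safe_idx = local_idx.clamp(0, vocab_size - 1)
    target_logit = local_logits.gather(-1, safe_idx.unsqueeze(-1)).squeeze(-1)
    target_logit = torch.where(in_shard, target_logit, torch.zeros_like(target_logit))

    # Wait for the global max only when it is actually needed.
    max_handle.wait()

    # 3) Local sum of exp(logits - global_max); reduce asynchronously again,
    #    together with the SUM reduce of the target logit.
    sum_exp = (local_logits - logits_max.unsqueeze(-1)).exp().sum(dim=-1)
    sum_handle = dist.all_reduce(sum_exp, group=process_group, async_op=True)
    target_handle = dist.all_reduce(target_logit, group=process_group, async_op=True)

    sum_handle.wait()
    target_handle.wait()

    # loss_i = logsumexp(x_i) - x_i[label_i], with the max factored out.
    return sum_exp.log() + logits_max - target_logit
```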
13 Nov, 2022 · 1 commit
Add fused cross entropy loss · 7c995381
Tri Dao authored Nov 12, 2022
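"Fused" here means the per-token loss loss_i = logsumexp(x_i) - x_i[y_i] is computed directly from the logits rather than materializing a full log-softmax matrix and then indexing it. The sketch below only illustrates that math with plain PyTorch ops next to an unfused reference; the repository fuses this into a single GPU kernel, whose exact API is not reproduced here.

```python
# Sketch only: the math a fused cross-entropy kernel computes, compared
# against the unfused log-softmax + NLL reference.
import torch
import torch.nn.functional as F


def fused_style_cross_entropy(logits, labels):
    # One pass over the logits: logsumexp plus a gather of the target logit.
    lse = torch.logsumexp(logits.float(), dim=-1)
    target = logits.float().gather(-1, labels.unsqueeze(-1)).squeeze(-1)
    return (lse - target).mean()


def unfused_reference(logits, labels):
    # Reference path: explicit log-softmax (materializes the full
    # (tokens, vocab) matrix) followed by negative log-likelihood.
    return F.nll_loss(F.log_softmax(logits.float(), dim=-1), labels)


if __name__ == "__main__":
    torch.manual_seed(0)
    logits = torch.randn(8 * 128, 50257, dtype=torch.float16)
    labels = torch.randint(0, 50257, (8 * 128,))
    assert torch.allclose(fused_style_cross_entropy(logits, labels),
                          unfused_reference(logits, labels),
                          rtol=1e-4, atol=1e-4)
```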