gaoqiong / flash-attention
Commits at ece539abd6de8044d7f491654cc4d1a02edc071b
Path: flash_attn/losses
18 Nov, 2022 · 1 commit
ece539ab · Add __init__.py files to subdirectories for installation
Tri Dao authored Nov 17, 2022
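For context on why the __init__.py files matter for installation: setuptools' find_packages() only discovers directories that contain an __init__.py (unless namespace packages are used), so a subpackage such as flash_attn.losses would otherwise be left out of the installed distribution. A minimal sketch of the usual setuptools pattern follows; it is an illustration, not the repository's actual setup.py.

```python
# Minimal setup.py sketch (assumption: standard setuptools layout, not the
# exact setup.py shipped in this repository).
from setuptools import setup, find_packages

setup(
    name="flash_attn",
    # find_packages() only returns directories containing an __init__.py,
    # so without flash_attn/losses/__init__.py that subpackage would be
    # silently omitted from the built wheel.
    packages=find_packages(include=["flash_attn", "flash_attn.*"]),
)
```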
14 Nov, 2022 · 1 commit
343492ec · Make NCCL operations async in CrossEntropyLossParallel
Tri Dao authored Nov 13, 2022
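The commit message refers to making the NCCL collectives inside the tensor-parallel cross entropy asynchronous. As an illustration only (not the repository's actual CrossEntropyLossParallel code), torch.distributed collectives accept async_op=True and return a work handle that can be waited on later, so independent local computation can overlap with communication.

```python
# Hedged sketch of the async_op pattern; the function name and the choice of
# reduction are illustrative, not taken from flash_attn.losses.
import torch
import torch.distributed as dist

def parallel_logit_max(local_logits: torch.Tensor) -> torch.Tensor:
    # Each rank holds a shard of the vocabulary dimension.
    local_max = local_logits.max(dim=-1, keepdim=True).values
    # Launch the all-reduce without blocking; a Work handle is returned.
    work = dist.all_reduce(local_max, op=dist.ReduceOp.MAX, async_op=True)
    # ... independent local work can proceed here while NCCL communicates ...
    work.wait()  # synchronize only when the reduced value is actually needed
    return local_max
```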
13 Nov, 2022 · 1 commit
7c995381 · Add fused cross entropy loss
Tri Dao authored Nov 12, 2022
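A fused cross entropy loss computes the log-softmax and the negative log-likelihood in a single pass over the logits, avoiding a separately materialized softmax. The sketch below spells out the reference math in plain PyTorch; the repository's version is a fused kernel, and the function name here is hypothetical.

```python
# Reference (unfused) cross entropy from logits, shown only to make explicit
# the computation that a fused kernel performs in one pass.
import torch

def cross_entropy_from_logits(logits: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    # logits: (batch, vocab), labels: (batch,) of class indices.
    # Numerically stable log-sum-exp via max subtraction.
    max_logits = logits.max(dim=-1, keepdim=True).values
    logsumexp = (logits - max_logits).exp().sum(dim=-1).log() + max_logits.squeeze(-1)
    target_logits = logits.gather(-1, labels.unsqueeze(-1)).squeeze(-1)
    # Per-sample loss: logsumexp(logits_i) - logits_i[label_i]
    return (logsumexp - target_logits).mean()
```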