gaoqiong / flash-attention · Commits
Commit 3dda4f76deeee9e17ad03e8608d33e7a5fa714bf · path: flash_attn / losses
Note: "vllm_flash_attn/modules/mlp.py" did not exist on "fa6d1ce44fc2c8f9fe6330b5e98697fdc434e729"
13 Nov, 2022 (1 commit)
Add fused cross entropy loss · 7c995381
Tri Dao authored Nov 12, 2022
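
The commit above adds a fused cross-entropy loss under flash_attn / losses. Below is a minimal usage sketch, assuming the loss is exposed as a drop-in replacement for torch.nn.CrossEntropyLoss; the module path flash_attn.losses.cross_entropy and the class name CrossEntropyLoss are assumptions based on the directory shown in this listing, not confirmed by it.

# Minimal sketch: a fused cross-entropy loss used like torch.nn.CrossEntropyLoss.
# Import path and class name are assumptions (see note above).
import torch
from flash_attn.losses.cross_entropy import CrossEntropyLoss

batch, seq_len, vocab_size = 4, 128, 32000

# Logits are typically flattened to (batch * seq_len, vocab_size) for token-level loss.
logits = torch.randn(batch * seq_len, vocab_size,
                     device="cuda", dtype=torch.float16, requires_grad=True)
labels = torch.randint(0, vocab_size, (batch * seq_len,), device="cuda")

# A fused loss computes softmax and cross-entropy in one kernel instead of
# materializing a separate log-softmax tensor before the reduction.
loss_fn = CrossEntropyLoss(reduction="mean")
loss = loss_fn(logits, labels)
loss.backward()

Apart from the import, the call pattern mirrors the standard PyTorch loss, so swapping it in should not require other changes to a training loop.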