gaoqiong / flash-attention
Commits · cbb4cf5f4654c8be42ce086f8528ccbb5a786458
flash_attn/fused_softmax.py
24 Oct, 2022 · 1 commit

ed553e92 · Add Megatron attention implementation for benchmarking
Tri Dao authored Oct 23, 2022
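The page does not show the contents of fused_softmax.py itself. For orientation only, a minimal sketch of the standard (unfused) scaled-dot-product attention that a Megatron-style reference implementation would benchmark against might look like the following; the function name, argument shapes, and mask convention here are illustrative assumptions, not code from this commit:

```python
import math
import torch

def reference_attention(q, k, v, mask=None, dropout_p=0.0):
    """Unfused scaled-dot-product attention (illustrative sketch,
    not the repository's implementation).

    q, k, v: tensors of shape [batch, heads, seq_len, head_dim]
    mask:    optional additive mask broadcastable to
             [batch, heads, seq_len, seq_len] (e.g. -inf at masked positions)
    """
    scale = 1.0 / math.sqrt(q.size(-1))
    # Attention scores: [batch, heads, seq_len, seq_len]
    scores = torch.matmul(q, k.transpose(-2, -1)) * scale
    if mask is not None:
        scores = scores + mask
    # Softmax over the key dimension, then optional dropout
    probs = torch.softmax(scores, dim=-1)
    if dropout_p > 0.0:
        probs = torch.nn.functional.dropout(probs, p=dropout_p)
    # Weighted sum of values: [batch, heads, seq_len, head_dim]
    return torch.matmul(probs, v)
```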