gaoqiong / flash-attention
File: flash_attn/flash_attn_triton.py (37.3 KB)
Commit 74797571 (7479757191c04cc1d5a029b0b34c5064278c93ef): "Fix pipelining bug in Triton bwd with bias_type=matrix"
Authored by Tri Dao, Nov 06, 2022