gaoqiong / flash-attention
flash_attn/flash_attn_triton.py · 35.9 KB
Commit 86862cfd7bab1fe0279d1cdcdf370268792339cb
Implement attention bias for Triton version
Tri Dao authored Nov 04, 2022
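The commit adds support for an additive attention bias in the Triton FlashAttention kernel. As a minimal usage sketch: this assumes flash_attn_triton.py exports flash_attn_func as an autograd Function taking positional arguments (q, k, v, bias, causal, softmax_scale), with inputs shaped (batch, seqlen, nheads, headdim) and a bias broadcastable over batch and heads. The exact interface at commit 86862cfd is an assumption; consult the file itself before relying on it.

    # Hedged sketch of calling the Triton kernel with an attention bias.
    # The signature of flash_attn_func here is an assumption, not confirmed
    # against commit 86862cfd.
    import torch
    from flash_attn.flash_attn_triton import flash_attn_func

    batch, seqlen, nheads, headdim = 2, 1024, 8, 64

    # Inputs are (batch, seqlen, nheads, headdim), fp16 on GPU for the Triton kernel.
    q = torch.randn(batch, seqlen, nheads, headdim, device="cuda", dtype=torch.float16)
    k = torch.randn_like(q)
    v = torch.randn_like(q)

    # Additive bias broadcastable over batch and heads, e.g. an ALiBi or
    # relative-position bias of shape (1, nheads, seqlen_q, seqlen_k).
    bias = torch.randn(1, nheads, seqlen, seqlen, device="cuda", dtype=torch.float16)

    # Positional arguments: bias, causal=False, softmax_scale=None.
    out = flash_attn_func(q, k, v, bias, False, None)
    # out has the same shape as q: (batch, seqlen, nheads, headdim)

Arguments are passed positionally because the function is assumed to be exposed via torch.autograd.Function.apply, which does not accept keyword arguments in older PyTorch releases.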