OpenDAS / AutoAWQ
Repository at commit 92a403b29ffb23848564d45354912f91d20c7f38
autoawq/awq/modules/fused/attn.py
[`core` / `attention`] Fix fused attention generation with newest transformers version (#146) · 92a403b2
Younes Belkada authored Nov 04, 2023
Co-authored-by: Casper <casperbh.96@gmail.com>
attn.py · 9.86 KB