gaoqiong / flash-attention
Repository at commit 03bf1f8a760280f2d99f28e98cada170b3f822c4
flash-attention / vllm_flash_attn / flash_attn_interface.py
Don't use kwargs in autograd functions (#3) · 03bf1f8a
Antoni Baum authored May 27, 2024
flash_attn_interface.py · 44.8 KB
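The commit title refers to a constraint of PyTorch custom autograd functions: `torch.autograd.Function.apply` accepts positional arguments only, so passing keyword arguments to `apply` raises a `TypeError`. The sketch below is a minimal, hypothetical illustration of that calling convention (the `ScaledAdd` function is made up for this example and is not the code changed in this commit):

```python
import torch

class ScaledAdd(torch.autograd.Function):
    """Toy custom op: y = x * scale + bias."""

    @staticmethod
    def forward(ctx, x, scale, bias):
        ctx.scale = scale
        return x * scale + bias

    @staticmethod
    def backward(ctx, grad_out):
        # One gradient per forward input (x, scale, bias); scale and bias
        # are plain Python floats here, so they receive no gradient.
        return grad_out * ctx.scale, None, None

x = torch.randn(4, requires_grad=True)

# Function.apply only takes positional arguments, so callers must not
# write ScaledAdd.apply(x, scale=2.0, bias=1.0) -- that raises a TypeError.
y = ScaledAdd.apply(x, 2.0, 1.0)
y.sum().backward()
print(x.grad)  # tensor([2., 2., 2., 2.])
```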