GitLab
zhaoyu6/sglang · commit 9d5fa68b903d295d2b39201d54905c6801f60f7f
python/sglang/srt/layers/attention/flashattention_backend.py
Use torch.compile to fuse flash attention decode metadata preparation (#6973)
Commit 9d5fa68b · Lianmin Zheng · authored Jun 08, 2025
flashattention_backend.py · 90.6 KB
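The commit above describes wrapping the decode-path metadata preparation in torch.compile so its many small tensor ops are fused, reducing per-step kernel-launch overhead. A minimal hedged sketch of that pattern follows; the function name, fields, and shapes are illustrative assumptions, not the actual code in flashattention_backend.py.

```python
import torch
import torch.nn.functional as F

def prepare_decode_metadata(seq_lens: torch.Tensor, page_size: int):
    """Illustrative decode metadata prep: a few small elementwise/scan ops.

    These are exactly the kind of ops torch.compile can fuse into fewer
    kernels. Names and fields are hypothetical, not sglang's real layout.
    """
    # Cumulative sequence lengths with a leading zero, the layout
    # flash-attention-style varlen kernels typically expect.
    cu_seqlens = F.pad(torch.cumsum(seq_lens, dim=0, dtype=torch.int32), (1, 0))
    # Number of KV-cache pages per sequence (ceiling division).
    num_pages = (seq_lens + page_size - 1) // page_size
    return cu_seqlens, int(seq_lens.max()), num_pages

# torch.compile fuses the small ops above, cutting launch overhead on
# the latency-sensitive decode path; dynamic=True avoids recompiling
# for every new batch size.
prepare_decode_metadata_fused = torch.compile(prepare_decode_metadata, dynamic=True)

seq_lens = torch.tensor([5, 9, 3], dtype=torch.int32)
cu, max_len, pages = prepare_decode_metadata(seq_lens, page_size=4)
print(cu.tolist(), max_len, pages.tolist())  # [0, 5, 14, 17] 9 [2, 3, 1]
```

In practice the eager function and its compiled wrapper return identical results; the compiled version is what a backend would call once per decode step.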